Baldur Bjarnason on Nostr: RAG, which was supposed to fix hallucinations, essentially requires feeding query ...
RAG, which was supposed to fix hallucinations, essentially requires feeding query results directly into the LLM
Like some of us have been saying for a while now, these systems are simply not suitable for purpose and are inherently flawed. Worse yet, they are fundamentally insecure and prone to manipulation
If you think spam's a problem now, imagine a world where every spammer has sophisticated tools to control how THIRD PARTY chatbots talk about any given product.
Like some of us have been saying for a while now, these systems are simply not suitable for purpose and are inherently flawed. Worse yet, they are fundamentally insecure and prone to manipulation
If you think spam's a problem now, imagine a world where every spammer has sophisticated tools to control how THIRD PARTY chatbots talk about any given product.