Matt Wilcox on Nostr:
The problem is that LLMs are untrustworthy data *generators*, not just data retrieval. In the past, *humans* wrote things, and most of the time those humans actually checked things before writing them. It was their job. The search engine just brought source material to you.
LLMs will generate 1000 made-up ones at the drop of a hat.
It's the *saturation* that's the danger, especially when they feed on their own output. It becomes ever less trustworthy.
Callionica (npub1emh…q9c5) Scott Jehl (npub133e…rqrd)