Bastian Allgeier on Nostr: Reading through the Github Copilot feedback made me realize that people report ...
Reading through the Github Copilot feedback made me realize that people report hallucination issues as if they were regular bugs in software.
People really think that it's all about just fixing those wrong answers somewhere in the backend of the LLM.
Published at 2024-10-02 08:39:03

Event JSON
{
  "id": "42ce269e61b47b01e9d6b13be5d6840797ad8cbc89c869bef34181a699abd404",
  "pubkey": "70bfbf9a40f4ff5895ac02089687c492ed3deb2fac732b4535f79e300cb89ad5",
  "created_at": 1727858343,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mastodon.social/users/bastianallgeier/statuses/113236924377132517",
      "activitypub"
    ]
  ],
  "content": "Reading through the Github Copilot feedback made me realize that people report hallucination issues as if they were regular bugs in software. \n\nPeople really think that it's all about just fixing those wrong answers somewhere in the backend of the LLM.",
  "sig": "25928927ec27609e896be78977d902ea81223e37d3b1b6bea3cc1f0d914622ca4c7ddfcf0b632ad81572a2d855b74593dc3cd22c76b9da577f32cad4df24a232"
}
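The "id" field above is not assigned by a server: per the Nostr NIP-01 specification, it is the SHA-256 hash of a canonical serialization of the event. A minimal sketch in Python, assuming the standard NIP-01 compact-JSON serialization (the array `[0, pubkey, created_at, kind, tags, content]` with no extra whitespace):

```python
import hashlib
import json

# Event fields copied from the JSON above.
event = {
    "pubkey": "70bfbf9a40f4ff5895ac02089687c492ed3deb2fac732b4535f79e300cb89ad5",
    "created_at": 1727858343,
    "kind": 1,
    "tags": [
        [
            "proxy",
            "https://mastodon.social/users/bastianallgeier/statuses/113236924377132517",
            "activitypub",
        ]
    ],
    "content": (
        "Reading through the Github Copilot feedback made me realize "
        "that people report hallucination issues as if they were "
        "regular bugs in software. \n\nPeople really think that it's "
        "all about just fixing those wrong answers somewhere in the "
        "backend of the LLM."
    ),
}

# NIP-01 canonical form: compact JSON array, UTF-8, no extra whitespace.
serialized = json.dumps(
    [0, event["pubkey"], event["created_at"], event["kind"],
     event["tags"], event["content"]],
    separators=(",", ":"),
    ensure_ascii=False,
)

# The event id is the hex-encoded SHA-256 of that serialization.
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)
```

Because the signature in "sig" covers this hash, the content of a published event cannot be edited in place without invalidating the id and signature.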