Hidde on Nostr: “LLMs cannot learn all of the computable functions and will therefore always ...
“LLMs cannot learn all of the computable functions and will therefore always hallucinate. Since the formal world is a part of the real world which is much more complicated, hallucinations are also inevitable for real world LLMs.”
New paper that shows hallucination problem not fixable
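
The claim compresses the paper's core diagonalization argument. A minimal sketch, assuming the paper's formal-world setup (the enumerations $h_i$ and $s_i$ are notation introduced here for illustration, not quoted from the paper):

  Let $h_0, h_1, h_2, \dots$ enumerate all computable LLMs and
  $s_0, s_1, s_2, \dots$ enumerate all prompts. Construct a ground-truth
  function $f$ with
  \[
    f(s_i) \neq h_i(s_i) \quad \text{for every } i,
  \]
  e.g. by simulating $h_i$ on $s_i$ and outputting any other string.
  Then $f$ is computable, yet every LLM $h_i$ answers prompt $s_i$
  incorrectly, i.e. hallucinates with respect to $f$. So for any fixed
  computable LLM there is a computable task it must get wrong, which is
  the sense in which the problem is not fixable by better architectures
  or training.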
https://arxiv.org/abs/2401.11817

Published at 2024-03-01 10:08:07

Event JSON
{
  "id": "05c31530ee106c579a71045b190faf77612c4435e78d66d147bd0618fd963ed0",
  "pubkey": "8cc47ed4727396e063acfebc190c3d9c069edc7c8b18076a6d25ac816c04de8a",
  "created_at": 1709287687,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://front-end.social/users/hdv/statuses/112019877890370451",
      "activitypub"
    ]
  ],
  "content": "“LLMs cannot learn all of the computable functions and will therefore always hallucinate. Since the formal world is a part of the real world which is much more complicated, hallucinations are also inevitable for real world LLMs.”\n\nNew paper that shows hallucination problem not fixable https://arxiv.org/abs/2401.11817",
  "sig": "08abc9e5d7b1bf393ae686340b0cf04ebeab16a342d080319871c20fd554ffdea7409705dc0ec0082faaf59561cbc2f23a34a2bb5771fefbc8fce780f828d55f"
}
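
The "id" field above is not arbitrary: NIP-01 defines it as the SHA-256 of a canonical JSON serialization of [0, pubkey, created_at, kind, tags, content]. A minimal Python sketch of that check, using the event's own fields (the helper name nostr_event_id is mine, and Python's json.dumps is assumed to match NIP-01's escaping rules):

  import hashlib
  import json

  def nostr_event_id(pubkey, created_at, kind, tags, content):
      # NIP-01: id = sha256 of the UTF-8 serialization of this array,
      # with no whitespace between tokens and non-ASCII left unescaped.
      payload = [0, pubkey, created_at, kind, tags, content]
      serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
      return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

  event_id = nostr_event_id(
      pubkey="8cc47ed4727396e063acfebc190c3d9c069edc7c8b18076a6d25ac816c04de8a",
      created_at=1709287687,
      kind=1,
      tags=[["proxy",
             "https://front-end.social/users/hdv/statuses/112019877890370451",
             "activitypub"]],
      content=(
          "“LLMs cannot learn all of the computable functions and will "
          "therefore always hallucinate. Since the formal world is a part "
          "of the real world which is much more complicated, hallucinations "
          "are also inevitable for real world LLMs.”\n\n"
          "New paper that shows hallucination problem not fixable "
          "https://arxiv.org/abs/2401.11817"
      ),
  )
  # Should print the "id" field above if the serialization matches NIP-01.
  print(event_id)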