Marco Rogers on Nostr:
That's right. It's sort of the core issue in educating people about LLMs. They don't "sometimes hallucinate". They always hallucinate. By design.
But I also think we should graduate from saying they "just" make things up. They have a sophisticated inner model, which means they're more likely to hallucinate toward things that read as correct to humans. And studying how they are able to do that is actually interesting.
https://mastodon.social/@GeePawHill/112203183163246574
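To make the "by design" point concrete, here is a minimal toy sketch of the sampling step an LLM performs at every token. All names and probabilities below are hypothetical, hard-coded stand-ins for what a real network computes; the point is structural: the loop samples from a learned distribution over plausible continuations, and nothing in it distinguishes true output from false output.

```python
import random

# Toy "learned" next-token distribution. In a real LLM these probabilities
# come from a neural network conditioned on the prompt; here they are
# hard-coded purely to illustrate the mechanism.
NEXT_TOKEN_PROBS = {
    "The capital of France is": {"Paris": 0.85, "Lyon": 0.10, "Mars": 0.05},
}

def sample_next_token(prompt: str) -> str:
    """Sample the next token from the model's distribution.

    Note what is absent: no lookup against a fact store, no truth check.
    A fluent-but-wrong continuation ("Mars") and a fluent-and-right one
    ("Paris") are produced by the exact same sampling step.
    """
    probs = NEXT_TOKEN_PROBS[prompt]
    tokens = list(probs)
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

if __name__ == "__main__":
    prompt = "The capital of France is"
    for _ in range(5):
        print(prompt, sample_next_token(prompt))
```

On this view, "hallucination" isn't a malfunction that sometimes kicks in; it is the one generation mechanism, which training has merely biased toward continuations humans tend to judge as correct.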