Marco Rogers /
npub1pk7…y877
2024-04-02 19:29:01


That's right. It's sort of the core issue in educating people about LLMs. They don't "sometimes hallucinate". They always hallucinate. By design.

But I also think we should graduate from saying they "just" make things up. They have a sophisticated inner model that makes them more likely to hallucinate toward things that seem correct when read by humans. And studying how they are able to do that is actually interesting.
https://mastodon.social/@GeePawHill/112203183163246574
Author Public Key
npub1pk75jpjz8jgu7tjn4et353u56qyk9ajd2y69j92pvaglhfcwlg5qkly877