What is Nostr?
Marco Rogers /
npub1wml…gs3l
2024-04-02 19:29:01

That's right. It's sort of the core issue in educating people about LLMs. They don't "sometimes hallucinate". They always hallucinate. By design.

But I also think we should graduate from saying they "just" make things up. They have a sophisticated inner model, which makes them more likely to hallucinate toward things that seem correct when read by humans. And studying how they are able to do that is actually interesting.
https://mastodon.social/@GeePawHill/112203183163246574
Author Public Key
npub1wmluy8g0a6kz7mh90m4r3mz8qv78ulce8etznxtj436fqatmv8yqlygs3l