karen coyle /
npub1wk3…gvyz
2024-05-12 16:07:27

re: AI bots "hallucinating". Calling it "hallucination" excuses the fact that they just plain LIE.

LLMs and ChatBots LIE.

They could be programmed not to.

Hallucinations are cute and fuzzy and non-threatening. They also are sensory, and AI has no senses.

Call it LYING, please!
Author Public Key
npub1wk35vqrvy59gl40ef9myddq9jq6ekj5lkqdh67027fem0wj3yymsd7gvyz