karen coyle on Nostr
re: AI bots "hallucinating". Calling it "hallucination" excuses the fact that they just plain LIE.
LLMs and ChatBots LIE.
They could be programmed not to.
Hallucinations are cute and fuzzy and non-threatening. They also are sensory, and AI has no senses.
Call it LYING, please!
Published at 2024-05-12 16:07:27

Event JSON
{
  "id": "f66f0d1f82e0efaaa382fb7fbb9b264f49fa1f812584f4a710b8d914df8eddc2",
  "pubkey": "75a346006c250a8fd5f9497646b40590359b4a9fb01b7d79eaf273b7ba512137",
  "created_at": 1715530047,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://mstdn.social/users/kcoyle/statuses/112428977191581279",
      "activitypub"
    ]
  ],
  "content": "re: AI bots \"hallucinating\". Calling it \"hallucination\" excuses the fact that they just plain LIE. \n\nLLMs and ChatBots LIE.\n\nThey could be programmed not to. \n\nHallucinations are cute and fuzzy and non-threatening. They also are sensory, and AI has no senses. \n\nCall it LYING, please!",
  "sig": "d5f8a8592551c2214a54024e324599062e1aee9c8ac7b121de05585264e9f8f286077cfea005586fd9eb600d9b3a7d638e872ffcd5c15595a9ce2f1c0ee116d2"
}
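For readers curious how the `id` field above relates to the rest of the event: under the Nostr protocol's NIP-01 specification, an event's id is the SHA-256 digest of a compact JSON serialization of `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of that derivation, using only the Python standard library and the field values copied from the JSON above (this is an illustration of the NIP-01 scheme, not code from this page):

```python
import hashlib
import json

# Event fields copied verbatim from the Event JSON above.
pubkey = "75a346006c250a8fd5f9497646b40590359b4a9fb01b7d79eaf273b7ba512137"
created_at = 1715530047
kind = 1
tags = [
    [
        "proxy",
        "https://mstdn.social/users/kcoyle/statuses/112428977191581279",
        "activitypub",
    ]
]
content = (
    "re: AI bots \"hallucinating\". Calling it \"hallucination\" excuses "
    "the fact that they just plain LIE. \n\nLLMs and ChatBots LIE.\n\n"
    "They could be programmed not to. \n\nHallucinations are cute and fuzzy "
    "and non-threatening. They also are sensory, and AI has no senses. \n\n"
    "Call it LYING, please!"
)

# NIP-01: serialize [0, pubkey, created_at, kind, tags, content] as UTF-8
# JSON with no extra whitespace, then hash with SHA-256 to get the event id.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # for a faithful copy of the event, this should be the "id" above
```

The `sig` field is then a Schnorr signature over this id made with the key behind `pubkey`; verifying it requires a secp256k1 library and is omitted here.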