42N3 on Nostr:
As long as AIs hallucinate when they don't really know an answer, I would be very cautious about believing anything outright.
Published at 2025-02-20 11:25:22 UTC

Event JSON
{
  "id": "48f2db6992b02631e813e80b203d11dd46e8d6d22df59517bc6f5748d60a4110",
  "pubkey": "cc33d9331c022da690a714db2cfb57c36a7a09dbbc8d512d1375483e0347fdcc",
  "created_at": 1740050722,
  "kind": 1,
  "tags": [
    [
      "e",
      "c2b2024363185fff2c393bf2881994e19f9ac3e30b0705d327c4f7f0aa0c1f86",
      "wss://nostr.oxtr.dev/",
      "root",
      "1bc70a0148b3f316da33fe3c89f23e3e71ac4ff998027ec712b905cd24f6a411"
    ],
    [
      "e",
      "c2b2024363185fff2c393bf2881994e19f9ac3e30b0705d327c4f7f0aa0c1f86",
      "wss://nostr.oxtr.dev/",
      "reply",
      "1bc70a0148b3f316da33fe3c89f23e3e71ac4ff998027ec712b905cd24f6a411"
    ],
    [
      "p",
      "1bc70a0148b3f316da33fe3c89f23e3e71ac4ff998027ec712b905cd24f6a411"
    ]
  ],
  "content": "As long as AIs hallucinate when they don't really know an answer, I would be very cautious about believing anything outright.",
  "sig": "f68e8509cb83bc86fa34c5357150bddada4d95c3065b0decab1f531bac79ec19d7b9de3843f03f0e6260e9f11e951bd98840b6436b111c3faa8377a7e3d7e45b"
}
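
The event above is a kind-1 text note posted as a reply: the two "e" tags carry the NIP-10 "root" and "reply" markers pointing at the parent event, each with a relay hint and the parent author's pubkey, and the "p" tag lists the pubkey being replied to. Per NIP-01, the "id" field is deterministic: it is the SHA-256 hash of a canonical JSON serialization of the event's other fields, so any client can recheck it. Below is a minimal Python sketch of that check using only the standard library; nostr_event_id is an illustrative helper name, the event dict is assumed to hold the JSON above, and verifying "sig" (a BIP-340 Schnorr signature over the id) would additionally require a Schnorr-capable library and is omitted here.

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: the id is the SHA-256 of the compact (no extra whitespace)
    # UTF-8 JSON serialization of the array
    # [0, pubkey, created_at, kind, tags, content].
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),
        ensure_ascii=False,
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

For the event above, nostr_event_id(event) should reproduce the "id" field exactly; a mismatch would mean the event was altered after signing. Note that json.dumps with these options matches NIP-01's escaping rules for typical content like this note, though exotic control characters can serialize differently across implementations.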