Albert Cardona on Nostr:
Hard to argue the conclusion of this paper isn't right.
#ChatGPT does not hallucinate (a non-standard perceptual experience unrelated to the world), and does not confabulate (fill in a memory gap). Instead, it soft bullshits:
"Bullshit produced without the intention to mislead the hearer regarding the utterer’s agenda."
"ChatGPT is bullshit", Hicks et al. 2024 https://link.springer.com/article/10.1007/s10676-024-09775-5