What is Nostr?
Jeremiah Lee /
npub1h3y…aqg5
2024-09-16 15:53:33

Paraphrased conclusion: Calling LLM inaccuracies ‘hallucinations’ feeds into hype about their abilities among technology cheerleaders. It can also lead to the wrong attitude towards the machine when it gets things right: the inaccuracies show that it is bullshitting, even when it’s right. Calling these inaccuracies ‘bullshit’ isn’t just more accurate; it’s good science and technology communication.

https://link.springer.com/article/10.1007/s10676-024-09775-5

#AI #ML #longRead
Author Public Key
npub1h3yk5lue66wzaxlkdl9euwvt67cl0tnn8tgq8e4cfzx7htc29zdq94aqg5