Prof. Emily M. Bender (she/her) on Nostr: If someone uses an LLM as a replacement for search, and the output they get is ...
If someone uses an LLM as a replacement for search, and the output they get is correct, this is just by chance.
Furthermore, a system that is right 95% of the time is arguably more dangerous than one that is right 50% of the time. People will be more likely to trust the output, and likely less able to fact-check the 5%.
>>
Published at 2024-11-04 03:17:29

Event JSON
{
  "id": "050004a6fb176c53e0064c080840ae4be597f15ac872965e911c50938020932d",
  "pubkey": "13ec9fd5058a18cd097d105fd6ef43759e37d5915b1c01ed36acf0ef5a3e6f2a",
  "created_at": 1730690249,
  "kind": 1,
  "tags": [
    [
      "e",
      "40ce36aaa153b4a604d6ff201e28b268302cff4db42517d00f8f7e870a75d039",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "proxy",
      "https://dair-community.social/users/emilymbender/statuses/113422516201485052",
      "activitypub"
    ]
  ],
  "content": "If someone uses an LLM as a replacement for search, and the output they get is correct, this is just by chance.\n\nFurthermore, a system that is right 95% of the time is arguably more dangerous tthan one that is right 50% of the time. People will be more likely to trust the output, and likely less able to fact check the 5%.\n\n\u003e\u003e",
  "sig": "7d3748cfc9b45b10a5559ade608707b90bd75428db6fcc81786985e564f57dc2aec0481fce09ecbe1c7b226052a38461aedbaa0ee13e71c84a62b877e11f5050"
}
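For context on the event structure above: per NIP-01, the "id" field is not arbitrary; it is the lowercase hex SHA-256 hash of the JSON serialization of the array [0, pubkey, created_at, kind, tags, content]. A minimal sketch in Python of that derivation follows (the function name nostr_event_id is a hypothetical helper, not part of any library; NIP-01 specifies exact string-escaping rules for the canonical serialization, so json.dumps is an approximation that happens to match for typical content):

import hashlib
import json

def nostr_event_id(event: dict) -> str:
    # NIP-01: the id is the SHA-256 over the compact JSON serialization of
    # the array [0, pubkey, created_at, kind, tags, content].
    payload = [
        0,
        event["pubkey"],
        event["created_at"],
        event["kind"],
        event["tags"],
        event["content"],
    ]
    # Compact separators drop whitespace; ensure_ascii=False keeps raw UTF-8,
    # both as NIP-01 requires.
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

Running this over the event above should reproduce its "id", provided the content string serializes byte-for-byte as the original client did; the "sig" is then a Schnorr signature over that id by the key in "pubkey".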