chris martens on Nostr:
so, ok, more people are realizing LLMs give wrong answers and have no viable path to improvement,
but in most cases i’m not convinced correctness was ever the point. i’m not hopeful that these realizations are going to change the tide of adoption, and rather predict we’ll see an uptick in people trying to convince us facts aren’t real
Published at
2024-05-26 14:48:30
Event JSON
{
  "id": "de7cf25cae7f36828ef3b14e0be00827519453594c69e9bbe2b6e2c1bd76d111",
  "pubkey": "20620c8fffc921adb198ad4e702673713d1543b1c59fbc83c4e652226a62dfa4",
  "created_at": 1716734910,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://hci.social/@chrisamaphone/112507939090328589",
      "web"
    ],
    [
      "proxy",
      "https://hci.social/users/chrisamaphone/statuses/112507939090328589",
      "activitypub"
    ],
    [
      "L",
      "pink.momostr"
    ],
    [
      "l",
      "pink.momostr.activitypub:https://hci.social/users/chrisamaphone/statuses/112507939090328589",
      "pink.momostr"
    ]
  ],
  "content": "so, ok, more people are realizing LLMs give wrong answers and have no viable path to improvement,\n\nbut in most cases i’m not convinced correctness was ever the point. i’m not hopeful that these realizations are going to change the tide of adoption, and rather predict we’ll see an uptick in people trying to convince us facts aren’t real",
  "sig": "1cddf8692f42366fabbb5fe91765e4737293fff349a04523ccaabd353d9f960be18985d17fb406ed069f7fb1a29667dbfc5c5ea8149dcde3280230f9d5941eba"
}
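
For context: per NIP-01, a Nostr event's "id" is the SHA-256 of the serialized array [0, pubkey, created_at, kind, tags, content], and "sig" is a Schnorr signature over that id by the listed pubkey. Below is a minimal Python sketch that recomputes the id from the fields above, assuming the standard NIP-01 serialization rules (compact separators, UTF-8, non-ASCII characters kept as-is).

# Minimal sketch, assuming NIP-01 canonical serialization:
# id = sha256(JSON of [0, pubkey, created_at, kind, tags, content] with no extra whitespace)
import hashlib
import json

pubkey = "20620c8fffc921adb198ad4e702673713d1543b1c59fbc83c4e652226a62dfa4"
created_at = 1716734910
kind = 1
tags = [
    ["proxy", "https://hci.social/@chrisamaphone/112507939090328589", "web"],
    ["proxy", "https://hci.social/users/chrisamaphone/statuses/112507939090328589", "activitypub"],
    ["L", "pink.momostr"],
    ["l", "pink.momostr.activitypub:https://hci.social/users/chrisamaphone/statuses/112507939090328589", "pink.momostr"],
]
content = (
    "so, ok, more people are realizing LLMs give wrong answers and have no viable path to improvement,"
    "\n\n"
    "but in most cases i’m not convinced correctness was ever the point. i’m not hopeful that these "
    "realizations are going to change the tide of adoption, and rather predict we’ll see an uptick in "
    "people trying to convince us facts aren’t real"
)

# Compact separators and ensure_ascii=False approximate the NIP-01 serialization rules.
serialized = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(serialized.encode("utf-8")).hexdigest()
print(event_id)  # should match the "id" field above if the serialization is exact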