Marcos Huerta on Nostr: #[0] The lack of tying the information back to the source seems a bigger problem. ...
Published at 2023-04-06 15:37:52
Event JSON
{
"id": "5f9529ab86cf60e62888485890af887c898e2434bcc54678617ac2568b0f4bb0",
"pubkey": "6905afb3809d93686c2c86527ff0aca71dc98b51c52d177a3a2395eb6f49effe",
"created_at": 1680795472,
"kind": 1,
"tags": [
[
"p",
"4ebb1885240ebc43fff7e4ff71a4f4a1b75f4e296809b61932f10de3e34c026b",
"wss://relay.mostr.pub"
],
[
"p",
"8b0be93ed69c30e9a68159fd384fd8308ce4bbf16c39e840e0803dcb6c08720e",
"wss://relay.mostr.pub"
],
[
"e",
"bc7398da4256e353067dedd8598430dff3a6a790be37b0b805993e5bdc711077",
"wss://relay.mostr.pub",
"reply"
],
[
"mostr",
"https://vmst.io/users/marcoshuerta/statuses/110152612074438217"
]
],
"content": "#[0] The lack of tying the information back to the source seems a bigger problem. Google results for better or worse I can scrutinize the site the info comes from. LLMs invent references if asked and otherwise don’t surface where/how in its vast training data it has stiched together its answers.",
"sig": "55d5259ff786f306b1ca9934669a960a8c649458042b13e5f5a866e49f34921c04a7b0fa165daf97be3a328ac9abf77943e549164bc65fdb3689d573f04f28b2"
}
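The JSON above is a standard NIP-01 event: the `id` field is the SHA-256 hash of a canonical serialization of `[0, pubkey, created_at, kind, tags, content]`. A minimal sketch of that computation, assuming Python's compact `json.dumps` with `ensure_ascii=False` reproduces NIP-01's escaping byte-for-byte for this content (it does for ordinary text like this note):

```python
import hashlib
import json

# Fields copied from the event JSON above (signature omitted; it is not
# part of the id preimage).
event = {
    "pubkey": "6905afb3809d93686c2c86527ff0aca71dc98b51c52d177a3a2395eb6f49effe",
    "created_at": 1680795472,
    "kind": 1,
    "tags": [
        ["p", "4ebb1885240ebc43fff7e4ff71a4f4a1b75f4e296809b61932f10de3e34c026b", "wss://relay.mostr.pub"],
        ["p", "8b0be93ed69c30e9a68159fd384fd8308ce4bbf16c39e840e0803dcb6c08720e", "wss://relay.mostr.pub"],
        ["e", "bc7398da4256e353067dedd8598430dff3a6a790be37b0b805993e5bdc711077", "wss://relay.mostr.pub", "reply"],
        ["mostr", "https://vmst.io/users/marcoshuerta/statuses/110152612074438217"],
    ],
    "content": "#[0] The lack of tying the information back to the source seems a bigger problem. Google results for better or worse I can scrutinize the site the info comes from. LLMs invent references if asked and otherwise don’t surface where/how in its vast training data it has stiched together its answers.",
}

def nip01_event_id(ev):
    """Serialize per NIP-01 (no whitespace, UTF-8) and hash with SHA-256."""
    payload = [0, ev["pubkey"], ev["created_at"], ev["kind"], ev["tags"], ev["content"]]
    serialized = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# If the serialization matches the relay's exactly, this should equal the
# "id" field shown above.
print(nip01_event_id(event))
```

For orientation: by NIP-01/NIP-10 conventions, the two `p` tags mark mentioned pubkeys (the `#[0]` placeholder in the content points at the first of them), the `e` tag marks the note this post replies to, and the `mostr` tag records the original Mastodon status the Mostr bridge mirrored.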