Seán Fobbe on Nostr
From decades of trying we know that encoding facts one-by-one is a terrible way to build "AI", but for some reason everyone is trying this Sisyphean method again to fix their model hallucinations.
Relying on symbolic AI solutions to fix data-driven AI that was supposed to supersede symbolic AI is probably the weirdest part of the current timeline.
Published at 2024-05-02 12:51:17

Event JSON
{
  "id": "14a598415626772b5d6fda0330d4121437d47469109d610876f6c309070b000f",
  "pubkey": "18d84d4992f40fb18a60e2a77e5fee80fc55a9605c4d23f79c2290d35c9fa62e",
  "created_at": 1714654277,
  "kind": 1,
  "tags": [
    [
      "proxy",
      "https://fediscience.org/users/seanfobbe/statuses/112371582741212475",
      "activitypub"
    ]
  ],
  "content": "From decades of trying we know that encoding facts one-by-one is a terrible way to build \"AI\", but for some reason everyone is trying this Sisyphean method again to fix their model hallucinations.\n\nRelying on symbolic AI solutions to fix data-driven AI that was supposed to supersede symbolic AI is probably the weirdest part of the current timeline.",
  "sig": "d18a957b58e3b162bcb6c010da28670add3e48ef397462bce7a415d3189af3a8ab7daf496e29c13992b51f4b3bd71149262dd87842cb87ea2ac80d6eb58c14fe"
}
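
For reference, the "id" field above is not arbitrary: per Nostr's NIP-01, it is the SHA-256 hash of a canonical JSON serialization of the event's other fields, and "sig" is a BIP-340 Schnorr signature over that id. Below is a minimal Python sketch, assuming the standard NIP-01 serialization rules, that recomputes the id from the fields shown; the printed hash should match the "id" field if the event is well-formed.

import hashlib
import json

# Fields copied from the event JSON above.
pubkey = "18d84d4992f40fb18a60e2a77e5fee80fc55a9605c4d23f79c2290d35c9fa62e"
created_at = 1714654277
kind = 1
tags = [
    [
        "proxy",
        "https://fediscience.org/users/seanfobbe/statuses/112371582741212475",
        "activitypub",
    ]
]
content = (
    'From decades of trying we know that encoding facts one-by-one is a '
    'terrible way to build "AI", but for some reason everyone is trying this '
    'Sisyphean method again to fix their model hallucinations.\n\n'
    'Relying on symbolic AI solutions to fix data-driven AI that was supposed '
    'to supersede symbolic AI is probably the weirdest part of the current timeline.'
)

# NIP-01: the event id is the SHA-256 of the JSON serialization of
# [0, pubkey, created_at, kind, tags, content] with no extra whitespace.
payload = json.dumps(
    [0, pubkey, created_at, kind, tags, content],
    separators=(",", ":"),
    ensure_ascii=False,
)
event_id = hashlib.sha256(payload.encode("utf-8")).hexdigest()
print(event_id)  # expected to match the "id" field above

Verifying "sig" additionally requires a secp256k1 library with BIP-340 Schnorr support; that step is omitted to keep the sketch dependency-free.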