Tim Kellogg on Nostr
i used an analogy yesterday, that #LLMs are basically system 1 (from Thinking Fast and Slow), and system 2 doesn’t exist but we can kinda fake it by forcing the LLM to have an internal dialog.
my understanding is that system 1 was more tuned to pattern matching and “gut reactions”, while system 2 is more analytical
i think it probably works pretty well, but curious what others think
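(A minimal sketch of what "faking system 2 with an internal dialog" could look like in practice, assuming a hypothetical call_llm(messages) helper that wraps whatever chat API you use; the two-pass structure is the illustration, not a specific library's method.)

def call_llm(messages):
    """Placeholder: send a chat-style message list to an LLM, return its reply as a string."""
    raise NotImplementedError("wire this up to your provider of choice")

def answer_with_internal_dialog(question: str) -> str:
    # Pass 1: the "system 2" scratchpad. The model reasons step by step,
    # and this text is never shown to the user.
    scratchpad = call_llm([
        {"role": "system", "content": "Reason step by step. Think out loud."},
        {"role": "user", "content": question},
    ])

    # Pass 2: the "system 1"-style final answer, conditioned on the scratchpad
    # so the quick reply inherits the slower, more analytical reasoning.
    return call_llm([
        {"role": "system", "content": "Give a short final answer only."},
        {"role": "user", "content": f"Question: {question}\n\nNotes:\n{scratchpad}"},
    ])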
Published at 2024-05-08 11:31:26

Event JSON
{
  "id": "e344457a902e511dd01f2276c218b526b3301308bfa27fee604dedae21dab047",
  "pubkey": "ad159d25c6d90f397ab2c21dca6492cb42079f31b8d80c9970d17c80802bd8a3",
  "created_at": 1715167886,
  "kind": 1,
  "tags": [
    [
      "t",
      "LLMs"
    ],
    [
      "proxy",
      "https://hachyderm.io/users/kellogh/statuses/112405242607158184",
      "activitypub"
    ]
  ],
  "content": "i used an analogy yesterday, that #LLMs are basically system 1 (from Thinking Fast and Slow), and system 2 doesn’t exist but we can kinda fake it by forcing the LLM to have an internal dialog.\n\nmy understanding is that system 1 was more tuned to pattern matching and “gut reactions”, while system 2 is more analytical\n\ni think it probably works pretty well, but curious what others think",
  "sig": "faecd366dd7b893bc12e60c1091dbbe8dbd1e0a5842923f8f1adbd110afe5f0a21748328f413674134bd37379fa6cf796c66f46e545a6b7dcb5c9012a4ac3dd4"
}