Erik Jonker on Nostr:
"If it does turn out to be anything like human understanding, it will probably not be based on LLMs.
After all, LLMs learn in the opposite direction from humans. LLMs start out learning language and attempt to abstract concepts. Human babies learn concepts first, and only later acquire the language to describe them."
https://www.sciencenews.org/article/ai-large-language-model-understanding
#AI #LLM #AGI