Aeon.Cypher on Nostr:
#Generative #AI is not going to make #AGI, ever.
The performance of LLMs scales roughly as the log of their parameter count; their resource use scales linearly with it.
The human brain is roughly equivalent to a 60 quintillion parameter model running at 80 Hz, yet it consumes about as much power as a single lightbulb (~20 W).
GPT-5 is likely a roughly 1 trillion parameter model, and it requires massive amounts of energy to train and run.
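To make the gap concrete, here is a minimal back-of-the-envelope sketch in Python using the post's own rough figures; both the 60-quintillion-parameter brain estimate and the ~1-trillion-parameter GPT-5 guess are assumptions from the post, not measured values:

import math

brain_params = 60e18  # the post's "60 quintillion parameter" brain estimate (assumption)
gpt_params = 1e12     # the post's "~1 trillion parameter" GPT-5 guess (assumption)

scale_up = brain_params / gpt_params
print(f"Parameter gap: {scale_up:.0e}x")  # ~6e+07x more parameters

# If performance scales ~log(params), the huge scale-up buys little:
perf_gain = math.log10(brain_params) / math.log10(gpt_params)
print(f"Performance gain under log scaling: {perf_gain:.2f}x")  # ~1.65x

# If resource use scales ~params, cost grows by the full factor:
print(f"Resource cost growth under linear scaling: {scale_up:.0e}x")

Under these assumptions, scaling parameters by tens of millions of times yields less than a 2x gain in log-scaled performance while costs grow by the full linear factor, which is the asymmetry the post is pointing at.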
LLMs are amazing, but they're nowhere near approaching AGI.
https://www.wired.com/story/how-quickly-do-large-language-models-learn-unexpected-skills/