Aeon.Cypher / npub1m4g…xmwc
2024-03-24 17:40:04


#Generative #AI is not going to make #AGI, ever.

The performance of LLMs scales roughly with the logarithm of their parameter count, while their resource use grows roughly in proportion to their parameter count.

The human brain is roughly equivalent to a 60 quintillion parameter model running at 80 Hz, and it consumes about as much energy as a single lightbulb.

GPT-5 is likely a 1-trillion-parameter model, and it requires massive amounts of energy.
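For a rough sense of scale, here is a back-of-the-envelope sketch in Python using the figures above. The 60-quintillion-parameter brain estimate and the speculative 1-trillion-parameter GPT-5 figure are taken at face value for illustration, not as measured values:

```python
import math

# Figures are the post's own claims, taken at face value for illustration.
brain_params = 60e18      # ~60 quintillion "parameters" (rough brain estimate)
brain_power_w = 20        # roughly the draw of a single lightbulb
gpt5_params = 1e12        # ~1 trillion parameters (speculative)

# Linear resource scaling vs. logarithmic performance scaling:
param_gap = brain_params / gpt5_params
log_gain = math.log10(brain_params) / math.log10(gpt5_params)

print(f"parameter gap:    ~{param_gap:.0e}x")   # ~6e7x more parameters
print(f"log-scale 'gain': ~{log_gain:.2f}x")    # only ~1.65x on a log scale
```

If performance really tracks the logarithm of parameter count, closing a ~60-million-fold parameter gap buys comparatively little performance while the resource cost grows roughly linearly with it.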

LLMs are amazing, but they're nowhere near approaching AGI.

https://www.wired.com/story/how-quickly-do-large-language-models-learn-unexpected-skills/
Author Public Key
npub1m4geggwylwpkva339275uz76cae7vpqd0y5m4u95966wzfhhee4qnqxmwc