apurvns on Nostr:
GPT-like AI models learn from astronomical amounts of language input -- much more than children receive when learning to understand and speak a language. The best AI systems train on text with word counts in the trillions, whereas children receive just millions of words per year.
Due to this enormous data gap, researchers have been skeptical that recent AI advances can tell us much about human learning and development.
A team of New York University researchers trained a multimodal AI system through the eyes and ears of a single child, using headcam video recordings from when the child was six months old through their second birthday. Their findings showed that the model, or neural network, could in fact learn a substantial number of words and concepts using limited slices of what the child experienced.
The video only captured about 1% of the child's waking hours, but that was sufficient for genuine language learning.
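The setup described above pairs what the child saw (video frames) with what the child heard (transcribed speech). As a rough illustration of how such a multimodal model can be trained, here is a minimal sketch of a generic CLIP-style contrastive objective, assuming precomputed frame features and a bag-of-words utterance encoder; the dimensions, encoders, and names are placeholders, not the NYU team's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyVisionLanguageModel(nn.Module):
    """Toy contrastive model: embeds image frames and utterances into a shared
    space so that matching frame/utterance pairs score higher than mismatched
    ones. Hypothetical sizes; not the authors' code."""

    def __init__(self, image_dim=512, vocab_size=1000, embed_dim=128):
        super().__init__()
        self.image_proj = nn.Linear(image_dim, embed_dim)        # stands in for a vision encoder
        self.text_embed = nn.EmbeddingBag(vocab_size, embed_dim)  # mean-pooled word embeddings

    def forward(self, frame_feats, token_ids, offsets):
        img = F.normalize(self.image_proj(frame_feats), dim=-1)
        txt = F.normalize(self.text_embed(token_ids, offsets), dim=-1)
        return img, txt

def contrastive_loss(img, txt, temperature=0.07):
    """InfoNCE over a batch: each frame's own utterance is the positive,
    every other utterance in the batch serves as a negative (and vice versa)."""
    logits = img @ txt.t() / temperature
    targets = torch.arange(img.size(0))
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Usage with random stand-in data: 8 frames, each paired with a 3-word utterance.
model = ToyVisionLanguageModel()
frames = torch.randn(8, 512)                # precomputed frame features
tokens = torch.randint(0, 1000, (24,))      # concatenated word ids for 8 utterances
offsets = torch.arange(0, 24, 3)            # start index of each utterance
img_emb, txt_emb = model(frames, tokens, offsets)
loss = contrastive_loss(img_emb, txt_emb)
loss.backward()
```

The key idea the sketch captures is that no word labels are needed: the model learns word meanings only from the co-occurrence of utterances with whatever happened to be in view, which is the kind of weak supervision a child's headcam footage provides.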