John Carlos Baez /
npub17u6…pd6m
2024-10-08 22:52:57
in reply to nevent1q…rw85

npub10l88929ze7h5h7g7pu7y0e0077c03zqtp9zk63zpx4gg7m2ypcnq8eath2 - but I didn't say "neural networks research is physics". I said "it all started with physics", and gave evidence for how Hopfield and Hinton were using ideas from statistical mechanics. Yes, neural network research has grown beyond this now - it's now its own subject. I don't think anyone is claiming otherwise.

Wikipedia on Hopfield networks:

"The Sherrington–Kirkpatrick model of spin glass, published in 1975, is the Hopfield network with random initialization. Sherrington and Kirkpatrick found that it is highly likely for the energy function of the SK model to have many local minima. In the 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions. In a 1984 paper he extended this to continuous activation functions. It became a standard model for the study of neural networks through statistical mechanics."

Wikipedia on Boltzmann machines:

"They are named after the Boltzmann distribution in statistical mechanics, which is used in their sampling function. They were heavily popularized and promoted by Geoffrey Hinton, Terry Sejnowski and Yann LeCun in cognitive sciences communities, particularly in machine learning, as part of "energy-based models" (EBM), because Hamiltonians of spin glasses as energy are used as a starting point to define the learning task."
Author Public Key
npub17u6xav5rjq4d48fpcyy6j05rz2xelp7clnl8ptvpnval9tvmectqp8pd6m