Terence Tao (npub1hsf…7r3k)
2024-11-14 19:27:07
in reply to nevent1q…6l54

In AI, one instance of this principle is Rich Sutton's "bitter lesson" http://www.incompleteideas.net/IncIdeas/BitterLesson.html . It is intuitively obvious, and initially somewhat successful, to automate a task by carefully designing mathematical algorithms tailored to the specific domain knowledge of the task at hand. However, for a surprisingly large number of tasks, we have learned that even more progress can often be achieved by applying relatively low-tech but general-purpose mathematical methods, such as gradient descent and backpropagation, combined with large-scale investments in data and compute rather than in tailoring.

One example I have seen recently is in the development of cheap analog-to-digital converters (ADCs) for sensor networks. Traditionally, ADC circuits are designed from classical electrical engineering principles, using mathematical results about ODEs, resonances, Fourier transforms, and so forth to obtain reasonably efficient circuits with theoretical guarantees on performance. But in environments (such as sensor networks) where some failure rate is tolerated, and the goal is fast and cheap analog-to-digital conversion at scale, it turns out to be better to simply train a neural network to design an ADC circuit without inputting any domain knowledge (such as Fourier analysis), and to use the resulting circuits in the sensors.

This is not to say that domain knowledge is completely useless - for instance, physics-informed neural networks can greatly outperform standard neural networks in many physical contexts - but one has to know how much of it is appropriate to apply. (2/3)
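
To make "relatively low-tech but general-purpose" concrete, here is a minimal sketch (illustrative only, not the actual ADC work referenced above, and with every size and name chosen for the toy): a tiny network in plain numpy, trained by nothing more than gradient descent and backpropagation, learns to imitate a 3-bit quantizer's staircase with no circuit theory or Fourier analysis supplied.

    # Illustrative sketch: fit a 3-bit quantizer staircase with a tiny
    # neural network using only gradient descent + backpropagation.
    # No domain knowledge (sampling theory, Fourier analysis) is used;
    # all architecture and hyperparameter choices are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)

    # "Analog" inputs in [0, 1) and their ideal 3-bit digital codes,
    # rescaled back to [0, 1) so the target is a staircase function.
    x = rng.uniform(0.0, 1.0, size=(4096, 1))
    y = np.floor(x * 8) / 8

    # One hidden layer of tanh units; all structure must be learned.
    W1 = rng.normal(0.0, 1.0, size=(1, 64)); b1 = np.zeros(64)
    W2 = rng.normal(0.0, 0.1, size=(64, 1)); b2 = np.zeros(1)
    lr = 0.5

    for step in range(20000):
        # forward pass
        h = np.tanh(x @ W1 + b1)        # hidden activations
        pred = h @ W2 + b2              # network's "digital" output
        err = pred - y                  # d(0.5*MSE)/d(pred), per sample

        # backpropagation: chain rule through the two layers
        g2 = h.T @ err / len(x)
        gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1 - h**2)  # tanh'(z) = 1 - tanh(z)^2
        g1 = x.T @ dh / len(x)
        gb1 = dh.mean(axis=0)

        # plain gradient descent update
        W2 -= lr * g2; b2 -= lr * gb2
        W1 -= lr * g1; b1 -= lr * gb1

    print("final MSE:", float((err**2).mean()))

The point of the toy is the absence of structure: nothing about resonances or sampling enters anywhere, and the staircase emerges from data and iteration alone - which is exactly the trade the bitter lesson describes.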
Author Public Key
npub1hsf727dlfy55vvm5wuqwyh457uwsc24pxn5f7vxnd4lpvv8phw3sjm7r3k