In school we heard "you won't carry a calculator around all the time" as a reason for doing math on paper. OK, that was a total goof, because now we all have phones that can, potentially, serve as arbitrarily powerful calculators anytime and anywhere.
But that's not the gotcha you think it is. I don't think teachers were just blind to technological progress here; they were fully aware of one thing that they neglected to teach: it's a tool, and a tool is only as powerful as the mind that uses it. If you don't know how something works, you fall back into a magical understanding: you throw ingredients in on one side, and a solution pops out on the other. There is no way for you to tell whether the result is reasonable or the magical process completely goofed out. You will also be completely dependent on the tool, where a little understanding could show you shortcuts that make things easier for you, or at least let you feed the machine more easily.
We're now in an even worse version of this situation with LLMs. They are branded as AIs, advertised as machines that give answers, and if you don't already know the shape of the answer, and have no idea about the facts, you are completely at the mercy of a magical process that tells you fairytales. When things don't work out with that answer, you are completely adrift with no idea how to fix it. Because a tool is only as powerful as the mind that uses it.