Abhay on Nostr:
There is also some added context to this note, but I'll sit this one out if you don't like hearing opinions that contradict your own
quoting nevent1q…p20s: I love the idea of running AI models locally, but ultimately scale is a huge issue: the larger the model (i.e., the higher the parameter count), the more hardware you have to throw at it. At home, you obviously probably aren't going to have more than one GPU. Certainly not hundreds of thousands of GPUs like the big players have in their datacenters
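To make the scaling point in the quoted note concrete, here is a rough back-of-envelope sketch (my own illustrative numbers, not from the note): the memory needed just to hold a model's weights is approximately parameter count times bytes per parameter, which is why larger models quickly outgrow a single consumer GPU.

```python
def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Rough weight-only VRAM estimate in GB.

    Ignores KV cache, activations, and framework overhead, so real
    requirements are somewhat higher. Illustrative only.
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

# fp16 uses 2 bytes/param; 4-bit quantization uses ~0.5 bytes/param.
for name, b in [("7B", 7), ("70B", 70)]:
    print(f"{name}: fp16 ≈ {weight_vram_gb(b, 2):.0f} GB, "
          f"4-bit ≈ {weight_vram_gb(b, 0.5):.1f} GB")
# A 7B model quantized to 4 bits fits on a single consumer GPU,
# while a 70B model at fp16 needs multiple datacenter-class GPUs.
```

The gap between roughly 3.5 GB for a quantized 7B model and 140 GB for a 70B model at fp16 is exactly the scale problem the note describes: the former runs on one home GPU, the latter does not.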