freemymind on Nostr:
A little uninformed answer from a person with technical knowledge:
A local LLM (or AI) is a model that runs only on the resources of your own computer. Most of them need quite a powerful machine to be able to run, I assume.
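For example, a minimal sketch of what "runs only on your computer" means, assuming the llama-cpp-python package and a model file you have downloaded yourself (the file name below is just a placeholder); the prompt never leaves your machine:

    from llama_cpp import Llama  # pip install llama-cpp-python

    # Load a model file that sits on your own disk (placeholder path/name).
    llm = Llama(model_path="./models/some-7b-model.Q4_K_M.gguf")

    # The prompt is processed entirely by your own CPU/GPU, offline.
    out = llm("Explain in one sentence what a local LLM is.", max_tokens=64)
    print(out["choices"][0]["text"])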
The other option is running the models on the servers of private companies like OpenAI, Google, or X, or via duckai.
There you send your request to their server and they send you an answer. As always, you have to assume that anything you do not pay for will sell all your data.
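To illustrate the "send your request to their server" part, here is a rough sketch against an OpenAI-style HTTP API (the API key and model name are placeholders, not something from this post); the important bit is that your prompt travels over the network to hardware you do not control:

    import requests

    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": "Bearer YOUR_API_KEY"},  # placeholder key
        json={
            "model": "gpt-4o-mini",  # whatever model the provider offers
            "messages": [{"role": "user", "content": "Hello"}],
        },
    )
    # The answer comes back from their server; your prompt is now in their logs.
    print(resp.json()["choices"][0]["message"]["content"])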
So the difference is that you get a faster response from a server, since they use more powerful hardware than most people have at home. On the other hand, they will most probably sell your requests to advertisers, which is a privacy concern.
Using the models via search.brave.com or https://duckduckgo.com/?q=DuckDuckGo+AI+Chat&ia=chat&duckai=1 at least gives you some anonymity. And when using open-source AIs you at least support models that are publicly accessible 😉 (Claude, by the way, is proprietary, not open source).