bitcoiner7 nym on Nostr:
And I think that when you run ollama and oterm locally, you can use the models with privacy.
At least I tested it, and it still worked with wifi turned off.
Of course, it's not impossible that it's storing my prompts and sending them somewhere later, once I'm back online.
But it's open source, so I guess someone would find out.
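For what it's worth, the wifi-off test above can be done programmatically too. A minimal sketch, assuming Ollama is serving its REST API on the documented default port 11434: this probe asks the local server which models are pulled, which only works if everything needed is already on disk.

```python
import json
import urllib.request
import urllib.error

# Ollama's default local API endpoint (port 11434 is the documented default).
OLLAMA_URL = "http://localhost:11434/api/tags"

def local_models():
    """Return the names of locally pulled models, or None if Ollama isn't reachable."""
    try:
        with urllib.request.urlopen(OLLAMA_URL, timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: no local server is running.
        return None

if __name__ == "__main__":
    models = local_models()
    if models is None:
        print("Ollama server not reachable on localhost:11434")
    else:
        print("Local models:", models)
```

If this returns your model list while wifi is off, the prompt/response loop is being served entirely from the local machine.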