2025-06-23 15:36:41 UTC

aljaz on Nostr:

Ollama proxy with free mode: if you just need *some* LLM backend to work, I've made an Ollama/OpenAI-compatible proxy that pulls the currently free models from OpenRouter and routes requests to them. If one model fails, the next one is tried.

You can use filters to narrow the free pool, for example to say you only want Mistral models. It also supports paid models.
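The core idea (filter the model list, then fall through the candidates in order until one succeeds) can be sketched roughly like this. This is a minimal illustration, not the proxy's actual code: the model IDs and the `send_request` callable are placeholders you would wire up to OpenRouter's OpenAI-compatible API yourself.

```python
def filter_models(models, substring):
    """Keep only model IDs containing the substring, e.g. 'mistral'."""
    return [m for m in models if substring in m]

def route_with_failover(models, send_request):
    """Try each model in order; if one raises, move on to the next.

    send_request is a placeholder for whatever actually calls the
    backend (e.g. an OpenAI-compatible chat completion request).
    """
    last_err = None
    for model in models:
        try:
            return model, send_request(model)
        except Exception as err:  # a real proxy would match specific errors
            last_err = err
    raise RuntimeError("all candidate models failed") from last_err
```

With a filter like `filter_models(free_models, "mistral")`, the failover loop only ever sees the Mistral subset, which matches the behaviour described above.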

https://github.com/aljazceru/ollama-free-model-proxy