Ramin Honary on Nostr:
Be careful if you are running an #Ollama web server. According to this article, if you run Ollama as a web server, meaning you are running an LLM model locally on your server or home computer but keep a web portal open to it so people in your organization or home can connect to your server and ask the LLM questions, the Ollama web server is apparently full of security holes. The article mentions two problems:
It can leave your computer vulnerable to DDoS attacks from the public Internet
The push/pull feature for uploading/downloading models is vulnerable to man-in-the-middle attacks (possibly? that is my understanding, at least)
Quoting the article:
the API can be exposed to the public internet; its functions to push, pull, and delete models can put data at risk and unauthenticated users can also bombard models with requests, potentially causing costs for cloud computing resource owners. Existing vulnerabilities within Ollama could also be exploited.
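If you want a quick sense of whether your own instance has this problem, you can probe it from a machine outside your network. Here is a minimal sketch in Python; the hostname is a placeholder for your server's public address, Ollama listens on port 11434 by default, and /api/tags and /api/version are standard Ollama API endpoints:

```python
import requests  # third-party; pip install requests

# Hypothetical public address of the Ollama server you want to audit.
# Replace with your own host; Ollama's default port is 11434.
OLLAMA_URL = "http://my-ollama-host.example.com:11434"

def check_exposure(base_url: str) -> None:
    """Probe a few read-only Ollama endpoints without credentials.

    If these answer with HTTP 200, anyone who can reach this address
    can also hit the push, pull, delete, and generate endpoints,
    since the stock Ollama API has no authentication layer.
    """
    for path in ("/", "/api/tags", "/api/version"):
        try:
            resp = requests.get(base_url + path, timeout=5)
        except requests.RequestException as exc:
            print(f"{path}: unreachable ({exc})")
            continue
        print(f"{path}: HTTP {resp.status_code}")
        if resp.ok:
            print("  -> responds without authentication")

if __name__ == "__main__":
    check_exposure(OLLAMA_URL)
```

If the probe succeeds from outside your network, the usual fix is to bind Ollama back to loopback (it listens on 127.0.0.1:11434 by default; the OLLAMA_HOST environment variable controls the bind address) and put an authenticating reverse proxy in front of it for the people who legitimately need access.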
#tech #AI #Ollama #OpSec #ComputerSecurity