How to secure the API with api key #849
The solution to this for the time being would be to add an authenticating proxy in front of Ollama (e.g., nginx with basic auth).
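For example, a minimal nginx config along those lines (the hostname and cert paths are placeholders; it assumes Ollama is listening on its default 127.0.0.1:11434 and that you've created a password file with `htpasswd -c /etc/nginx/.htpasswd youruser`) might look like:

```nginx
# Hypothetical sketch: nginx as an authenticating reverse proxy for Ollama.
server {
    listen 443 ssl;
    server_name ollama.example.com;                       # placeholder hostname

    ssl_certificate     /etc/nginx/certs/fullchain.pem;   # placeholder paths
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location / {
        auth_basic           "Ollama";
        auth_basic_user_file /etc/nginx/.htpasswd;        # created with htpasswd
        proxy_pass           http://127.0.0.1:11434;
    }
}
```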
Here's how you add HTTP Basic Auth with Caddy:
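A minimal Caddyfile sketch of this (the hostname and username are placeholders; the password hash is generated with `caddy hash-password`):

```
# Hypothetical sketch: Caddy with HTTP Basic Auth in front of Ollama.
ollama.example.com {
    basicauth {
        # hash generated with: caddy hash-password --plaintext 'your-password'
        alice $2a$14$placeholder...
    }
    reverse_proxy localhost:11434
}
```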
And if you want to run it as a system service, or without HTTPS, or need other details, I've got a bunch of snippets up at https://webinstall.dev/caddy.
With Caddy you can also have multiple users. You can create a little interface to add users and serve it with the tool. That makes using Ollama much safer.
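For what it's worth, multiple users are just additional lines in the same block (names and hashes below are placeholders):

```
basicauth {
    alice $2a$14$placeholder...
    bob   $2a$14$placeholder...
}
```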
Why can't the authorization mechanism be built into the Ollama server?
If you are interested, I have built a proxy server for Ollama with several added features. Ollama is a very powerful and fast generation tool, but I think the new proxy takes its potential to a new level. As a client I use lollms (obviously) :)
By default, does it require an API key to work? How do I find this API key, or does the web UI I have working with Ollama not use one? I'm on Linux and use http://localhost:8080/, but of course it has a backend port, and the Python code I'm trying to use as an agent is asking for an API key. Can anyone help clarify, please? P.S. I may have solved it: `cd ~/Documents/Scripts/AI/Ollama/open-webui/backend` and then `cat .webui_secret_key`
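For the agent side: if the backend exposes an OpenAI-compatible endpoint (Ollama itself serves one at `/v1`), the key is typically passed like this. A sketch with placeholder URL, key, and model name:

```python
# Hypothetical sketch: an OpenAI-compatible client pointed at a local server.
# Ollama ignores the api_key value unless an auth proxy in front of it checks one.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # or your Open WebUI backend URL
    api_key="placeholder-key",             # whatever your proxy/UI expects
)

resp = client.chat.completions.create(
    model="llama2",                        # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)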
4532: Add `url` and `api_key` to ollama r=ManyTheFish a=dureuill

See the [Usage page](https://meilisearch.notion.site/v1-8-AI-search-API-usage-135552d6e85a4a52bc7109be82aeca42#5c77ef49e78e43388c1d3d5429151357)

### Motivation

- Before this PR, the url for ollama is only read from the environment. This is a needless restriction that will be troublesome in settings where passing an environment variable is complex or impossible (e.g., the Cloud).
- Before this PR, ollama did not support an api_key. While ollama does not natively support API keys, [a common practice](ollama/ollama#849) is to put a publicly accessible ollama server behind a proxy to support authentication.

### Skip changelog

ollama embedder was added to v1.8

Co-authored-by: Louis Dureuil <louis@meilisearch.com>
@coolaj86 Hey, I tried to use your steps. I am running it on macOS, but I get this error:
Here is my config:
The environment variables do not match in this example. Try replacing
I am using this configuration, and it does prompt me to enter a username and password; however, when I do so it gives:
Logs when I enter the username and password
Sorry, this doesn't solve the precise issue discussed in the previous couple of messages, but as an alternative solution for the subject of this thread: I have developed a quickly deployable solution so you don't have to deal with nginx.
I was playing around for a few days to get the Ollama Go server and the llama.cpp server to work with native API keys. For supporting multiple API keys stored in a config file, check out this repo:
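The repo link itself isn't shown above, but to illustrate the idea, here is a hypothetical minimal sketch (not that repo's actual code) of a reverse proxy that loads allowed keys from a JSON file and forwards authorized requests to Ollama. The file name, key format, and port are all assumptions:

```python
# Hypothetical sketch: an API-key-checking reverse proxy in front of Ollama.
import json

import requests
from flask import Flask, Response, request

app = Flask(__name__)
OLLAMA_URL = "http://127.0.0.1:11434"

# keys.json is an assumed format, e.g. {"keys": ["key-one", "key-two"]}
with open("keys.json") as f:
    ALLOWED_KEYS = set(json.load(f)["keys"])

@app.route("/", defaults={"path": ""}, methods=["GET", "POST"])
@app.route("/<path:path>", methods=["GET", "POST"])
def proxy(path):
    # Expect the OpenAI-style "Authorization: Bearer <key>" header.
    auth = request.headers.get("Authorization", "")
    token = auth.removeprefix("Bearer ")
    if not auth.startswith("Bearer ") or token not in ALLOWED_KEYS:
        return Response("Unauthorized", status=401)
    # Forward the request to Ollama (streaming is not handled in this sketch).
    upstream = requests.request(
        request.method,
        f"{OLLAMA_URL}/{path}",
        data=request.get_data(),
        headers={"Content-Type": request.headers.get("Content-Type", "")},
    )
    return Response(upstream.content, status=upstream.status_code)

if __name__ == "__main__":
    app.run(port=8000)
```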
It would be great for Ollama to directly support API tokens with SSL!
We have deployed an Ollama container with the zephyr model inside Kubernetes. As a best practice, we want to secure the endpoints via an API key, similar to OpenAI. Is there any way to do this?
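Since Ollama has no native key support, the usual pattern in Kubernetes is to enforce auth at the ingress layer. A sketch using ingress-nginx's basic-auth annotations (the names, host, and secret are placeholders; for a true OpenAI-style bearer key you would need an auth sidecar or the `auth-url` external-auth annotation instead):

```yaml
# Hypothetical sketch: ingress-nginx enforcing basic auth in front of Ollama.
# Create the secret first, e.g.:
#   htpasswd -c auth youruser
#   kubectl create secret generic ollama-basic-auth --from-file=auth
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: ollama
  annotations:
    nginx.ingress.kubernetes.io/auth-type: basic
    nginx.ingress.kubernetes.io/auth-secret: ollama-basic-auth
    nginx.ingress.kubernetes.io/auth-realm: "Authentication Required"
spec:
  ingressClassName: nginx
  rules:
    - host: ollama.example.com          # placeholder host
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: ollama            # assumed Service name for the Ollama pod
                port:
                  number: 11434
```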