How to secure the API with api key #849

Closed
ajasingh opened this issue Oct 20, 2023 · 12 comments

Comments

@ajasingh

We have deployed an Ollama container with the zephyr model inside Kubernetes. As a best practice, we want to secure the endpoints via an API key, similar to OpenAI. Is there any way to do this?

@BruceMacD
Contributor

For the time being, the solution would be to add an authenticating proxy in front of Ollama (e.g., nginx with basic auth).
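
For example, a minimal nginx sketch along these lines could sit in front of Ollama (the hostname, certificate paths, and username are placeholders; the credentials file is generated with htpasswd from apache2-utils):

    # create the credentials file first:
    #   htpasswd -Bc /etc/nginx/.htpasswd apitoken
    server {
        listen 443 ssl;
        server_name api.example.com;
        ssl_certificate     /etc/nginx/certs/api.example.com.crt;
        ssl_certificate_key /etc/nginx/certs/api.example.com.key;

        location / {
            auth_basic "Ollama";
            auth_basic_user_file /etc/nginx/.htpasswd;
            proxy_pass http://localhost:11434;
        }
    }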

@coolaj86

coolaj86 commented Oct 21, 2023

Here's how you add HTTP Basic Auth with caddy as a reverse proxy to localhost:11434, and also handle HTTPS automatically:

  1. Install caddy
    # Mac, Linux
    curl https://webi.sh/caddy | sh
    
    # Windows
    curl.exe https://webi.ms/caddy | powershell
  2. Put your password (which could be an API Token) in a password.txt
  3. Digest the password
    caddy hash-password < ./password.txt
  4. Put the username and digest in an ENV file
    caddy.env:
    BASIC_AUTH_USER='apitoken'
    BASIC_USER_AUTH='$2a$14$sI1j0RbhzKHMZ4cHU8otHOkB3Dgl9egF2D.CXB6C0/Qk5dtaMHS/u'
    
  5. Create a Caddyfile with basic auth using the ENVs
    api.example.com {
        handle /* {
            basicauth {
                {env.BASIC_AUTH_USERNAME} {env.BASIC_AUTH_DIGEST}
            }
            reverse_proxy localhost:11434
        }
    }
  6. Run caddy
    caddy run --config ./Caddyfile --envfile ./caddy.env

And if you want to run it as a system service, run it without HTTPS, or need other details, I've got a bunch of snippets up at https://webinstall.dev/caddy.
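
Once caddy is running, you can sanity-check the setup from a shell. This is a sketch assuming the username apitoken and the password from password.txt above; /api/tags is Ollama's model-listing endpoint:

    # without credentials this should return 401 Unauthorized
    curl -i https://api.example.com/api/tags

    # with credentials it should return the model list
    curl -u "apitoken:$(cat ./password.txt)" https://api.example.com/api/tags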

@mxyng closed this as not planned on Oct 25, 2023
@ParisNeo
Contributor

ParisNeo commented Jan 9, 2024

With caddy you can also have multiple users. You can create a little interface to add users and serve it with the tool. That makes using Ollama much safer.
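
For instance, Caddy's basicauth block accepts one username/hash pair per line, so a two-user sketch might look like this (the names and hashes are placeholders, generated with caddy hash-password as above):

    api.example.com {
        handle /* {
            basicauth {
                alice $2a$14$...hash-for-alice...
                bob $2a$14$...hash-for-bob...
            }
            reverse_proxy localhost:11434
        }
    }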

@DimIsaev

DimIsaev commented Jan 15, 2024

Why can't the authorization mechanism be built into the Ollama server?

@ParisNeo
Contributor

ParisNeo commented Jan 16, 2024

If you are interested, I have built a proxy server for ollama:
https://github.com/ParisNeo/ollama_proxy_server

It offers these features:
1. Authentication with a user:key Bearer token (see the example request after this list).
2. Adding new users.
3. Logging access to the service (useful for statistics). By default the proxy doesn't log your requests; it only logs that you requested a generation, which is useful for statistics and for dimensioning the network.
4. Routing to multiple Ollama instances. For example, you can have multiple Ollama servers behind a single endpoint that takes care of dispatching generation requests to the different servers. Each server has its own generation queue, and the proxy always forwards a request to the server with the fewest requests in its queue. This makes it possible to serve multiple clients simultaneously while guaranteeing minimum latency.
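
For example, a request through the proxy might look like this sketch (the port, user name, key, and model are placeholders; it assumes the user:key pair is in the proxy's authorized users list):

    curl http://localhost:8000/api/generate \
      -H "Authorization: Bearer alice:mykey" \
      -d '{"model": "zephyr", "prompt": "Hello"}'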

Ollama is a very powerful and fast generation tool, but I think the proxy takes its potential to a new level.

As a client I use lollms (obviously) :)
https://github.com/ParisNeo/lollms-webui

@jamieduk

jamieduk commented Mar 11, 2024

By default, does it require an API key to work? How do I find this API key, or does the web UI that I have working with Ollama not use one? I'm on Linux and use http://localhost:8080/, but of course it has a backend port, and the Python code I'm trying to use as an agent is asking for an API key. Can anyone help clarify, please?

P.S. I may have solved it:

    cd ~/Documents/Scripts/AI/Ollama/open-webui/backend
    cat .webui_secret_key

meili-bors bot added a commit to meilisearch/meilisearch that referenced this issue Mar 28, 2024
4532: Add `url` and `api_key` to ollama r=ManyTheFish a=dureuill

See [Usage page](https://meilisearch.notion.site/v1-8-AI-search-API-usage-135552d6e85a4a52bc7109be82aeca42#5c77ef49e78e43388c1d3d5429151357)

### Motivation

- Before this PR, the url for ollama is only read from the environment. This is a needless restriction that will be troublesome in settings where passing an environment variable is complex or impossible (e.g., the Cloud)
- Before this PR, ollama did not support an api_key. While ollama does not natively support API keys, [a common practice](ollama/ollama#849) is to put a publicly accessible ollama server behind a proxy to support authentication.

### Skip changelog

ollama embedder was added to v1.8

Co-authored-by: Louis Dureuil <louis@meilisearch.com>
@alx-xlx

alx-xlx commented May 30, 2024

@coolaj86 Hey, I tried to follow your steps. I am running it on macOS, but I get this error:

Error: loading initial config: loading new config: loading http app module: provision http: server srv0: setting up route handlers: route 0: loading handler modules: position 0: loading module 'subroute': provision http.handlers.subroute: setting up subroutes: route 0: loading handler modules: position 0: loading module 'subroute': provision http.handlers.subroute: setting up subroutes: route 0: loading handler modules: position 0: loading module 'authentication': provision http.handlers.authentication: loading authentication providers: module name 'http_basic': provision http.authentication.providers.http_basic: account 0: username and password are required

Here is my config:

BASIC_AUTH_USER='apitoken'
BASIC_USER_AUTH='$2a$14$Z7jW916wS4WAmpUBBuuEreMu51s09pC1I/exwdaHtPENtsxAhuniK'
api-ollama.example.org {
    tls {
        dns cloudflare _Q*************************-VP
    }
    handle /* {
        basicauth {
            {env.BASIC_AUTH_USERNAME} {env.BASIC_AUTH_DIGEST}
        }
        reverse_proxy localhost:11434
    }
}

@tandriamil

The environment variables do not match in this example. Try replacing env.BASIC_AUTH_USERNAME with env.BASIC_AUTH_USER, and env.BASIC_AUTH_DIGEST with env.BASIC_USER_AUTH. 😉

@alx-xlx

alx-xlx commented May 31, 2024

I am using this configuration, and it does prompt me to enter a username and password; however, when I do so it returns a 403 error.

api-ollama.example.org {
    tls {
        dns cloudflare _QRi******************5u-VP
    }
    handle /* {
        basicauth {
            {env.BASIC_AUTH_USER} {env.BASIC_USER_AUTH}
        }
        reverse_proxy localhost:11434
    }
    log {
        output file /Users/USERNAME/Server/data/customcaddy/api-ollama.log {
            roll_size 100mb
            roll_keep 5
            roll_keep_for 720h
        }
    }
}

Logs when I enter the username and password:

2024/05/31 09:36:37.104	error	http.log.access.log0	handled request	{
    "request": {
        "remote_ip": "192.168.0.122",
        "remote_port": "62601",
        "client_ip": "192.168.0.122",
        "proto": "HTTP/2.0",
        "method": "GET",
        "host": "api-ollama.example.org",
        "uri": "/",
        "headers": {
            "User-Agent": [
                "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/125.0.0.0 Safari/537.36"
            ],
            "Sec-Fetch-User": [
                "?1"
            ],
            "Sec-Fetch-Dest": [
                "document"
            ],
            "Sec-Ch-Ua-Mobile": [
                "?0"
            ],
            "Sec-Ch-Ua": [
                "\"Google Chrome\";v=\"125\", \"Chromium\";v=\"125\", \"Not.A/Brand\";v=\"24\""
            ],
            "Sec-Fetch-Mode": [
                "navigate"
            ],
            "Accept-Encoding": [
                "gzip, deflate, br, zstd"
            ],
            "Save-Data": [
                "on"
            ],
            "Authorization": [],
            "Accept-Language": [
                "en-US,en;q=0.9,hi;q=0.8"
            ],
            "Priority": [
                "u=0, i"
            ],
            "Sec-Ch-Ua-Platform": [
                "\"Windows\""
            ],
            "Upgrade-Insecure-Requests": [
                "1"
            ],
            "Accept": [
                "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7"
            ],
            "Sec-Fetch-Site": [
                "none"
            ],
            "Cache-Control": [
                "max-age=0"
            ]
        },
        "tls": {
            "resumed": false,
            "version": 772,
            "cipher_suite": 4865,
            "proto": "h2",
            "server_name": "api-ollama.example.org"
        }
    },
    "bytes_read": 0,
    "user_id": "admin",
    "duration": 0.994463625,
    "size": 0,
    "status": 403,
    "resp_headers": {
        "Date": [
            "Fri, 31 May 2024 09:36:37 GMT"
        ],
        "Content-Length": [
            "0"
        ],
        "Server": [
            "Caddy"
        ],
        "Alt-Svc": [
            "h3=\":443\"; ma=2592000"
        ]
    }
}
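
To isolate where a 403 like this comes from, it can help to take the browser out of the loop and hit the endpoint with curl directly (a generic debugging sketch with placeholder credentials):

    # -v prints the TLS handshake and headers; -u sends the Basic credentials
    curl -v -u "apitoken:YOUR_PASSWORD" https://api-ollama.example.org/api/tags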

@dheavy

dheavy commented Jun 2, 2024

Sorry, this isn't solving the precise issue discussed in the previous couple of messages, but as an alternative solution for the subject of this thread: I have developed a quickly deployable solution that avoids having to deal with nginx.

https://github.com/dheavy/ollama-secure-proxy

@bartolli

bartolli commented Jun 6, 2024

I spent a few days trying to get the Ollama Go code and the llama.cpp server to work with native api_key authentication, but didn't have much luck with a custom build. It seems the Ollama build does not rebuild llama.cpp, or at least I couldn't figure out how. So I created a Docker image with a Caddy server to securely manage authentication and proxy requests to a local Ollama instance. You can choose between two methods: environment-based API key validation, or multiple API keys stored in a .conf file for extra security. Check out these repos:

For using OLLAMA_API_KEY as a local environment variable:
https://github.com/bartolli/ollama-bearer-auth

For supporting multiple API keys stored in a config file, check out this repo:
https://github.com/bartolli/ollama-bearer-auth-caddy
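
With the environment-variable variant, a request might look like this sketch (it assumes the container's Caddy listener is published on port 8080, which is a placeholder, and that the key is sent as a Bearer token):

    export OLLAMA_API_KEY=your-secret-key
    curl http://localhost:8080/api/tags \
      -H "Authorization: Bearer $OLLAMA_API_KEY"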

@chigkim

chigkim commented Jun 22, 2024

It would be great for Ollama to directly support API tokens with SSL!
