Closed
Labels: bug (Something isn't working)
Description
What happened?
This happened after I pulled the latest image. I couldn't fix it with my existing containers, so here's what I did:
- Verified LiteLLM is still available as an API → ✅
- Listed models via `/v1/models` and saved the response as a backup (roughly as in the sketch after this list)
- Removed the running container and deleted the image: `docker stop litellm && docker rm litellm && docker image rm ghcr.io/berriai/litellm:main-latest`
- Recreated the container: `docker compose up -d`
- Disabled Pi-hole on the network, just in case
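For reference, a minimal sketch of those first two checks, assuming the proxy is reachable on `localhost:4000` and the master key is exported as `MASTER_KEY` (both assumptions, adjust to your setup):

```bash
# 1. Verify the proxy still answers as an API (liveliness health endpoint).
curl -s http://localhost:4000/health/liveliness

# 2. List the configured models and keep the response as a backup.
#    Assumes $MASTER_KEY holds the key from litellm.env.
curl -s http://localhost:4000/v1/models \
  -H "Authorization: Bearer $MASTER_KEY" \
  -o models-backup.json
```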
The DB seems to work fine, but even after all this the UI remains inaccessible: the container prints the log output below, and the browser fails to load the UI's static assets (the 404s at the end of the log).
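The "DB seems fine" check was along these lines, reusing the same probe the compose healthcheck below runs (a sketch, assuming the container names from my compose file):

```bash
# Manually run the same pg_isready probe the healthcheck uses.
# Should report "accepting connections" if the DB is up.
docker exec litellm_db pg_isready -d litellm -U llmproxy
```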

My `compose.yaml`:
```yaml
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    container_name: litellm
    env_file: litellm.env
    volumes:
      - ./data/config.yaml:/app/config.yaml
    ports:
      - 4000:4000
    command: ['--config=/app/config.yaml']
    restart: unless-stopped

  litellm_db:
    image: postgres
    container_name: litellm_db
    env_file: litellm.env
    ports:
      - 5432:5432
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -d litellm -U llmproxy"]
      interval: 1s
      timeout: 5s
      retries: 10
    restart: unless-stopped
```
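Since the DB service defines a healthcheck, its state can be read back like this (standard Docker commands, shown only as an illustration):

```bash
# Service status; the DB should show "(healthy)" once pg_isready succeeds.
docker compose ps

# Or query the health status of the DB container directly.
docker inspect --format '{{.State.Health.Status}}' litellm_db
```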
My `litellm.env` file:
```
DATABASE_URL="postgresql://llmproxy:*******@litellm_db:5432/litellm"
STORE_MODEL_IN_DB=True
MASTER_KEY="*******"
POSTGRES_DB=litellm
POSTGRES_USER=llmproxy
POSTGRES_PASSWORD=*******
# Ollama
OLLAMA_API_BASE=*******
OLLAMA_API_KEY=""
MISTRAL_API_KEY=*******
LANGFUSE_PUBLIC_KEY=pk-*******
LANGFUSE_SECRET_KEY=sk-*******
LANGFUSE_HOST=*******
```
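Both services read this file via `env_file:`, so the values can be confirmed inside the running proxy container (an illustration, not part of my original steps; output contains secrets, so don't paste it unmasked):

```bash
# Confirm the env file was actually injected into the proxy container.
docker exec litellm env | grep -E 'DATABASE_URL|STORE_MODEL_IN_DB'
```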
My LiteLLM config:
```yaml
general_settings:
  master_key: os.environ/MASTER_KEY
  pass_through_endpoints:
    - path: "/mistral/v1/ocr"
      target: "https://api.mistral.ai/v1/ocr"
      headers:
        Authorization: "bearer os.environ/MISTRAL_API_KEY"
        content-type: application/json
        accept: application/json
      forward_headers: True

model_list: []
```
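To illustrate what that pass-through block wires up: a request like the following should be forwarded to `https://api.mistral.ai/v1/ocr` with the configured Authorization header (a sketch; `ocr-request.json` is a placeholder for whatever payload Mistral's OCR API expects):

```bash
# Hit the LiteLLM pass-through route; LiteLLM forwards it to Mistral
# and injects the Authorization header from MISTRAL_API_KEY.
curl -s http://localhost:4000/mistral/v1/ocr \
  -H "Content-Type: application/json" \
  -d @ocr-request.json
```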
Relevant log output
```
litellm | INFO: Shutting down
litellm | INFO: Waiting for application shutdown.
litellm | INFO: Application shutdown complete.
litellm | INFO: Finished server process [1]
litellm | prisma:warn Prisma doesn't know which engines to download for the Linux distro "wolfi". Falling back to Prisma engines built "debian".
litellm | Please report your experience by creating an issue at https://github.com/prisma/prisma/issues so we can add your distro to the list of known supported distros.
litellm | Prisma schema loaded from schema.prisma
litellm | Datasource "client": PostgreSQL database "litellm", schema "public" at "litellm_db:5432"
litellm |
litellm | The database is already in sync with the Prisma schema.
litellm |
litellm | Running generate... - Prisma Client Python (v0.11.0)
litellm |
litellm | Some types are disabled by default due to being incompatible with Mypy, it is highly recommended
litellm | to use Pyright instead and configure Prisma Python to use recursive types. To re-enable certain types:
litellm |
litellm | generator client {
litellm |   provider             = "prisma-client-py"
litellm |   recursive_type_depth = -1
litellm | }
litellm |
litellm | If you need to use Mypy, you can also disable this message by explicitly setting the default value:
litellm |
litellm | generator client {
litellm |   provider             = "prisma-client-py"
litellm |   recursive_type_depth = 5
litellm | }
litellm |
litellm | For more information see: https://prisma-client-py.readthedocs.io/en/stable/reference/limitations/#default-type-limitations
litellm |
litellm | ✔ Generated Prisma Client Python (v0.11.0) to ./../../prisma in 388ms
litellm |
litellm | INFO: Started server process [1]
litellm | INFO: Waiting for application startup.
litellm | INFO: Application startup complete.
litellm | INFO: Uvicorn running on http://0.0.0.0:4000 (Press CTRL+C to quit)
litellm |
litellm | #------------------------------------------------------------#
litellm | #                                                            #
litellm | #       'It would help me if you could add...'               #
litellm | #       https://github.com/BerriAI/litellm/issues/new        #
litellm | #                                                            #
litellm | #------------------------------------------------------------#
litellm |
litellm | Thank you for using LiteLLM! - Krrish & Ishaan
litellm |
litellm | Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
litellm |
litellm | INFO: 192.168.1.x:58284 - "GET /ui/ HTTP/1.1" 200 OK
litellm | INFO: 192.168.1.x:58284 - "GET /litellm2/_next/static/chunks/webpack-a426aae3231a8df1.js HTTP/1.1" 404 Not Found
litellm | INFO: 192.168.1.x:58285 - "GET /litellm2/_next/static/chunks/fd9d1056-205af899b895cbac.js HTTP/1.1" 404 Not Found
litellm | INFO: 192.168.1.x:58284 - "GET /litellm2/_next/static/chunks/117-c4922b1dd81b62ce.js HTTP/1.1" 404 Not Found
litellm | INFO: 192.168.1.x:58286 - "GET /litellm2/_next/static/chunks/main-app-4f7318ae681a6d94.js HTTP/1.1" 404 Not Found
```
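The failure is visible without a browser, too; something like this reproduces the 200-for-`/ui/` but 404-for-assets pattern from the log (a sketch, assuming the proxy on `localhost:4000`; the chunk filename is copied from the log above):

```bash
# The UI page itself loads...
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:4000/ui/

# ...but the static chunks it references come back 404.
curl -s -o /dev/null -w '%{http_code}\n' \
  http://localhost:4000/litellm2/_next/static/chunks/webpack-a426aae3231a8df1.js
```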
Are you a ML Ops Team?
No
What LiteLLM version are you on?
always the latest main-stable
Twitter / LinkedIn details
No response