litellm

An OpenAI-compatible proxy server (LLM gateway) to call 100+ LLMs through a unified interface, track spend, and set budgets per virtual key/user.
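
The curl examples below assume a proxy is already listening on 127.0.0.1:4000 with model aliases such as claude-3.5 and gpt-4o defined in its config, and that LITELLM_KEY is the proxy's master key (management routes like /user/new and /key/generate expect it). A minimal sketch of such a setup; the file name, model mappings, and key value here are assumptions, not part of this page:

$ cat > config.yaml <<'EOF'
model_list:
  - model_name: claude-3.5                  # alias used in the requests below
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
general_settings:
  master_key: sk-xxxxxx                     # the value assigned to LITELLM_KEY below
EOF
$ litellm --config config.yaml --port 4000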

$ LITELLM_KEY=sk-xxxxxx

$ curl -H "Authorization: Bearer $LITELLM_KEY" http://127.0.0.1:4000/v1/models

$ curl -H "Authorization: Bearer $LITELLM_KEY" http://127.0.0.1:4000/model/info
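
Both calls require a valid key: /v1/models lists the model aliases the key is allowed to use in the OpenAI-compatible list format, while /model/info adds LiteLLM's per-model metadata (e.g. the underlying provider model). An illustrative, abbreviated /v1/models response:

{"object": "list", "data": [{"id": "claude-3.5", "object": "model"}, {"id": "gpt-4o", "object": "model"}]}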

$ curl http://127.0.0.1:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LITELLM_KEY" \
  -d '{
    "model": "claude-3.5",
    "response_format": { "type": "json_object" },
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant designed to output JSON."
      },
      {
        "role": "user",
        "content": "Who won the world series in 2020?"
      }
    ]
  }'

To create virtual keys, first create a user, then generate a key scoped to that user

$ curl --location 'http://127.0.0.1:4000/user/new' \
       --header "Authorization: Bearer $LITELLM_KEY" \
       --header 'Content-Type: application/json' \
       --data-raw '{"user_email": "username@example.com"}'
{
  "expires": "2023-12-22T09:53:13.861000Z",
  "user_id": "my-unique-id",
  "max_budget": 0.0
}

$ curl 'http://127.0.0.1:4000/key/generate' \
       --header "Authorization: Bearer $LITELLM_KEY" \
       --header 'Content-Type: application/json' \
       --data-raw '{"models": ["gpt-4o", "claude-3.5"], "user_id": "my-unique-id"}'

$ curl -H "Authorization: Bearer $LITELLM_KEY" 'http://127.0.0.1:4000/user/info?user_id=my-unique-id'
{
  "spend": 0
}
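
The /key/generate response should contain the new virtual key (a "key" field with an sk- prefix; the placeholder below stands in for it). End users call the proxy with that key instead of the master key, and their requests are restricted to the key's models and counted against the spend that /user/info reports. A sketch:

$ VIRTUAL_KEY=sk-...   # the "key" value returned by /key/generate
$ curl http://127.0.0.1:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $VIRTUAL_KEY" \
  -d '{"model": "claude-3.5", "messages": [{"role": "user", "content": "Hello"}]}'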

Install litellm manually

# https://github.com/BerriAI/litellm/blob/v1.58.2/pyproject.toml
$ pipx install 'litellm[proxy]==1.58.2'
$ source ~/.local/pipx/venvs/litellm/bin/activate
$ pip install prisma==0.11.0
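
The extra prisma package is needed because the proxy's virtual keys, users, and spend tracking are stored in a Prisma-managed database; without one, the /user and /key endpoints above have nowhere to persist state. A sketch of starting the manually installed proxy against a local Postgres database (the connection string and config file name are assumptions):

$ export DATABASE_URL="postgresql://user:password@localhost:5432/litellm"
$ litellm --config config.yaml --port 4000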