# LeanLLM

A lightweight Python wrapper around LiteLLM with built-in usage tracking and label support.

## Installation

```bash
pip install leanllm
```

Or install locally for development:

```bash
pip install -e .
```

## Quickstart

```python
from leanllm import LeanLLM

client = LeanLLM(api_key="sk-...")

response = client.chat(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    labels={"team": "backend", "feature": "onboarding"},
)

print(response.choices[0].message.content)
```

## Labels

Every request accepts an optional `labels` dict. Labels are attached to the usage event logged for that call, making it easy to slice costs and latency by team, feature, environment, or any other dimension you define.

## Usage logs

Each call appends one JSON object per line to `llm_logs.json` (configurable via the `LEANLLM_LOG_FILE` / `LEANLLM_LOG_DIR` env vars):

```json
{"model": "gpt-4o-mini", "prompt_tokens": 12, "completion_tokens": 8, "total_tokens": 20, "latency_ms": 432.1, "labels": {"team": "backend"}}
```
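Because the log is plain line-delimited JSON, it can be analyzed with the standard library alone. The sketch below sums `total_tokens` per value of one label key, assuming the field names shown in the sample line above; `tokens_by_label` is a hypothetical helper, not part of the LeanLLM API.

```python
import json
from collections import defaultdict

def tokens_by_label(path="llm_logs.json", key="team"):
    """Sum total_tokens for each value of the given label key.

    Assumes one JSON usage event per line, with "total_tokens" and
    "labels" fields as in the sample log line above. Events without
    the requested label key are skipped.
    """
    totals = defaultdict(int)
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            event = json.loads(line)
            value = event.get("labels", {}).get(key)
            if value is not None:
                totals[value] += event.get("total_tokens", 0)
    return dict(totals)
```

The same pattern works for `latency_ms` or any other logged field: swap the summed key, or collect values into a list to compute percentiles per label.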
