# relay

`relay` is a lightweight Python SDK that gives you a single, predictable interface for working with large language model (LLM) providers. You plug in a provider name and an API key; everything else is handled for you.

Out of the box, Relay knows how to talk to OpenAI’s Chat Completions API and Anthropic’s Messages API, and the provider system is designed to expand without changing the surface area of your code.

## Quick start

```python
from relay import LLM

client = LLM(name="openai:gpt-4o-mini", api_key="sk-...")
result = client.chat([{"role": "user", "content": "Write a haiku about debugging."}])
print(result.content)
```

If you omit the model (e.g. `LLM(name="openai", api_key="...")`), a sensible provider default is used automatically. Responses always include the selected model, the provider identifier, and the raw provider payload for advanced use cases.
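To make that response shape concrete, here is a minimal sketch of such a record. Only `content` is confirmed by the quick-start example above; the other field names (`model`, `provider`, `raw`) are illustrative assumptions, not the SDK's documented attribute names:

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class ChatResult:
    """Illustrative sketch of the response record described above."""
    content: str          # the generated text
    model: str            # the model that actually served the request
    provider: str         # provider identifier, e.g. "openai"
    raw: dict[str, Any]   # untouched provider payload for advanced use

# A hypothetical response, populated the way the README describes:
result = ChatResult(
    content="Bugs hide in plain sight",
    model="gpt-4o-mini",
    provider="openai",
    raw={"id": "chatcmpl-123", "usage": {"total_tokens": 17}},
)
print(result.provider, result.model)
```

Keeping the raw payload alongside the normalized fields means provider-specific extras (token usage, stop reasons) stay reachable without widening the common interface.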

Want to swap providers? Just change the identifier:

```python
client = LLM(name="anthropic:claude-3-5-sonnet-20241022", api_key="sk-ant-...")
```

See the `examples/` directory for ready-to-run scripts using `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`.
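The example scripts read credentials from those environment variables rather than hard-coding them. A minimal version of that lookup might look like this (the `key_from_env` helper is illustrative, not part of the SDK):

```python
import os
from typing import Optional

def key_from_env(*names: str) -> Optional[str]:
    """Return the first non-empty API key found among the given env vars."""
    for name in names:
        value = os.environ.get(name)
        if value:
            return value
    return None

# Pick up whichever provider key is configured in the shell.
api_key = key_from_env("OPENAI_API_KEY", "ANTHROPIC_API_KEY")
```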

## Why another SDK?

- **Uniform client** – keep the same front-end layer while swapping providers in configuration.
- **Zero provider boilerplate** – pass a provider identifier (e.g. `openai`) and an API key; the SDK selects endpoints, models, and headers for you.
- **Extensible design** – new providers implement the shared `BaseProvider` interface; the rest of the stack stays unchanged.
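As a rough illustration of that extension point: the `BaseProvider` name comes from the README, but the method names and signatures below are assumptions, not the SDK's actual interface.

```python
from abc import ABC, abstractmethod
from typing import Any, List, Optional

class BaseProvider(ABC):
    """Sketch of a shared provider interface; details are assumed."""

    def __init__(self, api_key: str, model: Optional[str] = None):
        self.api_key = api_key
        # Fall back to the provider's default when the caller omits a model.
        self.model = model or self.default_model()

    @abstractmethod
    def default_model(self) -> str:
        """Model used when the caller does not specify one."""

    @abstractmethod
    def chat(self, messages: List[dict]) -> str:
        """Send chat messages and return the generated text."""

# A toy provider only has to fill in the two abstract methods:
class EchoProvider(BaseProvider):
    def default_model(self) -> str:
        return "echo-1"

    def chat(self, messages: List[dict]) -> str:
        return messages[-1]["content"]

p = EchoProvider(api_key="test")
print(p.model)                                      # echo-1
print(p.chat([{"role": "user", "content": "hi"}]))  # hi
```

Because the front-end client only talks to this interface, registering a new provider never changes calling code.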

## Roadmap

- Add more providers (e.g. Mistral, Gemini, local runtimes).
- Support streaming interfaces and tool/function calling.
- Ship sync and async variants for production workloads.

## Development

```shell
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
pytest
```

Contributions, ideas, and feedback are welcome!

See `DEVELOPER_GUIDELINES.md` for coding standards, testing expectations, and the release checklist.
