
feat: add support for OpenAI/Anthropic-compatible APIs and Vertex AI …#200

Open
XinyueZ wants to merge 3 commits into withceleste:main from XinyueZ:feat/openai-anthropic-compat

Conversation


@XinyueZ XinyueZ commented Feb 26, 2026

…documentation

🤔 My idea is that if a model doesn't have a dedicated provider yet, but its service is compatible with the OpenAI or Anthropic API, this PR documents how that model can still be used.

Add environment variable configuration for custom base URLs:

  • OPENAI_BASE_URL: Support OpenAI-compatible APIs (vLLM, LocalAI, Ollama)
  • ANTHROPIC_BASE_URL: Support Anthropic-compatible APIs (MiniMax)
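For illustration, a minimal sketch of how such a base-URL override could be read in a provider config. The env-var name OPENAI_BASE_URL comes from this PR; the default endpoint and the helper name are assumptions, not celeste's actual internals:

```python
import os

# Sketch only: a config helper that honors an optional base-URL override.
# The default endpoint and function name are illustrative assumptions.
DEFAULT_OPENAI_BASE_URL = "https://api.openai.com/v1"

def openai_base_url() -> str:
    # Fall back to the official endpoint when no override is set.
    return os.getenv("OPENAI_BASE_URL", DEFAULT_OPENAI_BASE_URL)
```

Pointing the same client code at a local vLLM or Ollama server is then just a matter of exporting the variable before the process starts.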

Modified provider configs:

  • src/celeste/providers/openai/audio/config.py
  • src/celeste/providers/openai/images/config.py
  • src/celeste/providers/openai/responses/config.py
  • src/celeste/providers/openai/videos/config.py
  • src/celeste/providers/anthropic/messages/config.py

Documentation updates:

  • Add "Using Google Vertex AI" section with setup and usage examples
  • Add "Using OpenAI/Anthropic-Compatible APIs" section with MiniMax example
  • Include optional model registration guide to eliminate warnings
  • Add links to example notebooks for both sections

Example notebooks:

  • notebooks/vertexai-example.ipynb: Vertex AI with Google ADC authentication
  • notebooks/anthropic_compat.ipynb: MiniMax via Anthropic-compatible API

Configuration:

  • Update .env.example with BASE_URL options and Vertex AI settings
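As a hedged sketch, the .env.example additions described above might look like this. Only the OPENAI_BASE_URL and ANTHROPIC_BASE_URL names are taken from this PR; the endpoint URLs and the Vertex AI variable names are illustrative guesses:

```shell
# OpenAI-compatible APIs (vLLM, LocalAI, Ollama); URL shown is illustrative
OPENAI_BASE_URL=http://localhost:11434/v1

# Anthropic-compatible APIs (MiniMax); URL shown is illustrative
ANTHROPIC_BASE_URL=https://api.minimax.io/anthropic

# Vertex AI settings (variable names assumed; check the actual .env.example)
# GOOGLE_CLOUD_PROJECT=your-project
# GOOGLE_CLOUD_LOCATION=us-central1
```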

XinyueZ and others added 3 commits February 26, 2026 21:39
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
@Kamilbenkirane (Member) commented:

Thanks for this PR @XinyueZ — you identified a real gap. Users should be able to point celeste at compatible APIs like MiniMax, vLLM, etc.

We're going to solve this with protocol= + base_url= parameters instead of env vars:

# Anthropic-compatible API (e.g., MiniMax) — uses chatcompletions protocol
await celeste.text.generate(
    "Explain quantum computing",
    model="MiniMax-M2.5",
    protocol="chatcompletions",
    base_url="https://api.minimax.io/anthropic",
    api_key="your-key",
)

# OpenAI-compatible API (e.g., vLLM, Ollama, LocalAI) — uses openresponses protocol
await celeste.text.generate(
    "Hello",
    model="lfm2:latest",
    protocol="openresponses",
    base_url="http://localhost:11434",
)

Why this approach over env vars:

  • Per-client, not process-global — you can talk to multiple compatible APIs in the same process
  • Explicit — no import-time side effects
  • Semantically correct — protocol= describes the wire format, not the provider identity
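The per-client point can be sketched as a small resolver that maps protocol= and base_url= into a client config. Every name below (PROTOCOL_DEFAULTS, ClientConfig, resolve) is a hypothetical illustration of the idea, not celeste's real internals:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical defaults per wire protocol; the protocol names come from the
# proposal above, the default URLs are illustrative assumptions.
PROTOCOL_DEFAULTS = {
    "chatcompletions": "https://api.openai.com/v1",
    "openresponses": "https://api.openai.com/v1",
}

@dataclass
class ClientConfig:
    protocol: str               # wire format to speak
    base_url: str               # endpoint to speak it to
    api_key: Optional[str] = None

def resolve(protocol: str, base_url: Optional[str] = None,
            api_key: Optional[str] = None) -> ClientConfig:
    """Build a per-call client config; no process-global state involved."""
    if protocol not in PROTOCOL_DEFAULTS:
        raise ValueError(f"unknown protocol: {protocol!r}")
    return ClientConfig(protocol, base_url or PROTOCOL_DEFAULTS[protocol], api_key)

# Two different compatible APIs can coexist in the same process:
minimax = resolve("chatcompletions", base_url="https://api.minimax.io/anthropic")
ollama = resolve("openresponses", base_url="http://localhost:11434")
```

Because each call builds its own config, nothing leaks between clients, which is exactly what a process-wide env var cannot guarantee.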

The base_url plumbing partially exists in the codebase already but isn't wired through to the protocol layer yet. We'll track the fix in a separate issue and link it here.

Regarding the Vertex AI docs — that was already shipped in #135 so we won't need those sections.

Does protocol= + base_url= cover everything you were trying to solve?

@Kamilbenkirane (Member) commented:

Tracking the implementation in #230.
