Description
The OpenRouter API is mostly, but not completely, OpenAI Chat Completions-compatible, so using `OpenAIChatModel` with `OpenRouterProvider` mostly works, but not entirely:
- OpenRouter uses non-compatible finish reason #2844
- Handle error response from OpenRouter as exception instead of validation failure #2323
Additionally, some OpenRouter-only features cannot currently be implemented easily:
- Store OpenRouter provider metadata in ModelResponse vendor details #1849
- Support prefill by ending history with ModelResponse #2778
- Can I get thinking part from openrouter provider using google/gemini-2.5-pro? #2999
pydantic-ai/pydantic_ai_slim/pydantic_ai/models/openai.py, lines 521 to 522 in aca70eb:

```python
# NOTE: We don't currently handle OpenRouter `reasoning_details`:
# - https://openrouter.ai/docs/use-cases/reasoning-tokens#preserving-reasoning-blocks
```
- Add `OpenRouterModel` #1870 (comment)
Two PRs to implement an `OpenRouterModel` were recently submitted, but they involved too much duplication with `OpenAIChatModel` for my taste, instead of subclassing and only changing what's necessary: #1870, #2409. However, the Groq and Hugging Face models basically copy-paste `OpenAIChatModel` as well, so I think the benefit outweighs the cost. I've reopened the more recent of those two PRs.
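To illustrate the subclass-and-override shape being argued for, here is a minimal self-contained sketch; the base class is a stand-in, not pydantic-ai's real `OpenAIChatModel` API, and every method and field name below is an assumption:

```python
# Illustrative only: a stand-in base class, NOT pydantic-ai's real API.
# It shows the subclass-and-override shape: the subclass changes only the
# provider-specific hooks and inherits everything else unchanged.
class OpenAIChatModelSketch:
    """Stand-in for OpenAIChatModel (hypothetical hook names)."""

    def _map_finish_reason(self, raw: str) -> str:
        return raw  # the OpenAI API already uses the expected values

    def _vendor_details(self, response: dict) -> dict:
        return {}  # the base model stores no provider-specific metadata

    def process(self, response: dict) -> dict:
        return {
            'finish_reason': self._map_finish_reason(response['finish_reason']),
            'vendor_details': self._vendor_details(response),
        }


class OpenRouterModelSketch(OpenAIChatModelSketch):
    """Override only what differs from the OpenAI behaviour."""

    def _map_finish_reason(self, raw: str) -> str:
        # OpenRouter may emit reasons outside the OpenAI set (#2844).
        return raw if raw in {'stop', 'length', 'tool_calls'} else 'stop'

    def _vendor_details(self, response: dict) -> dict:
        # Preserve OpenRouter-only metadata such as reasoning_details (#1849, #2999).
        return {k: response[k] for k in ('provider', 'reasoning_details') if k in response}
```

Compared to a copy-paste implementation, a bug fix in the shared request/response plumbing then lands in every subclass for free, at the cost of the base class needing stable override points.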
References
No response