providers: add OpenAI compatible provider #185
Conversation
## Summary

Created OpenAI compatible model and embedding providers.

## Details

The new provider is compatible with any OpenAI API endpoint. This includes OpenRouter, which has been subsumed as part of this change.

Detection has been added for:
- OpenAI
- LM Studio
- Ollama
- vLLM

This should allow for easy local model detection and usage.
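As a rough illustration of how such detection might work (a sketch only — the port numbers below are each runtime's common defaults, and the function name is hypothetical; none of this is taken from the PR's actual code):

```typescript
// Hypothetical sketch: map well-known default local ports to runtimes.
// Port defaults are assumptions (Ollama 11434, LM Studio 1234, vLLM 8000),
// not necessarily what this PR implements.
const KNOWN_RUNTIMES: Record<string, string> = {
  "11434": "ollama",
  "1234": "lm-studio",
  "8000": "vllm",
};

function detectRuntime(baseUrl: string): string {
  // URLs without an explicit port (e.g. api.openai.com) have port === ""
  const port = new URL(baseUrl).port;
  return KNOWN_RUNTIMES[port] ?? "openai-compatible";
}
```

A follow-up probe against the endpoint's model-listing route could then confirm the server actually speaks the OpenAI API before registering the provider.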
@saem is attempting to deploy a commit to rohitg00's projects team on Vercel. A member of the team first needs to authorize it.
FYI, I'm:
I would like to know if the general idea of the PR (an OpenAI router plus detection for various local runtimes) is something the project would like to see developed. I haven't spent much time on it, so it's fine if you're not interested.
Also consolidated the embedding provider tests.
This should either be restored, or the baseUrl + API prefix needs to be handled properly.
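For context, the kind of handling being asked for might look like this (a minimal sketch; `joinBaseUrl` and the `/v1` prefix convention are illustrative assumptions, not code from this PR):

```typescript
// Hypothetical helper: join a user-supplied base URL with an OpenAI-style
// API prefix without duplicating trailing slashes or the prefix itself.
// Handles both bare hosts ("http://localhost:11434") and URLs that already
// include the prefix ("https://openrouter.ai/api/v1").
function joinBaseUrl(baseUrl: string, prefix = "/v1"): string {
  const trimmed = baseUrl.replace(/\/+$/, "");
  return trimmed.endsWith(prefix) ? trimmed : trimmed + prefix;
}
```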
@rohitg00 apologies for the ping; just wondering if you're OK with this PR in spirit? If so, I'll keep going; otherwise I'll close it down.
I think this is already addressed in today's release by someone.