
providers: add OpenAI compatible provider#185

Draft
saem wants to merge 2 commits into rohitg00:main from saem:openai-compatible-providers

Conversation


@saem saem commented Apr 22, 2026

Summary

Created OpenAI compatible model and embedding providers.

Details

The new provider is compatible with any OpenAI API endpoint. This includes OpenRouter, which has been subsumed as part of this change.

Detection has been added for:

  • OpenAI
  • LM Studio
  • ollama
  • vllm

This should allow for easy local model detection and usage.
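Detection along these lines could be sketched by probing each runtime's conventional default endpoint. This is a minimal illustration, not the PR's actual code: the port numbers are the runtimes' documented defaults, and the `DEFAULT_ENDPOINTS` map and `detectRuntime` helper are hypothetical names.

```typescript
// Hypothetical sketch: detect local OpenAI-compatible runtimes by probing
// their conventional default base URLs. Not the PR's implementation.
const DEFAULT_ENDPOINTS: Record<string, string> = {
  openai: "https://api.openai.com/v1",
  "lm-studio": "http://localhost:1234/v1", // LM Studio's default server port
  ollama: "http://localhost:11434/v1",     // ollama's OpenAI-compatible route
  vllm: "http://localhost:8000/v1",        // vllm's default serve port
};

// Probe GET {base}/models; any 2xx response means the runtime is reachable.
async function detectRuntime(name: string): Promise<boolean> {
  const base = DEFAULT_ENDPOINTS[name];
  if (!base) return false;
  try {
    const res = await fetch(`${base}/models`, {
      signal: AbortSignal.timeout(1000), // fail fast if nothing is listening
    });
    return res.ok;
  } catch {
    return false;
  }
}
```

Probing `/models` works for all four because every OpenAI-compatible server exposes that listing endpoint, so one code path covers hosted and local runtimes alike.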


Remaining work:

  • undo the consolidation of OpenRouter

Notes for reviewers:

  • I only have LM Studio set up, so I'm testing with that locally
  • Full disclosure: this was developed with Gemini
  • Gemini made two style choices that I'm not sure about:
    1. using mocks rather than process env vars plus clean-up
    2. using constructor parameters for member declaration
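For reviewers unfamiliar with the two styles in question, here is a side-by-side sketch. The class and helper names are hypothetical, chosen only to illustrate the trade-offs; they are not taken from the PR's code.

```typescript
// (1) Constructor parameter properties: `private readonly` in the parameter
// list declares and assigns the member in one step.
class OpenAICompatibleProvider {
  constructor(
    private readonly baseUrl: string,
    private readonly apiKey?: string,
  ) {}

  endpoint(path: string): string {
    return `${this.baseUrl}${path}`;
  }

  headers(): Record<string, string> {
    return this.apiKey ? { Authorization: `Bearer ${this.apiKey}` } : {};
  }
}

// The equivalent explicit-member style declares the field separately:
class ExplicitProvider {
  private readonly baseUrl: string;
  constructor(baseUrl: string) {
    this.baseUrl = baseUrl;
  }
  get url(): string {
    return this.baseUrl;
  }
}

// (2) The env-var alternative to mocks: set process.env around each test and
// restore the previous value afterwards so state can't leak between tests.
function withEnv<T>(key: string, value: string, fn: () => T): T {
  const prev = process.env[key];
  process.env[key] = value;
  try {
    return fn();
  } finally {
    if (prev === undefined) delete process.env[key];
    else process.env[key] = prev;
  }
}
```

The trade-off: parameter properties are terser but hide the field list inside the signature; env-var save/restore exercises the real config-reading path, while mocks avoid mutating global state at the cost of testing a stand-in.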


vercel Bot commented Apr 22, 2026

@saem is attempting to deploy a commit to the rohitg00's projects Team on Vercel.

A member of the Team first needs to authorize it.


coderabbitai Bot commented Apr 22, 2026

Review skipped: draft detected.


Author

saem commented Apr 22, 2026

FYI, I'm:

  • getting rid of the mocks and increasing test coverage
  • making sure I have LM Studio working end to end for me

I'd like to know whether the general idea of this PR (an OpenAI-compatible provider plus detection for various local runtimes) is something the project would like to see developed. I haven't spent much time on it, so it's fine if you're not interested.

also consolidated embedding provider tests
Author

This should either be restored, or the baseUrl + API prefix needs to be handled properly.
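One way to handle the baseUrl + API prefix concern is to normalize the base so the prefix appears exactly once, whether or not the user included it. This is a hedged sketch under that assumption; `normalizeBaseUrl` is a hypothetical name, not a function from the PR.

```typescript
// Hypothetical sketch: ensure the API prefix (e.g. "/v1") is present exactly
// once on the configured base URL, regardless of how the user wrote it.
function normalizeBaseUrl(baseUrl: string, prefix = "/v1"): string {
  const trimmed = baseUrl.replace(/\/+$/, ""); // drop trailing slashes
  return trimmed.endsWith(prefix) ? trimmed : trimmed + prefix;
}
```

With this, `http://localhost:11434`, `http://localhost:11434/`, and `http://localhost:11434/v1/` all normalize to the same endpoint base, so later path joins can't double the prefix.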

Author

saem commented Apr 22, 2026

@rohitg00 apologies for the ping; just wondering whether you're OK with this PR in spirit? If so, I'll keep going; otherwise I'll close it down.

@rohitg00
Owner

I think this is already addressed in today's release by someone.
