
Add support for third-party hosted APIs #4440

Open
19h opened this issue May 14, 2024 · 2 comments
Labels
feature request New feature or request

Comments


19h commented May 14, 2024

We've been coding against the Ollama API internally, and eventually it hit me: Ollama should be able to support third-party API providers, making it a de facto gateway to LLMs.

For example, it could blur the line between an OpenAI assistant/user conversation and a Gemini model/user conversation; it could expose a Cohere Command R+-style completion API while transparently talking to Claude under the hood.

Might sound utterly off-topic, but think about it.

I implemented a hard-coded model in Ollama for local use so that I can use unsupported hosted LLMs in Cody for coding. I feel this could very well be a Modelfile-level feature, with providers happily supplying their own integrations. That would put even more of a spotlight on Ollama while pushing LLM providers to be less fuzzy about their API integrations, provided the Modelfile spec is rigid enough. A rough sketch of what I mean is below.
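To make that concrete, here is a sketch of what a hosted-model Modelfile might look like. To be clear, the PROVIDER and ENDPOINT directives and the api_key_env parameter are hypothetical — nothing like them exists in the current Modelfile spec — and the model name is just an example:

    # Hypothetical Modelfile for a hosted model. PROVIDER, ENDPOINT, and
    # api_key_env are invented here to illustrate the proposal; only FROM
    # and PARAMETER are real directives today.
    FROM claude-3-opus                  # no local weights; resolved by the provider
    PROVIDER anthropic                  # which request/response translation to apply
    ENDPOINT https://api.anthropic.com/v1/messages
    PARAMETER api_key_env ANTHROPIC_API_KEY
    PARAMETER temperature 0.7

You would then ollama create claude-opus -f Modelfile and point Cody at the usual localhost:11434 API, unaware that the model is remote. ollama create and ollama run exist today; everything routing-related above is speculative.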

19h added the feature request label May 14, 2024
@ProjectMoon

Isn't this something that LiteLLM can do?

@oldmanjk

> Isn't this something that LiteLLM can do?

It is, but combining efforts could be a good thing. This space needs more standardization, IMO.
