
Custom Endpoint (not OpenAI) #43

Open
lucouto opened this issue Sep 10, 2023 · 2 comments
Labels
enhancement New feature or request

Comments

lucouto commented Sep 10, 2023

Describe the feature you'd like to request

I have an Azure OpenAI service on Microsoft Azure and I'd like to use it as the AI provider instead of OpenAI directly.

Describe the solution you'd like

In the Connected accounts settings (OpenAI integration) we could have these options:

Endpoint url: https://testeai.openai.azure.com/ (for example)
Key: ******
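For context, an Azure OpenAI endpoint is shaped differently from api.openai.com: requests are routed by deployment name and need an api-version query parameter. A minimal stdlib sketch of the URL that such a setting would need to target (the endpoint and deployment name here are hypothetical examples, not part of any existing integration):

```python
from urllib.parse import urljoin

# Hypothetical values matching the suggestion above -- replace with your own.
AZURE_ENDPOINT = "https://testeai.openai.azure.com/"
API_VERSION = "2023-05-15"

def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build an Azure OpenAI chat-completions URL. Unlike api.openai.com,
    Azure routes by deployment name and requires an api-version query arg."""
    path = f"openai/deployments/{deployment}/chat/completions"
    return urljoin(endpoint, path) + f"?api-version={api_version}"

url = azure_chat_url(AZURE_ENDPOINT, "my-gpt35-deployment", API_VERSION)
# -> "https://testeai.openai.azure.com/openai/deployments/my-gpt35-deployment/chat/completions?api-version=2023-05-15"
```

The key would then go in an `api-key` header rather than a `Bearer` token, which is why a plain OpenAI integration can't be pointed at Azure by swapping the base URL alone.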

Describe alternatives you've considered

None.

@lucouto lucouto added the enhancement New feature or request label Sep 10, 2023
@tcitworld
Member

Isn't the LocalAI URL endpoint appropriate for that as well?

@ishaan-jaff

Hi @lucouto @tcitworld I believe we can help with this issue. I’m the maintainer of LiteLLM https://github.com/BerriAI/litellm

TL;DR:
We let you use any LLM as a drop-in replacement for gpt-3.5-turbo.
If you don't have direct access to the LLM, you can use the LiteLLM proxy to make requests to it.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider's API directly.

from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["COHERE_API_KEY"] = "your-cohere-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Using the LiteLLM Proxy with a LiteLLM key

This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it.

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your OpenAI key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your Cohere key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
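Since the original request was about Azure specifically: LiteLLM's Azure provider is configured through environment variables and an `azure/<deployment>` model name. A hedged sketch, with the actual `completion` call left commented out because it needs a real key and deployment (the endpoint and deployment name below are hypothetical):

```python
import os

# Point LiteLLM at the Azure resource via env vars (names per LiteLLM's
# Azure provider docs; the endpoint and deployment here are placeholders).
os.environ["AZURE_API_KEY"] = "your-azure-key"
os.environ["AZURE_API_BASE"] = "https://testeai.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2023-05-15"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# Requires `pip install litellm` and real credentials to run:
# from litellm import completion
# response = completion(model="azure/my-gpt35-deployment", messages=messages)
```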

3 participants