
Add Azure OpenAI support #77

Open

fazedordecodigo opened this issue May 21, 2023 · 4 comments

Comments

@fazedordecodigo

Support for Azure OpenAI could be added. With Azure OpenAI, the data is not used for AI training, which makes it ideal for us to use at work and avoids information leakage.
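For context, a minimal sketch of what a direct Azure OpenAI call looks like with the pre-1.0 openai Python SDK; the resource name, deployment name, and API version below are placeholders, not values from this project:

import os
import openai

## Azure OpenAI is selected via api_type/api_base/api_version instead of the default OpenAI endpoint
openai.api_type = "azure"
openai.api_base = "https://your-resource-name.openai.azure.com/"  # placeholder Azure resource
openai.api_version = "2023-05-15"                                  # placeholder API version
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]

messages = [{"content": "Hello, how are you?", "role": "user"}]

# Azure routes the request by deployment name ("engine"), not by model name
response = openai.ChatCompletion.create(engine="your-deployment-name", messages=messages)
print(response["choices"][0]["message"]["content"])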

@shikelong

+1.

@wwydmanski

OpenAI doesn't use API requests for AI training, but Azure OpenAI could nevertheless help bring down latency.

@akira-cn

I submitted a pull request to add Azure OpenAI support: #96

@ishaan-jaff

Hi @Delatorrea, @shikelong, @wwydmanski, I believe we can help with this issue. I'm the maintainer of LiteLLM: https://github.com/BerriAI/litellm

TL;DR:
We allow you to use any LLM as a drop-in replacement for gpt-3.5-turbo.
If you don't have access to the LLM, you can use the LiteLLM proxy to make requests to it.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider API directly.

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key"
os.environ["COHERE_API_KEY"] = "your-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
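Since this issue is specifically about Azure OpenAI: based on the LiteLLM docs, an Azure deployment can be called the same way by prefixing the model name with azure/; the key, endpoint, API version, and deployment name below are placeholders:

from litellm import completion
import os

## set ENV variables for Azure OpenAI (placeholder values)
os.environ["AZURE_API_KEY"] = "your-azure-key"
os.environ["AZURE_API_BASE"] = "https://your-resource-name.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2023-05-15"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# azure openai call (the model string is "azure/<your-deployment-name>")
response = completion(model="azure/your-deployment-name", messages=messages)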

Using the LiteLLM Proxy with a LiteLLM Key

This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access Claude.

from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2" # [OPTIONAL] replace with your cohere key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
