
Using Azure and OpenAI at the same time #411

Closed
rmilkowski opened this issue Apr 23, 2023 · 5 comments
Labels
bug Something isn't working

Comments

@rmilkowski

Describe the bug

Currently there seems to be no straightforward way to send requests to both OpenAI and Azure from the same Python program. Ideally this should be possible:

openai = OpenAI(key=key)
azureai = OpenAI(key=key, api_base=xxx, ....)

Since key, api_base, etc. are currently global variables of the module, the above won't work.
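
For context, mixing providers today generally means mutating the module-level globals between calls, which is error-prone and not thread-safe. A rough sketch of that workaround (keys and endpoint below are placeholders):

import openai

# Point the module at OpenAI proper
openai.api_type = "open_ai"
openai.api_base = "https://api.openai.com/v1"
openai.api_key = OPENAI_KEY            # placeholder
# ... make OpenAI calls here ...

# Re-point the same globals at Azure before the next call
openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"   # placeholder endpoint
openai.api_version = "2023-05-15"
openai.api_key = AZURE_KEY             # placeholder
# ... make Azure calls here ...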

To Reproduce

Code snippets

No response

OS

Linux

Python version

Python 3

Library version

latest

@rmilkowski rmilkowski added the bug Something isn't working label Apr 23, 2023
artdent added a commit to artdent/langflow that referenced this issue Jun 5, 2023
There are still some rough edges due to underlying langchain and
openai API limitations, e.g. langchain-ai/langchain#3769 and
openai/openai-python#411. Notably, you can't use the Azure and
non-Azure node types in the same server, since there's global openai
configuration needed to choose between the two. So it's probably best
to still leave the Azure node types commented out in the default
config. But with this PR, if you uncomment those nodes and start the
server with OPENAI_API_TYPE=azure, you will have working Azure nodes.
@ellis-jc

ellis-jc commented Jul 7, 2023

+1 on this.

Right now we have different models available on each side, and using the two providers in tandem is a huge pain.

@fabswt

fabswt commented Jul 18, 2023

This is indeed annoying… I need Whisper from OpenAI but ChatCompletion from Azure.

@ellis-jc How do you guys handle it?

FYI, I posted this on StackOverflow: How to use the Python openai client with both Azure and OpenAI at the same time?

@sevenjay

sevenjay commented Aug 4, 2023

+1 on this.
We run both with a load balancer in front of them and need this feature, too.

@Yaakov-Belch

According to this response, there is an undocumented feature: many API methods accept named arguments such as:

import openai

openai.ChatCompletion.create(model=gpt_model_config.model_engine,
                             api_key=SECRET,
                             api_base=BASE,
                             api_type=TYPE,
                             api_version=VERSION,
                             ...)

Setting the api_key and api_base per request may address your issue.
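
For illustration, a rough sketch of talking to both providers from one process via these per-call overrides (keys, endpoint, and deployment name are placeholders):

import openai

openai_resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    api_key=OPENAI_KEY,
    api_type="open_ai",
    api_base="https://api.openai.com/v1",
)

azure_resp = openai.ChatCompletion.create(
    engine="my-gpt35-deployment",      # Azure uses the deployment name
    messages=[{"role": "user", "content": "Hello"}],
    api_key=AZURE_KEY,
    api_type="azure",
    api_base="https://my-resource.openai.azure.com/",
    api_version="2023-05-15",
)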

@rmilkowski
Author

According to this response, there is an undocumented feature: many API methods accept named arguments such as:

import openai

openai.ChatCompletion.create(model=gpt_model_config.model_engine,
                             api_key=SECRET,
                             api_base=BASE,
                             api_type=TYPE,
                             api_version=VERSION,
                             ...)

Setting the api_key and api_base per request may address your issue.

I just tried it with openai.Image.create() and it works.
Thank you.

btw: looks like the upcoming 1.0.0 version of the module will finally support client instantiation. See #631
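
For reference, a sketch of what that is expected to look like with per-client configuration in 1.x, based on #631 (keys, endpoint, and deployment name are placeholders):

from openai import OpenAI, AzureOpenAI

openai_client = OpenAI(api_key=OPENAI_KEY)
azure_client = AzureOpenAI(
    api_key=AZURE_KEY,
    azure_endpoint="https://my-resource.openai.azure.com",
    api_version="2023-05-15",
)

# Each client carries its own configuration, so both can be used side by side,
# e.g. Whisper against OpenAI and chat completions against Azure.
transcript = openai_client.audio.transcriptions.create(
    model="whisper-1",
    file=open("audio.mp3", "rb"),
)
chat = azure_client.chat.completions.create(
    model="my-gpt35-deployment",       # Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)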
