
Azure OpenAI #1042

Closed
marcofiocco opened this issue Apr 12, 2024 · 7 comments
Labels
enhancement (New feature or request), severity:low (Minor issues, affecting single user)

Comments

@marcofiocco

marcofiocco commented Apr 12, 2024

What problem or use case are you trying to solve?
In my company we have an Azure App sandbox to use OpenAI.
The code we use is:

from azure.identity import EnvironmentCredential, get_bearer_token_provider
from langchain_openai import AzureChatOpenAI
from langchain.schema import HumanMessage, SystemMessage
from dotenv import load_dotenv

# Provide a .env file in your working directory with the following contents.
# AZURE_TENANT_ID='<TENANT_ID>'
# AZURE_CLIENT_ID='<CLIENT_ID>'
# AZURE_CLIENT_SECRET='<CLIENT_SECRET>'
load_dotenv()


token_provider = get_bearer_token_provider(EnvironmentCredential(), "https://cognitiveservices.azure.com/.default")

llm = AzureChatOpenAI(
    azure_endpoint="https://cog-sandbox-dev-eastus2-001.openai.azure.com/",
    api_version="2024-02-01",
    azure_ad_token_provider=token_provider,
    deployment_name="gpt-35-turbo-blue",
)


# Execute
def get_completion(prompt):
    sys_msg = SystemMessage(content="You are a friendly assistant.")
    msg = HumanMessage(content=prompt)
    response = llm.invoke(input=[sys_msg, msg])
    return response.content

I have the TENANT_ID, CLIENT_ID, and CLIENT_SECRET, but I cannot find any way to use them in OpenDevin.

Describe the UX of the solution you'd like
Allow me to specify them in a .env and somehow let me select that backend with a script?

Do you have thoughts on the technical implementation?
I've tried to integrate this code https://docs.litellm.ai/docs/secret#azure-key-vault into config.py without success: litellm.get_secret("my-secret-id") returns an empty string. Also, shouldn't OpenDevin be able to call get_secret regularly, since the secret might expire?
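On the expiry point, the token providers used with the Azure SDK are zero-argument callables invoked on every request, so a fresh (cached or renewed) token is fetched at call time rather than read once at startup. A minimal sketch of that shape, using a hypothetical FakeCredential stand-in rather than the real azure.identity types:

```python
# Minimal sketch of the token-provider pattern behind
# azure.identity.get_bearer_token_provider: the provider is a zero-argument
# callable, so the SDK invokes it per request and never holds a stale token.
def make_token_provider(credential, scope):
    def provider():
        # get_token runs each time the provider is called, so an expired
        # secret surfaces here rather than in long-lived state
        return credential.get_token(scope).token
    return provider


class FakeCredential:
    """Illustrative stand-in for an azure.identity credential."""
    def __init__(self):
        self.calls = 0

    def get_token(self, scope):
        self.calls += 1
        # return an object with a .token attribute, like AccessToken
        return type("AccessToken", (), {"token": f"token-{self.calls}"})()


cred = FakeCredential()
provider = make_token_provider(cred, "https://cognitiveservices.azure.com/.default")
```

With the real SDK, `get_bearer_token_provider(EnvironmentCredential(), scope)` plays the role of `make_token_provider` here, with caching and renewal handled internally.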

Describe alternatives you've considered

Additional context

@marcofiocco added the enhancement (New feature or request) label Apr 12, 2024
@rbren
Collaborator

rbren commented Apr 12, 2024

@rbren added the severity:low (Minor issues, affecting single user) label Apr 12, 2024
@marcofiocco
Author

@rbren yes, I cannot set LLM_API_KEY because we need to setup the token_provider as seen above which in turn requires TENANT_ID, CLIENT_ID, CLIENT_SECRET

@enyst
Collaborator

enyst commented Apr 13, 2024

@marcofiocco
I'm not sure how to solve this off-hand, but we can find a solution. I don't currently have Azure, though, so I may need your help here.

Allow me to specify them in a .env and somehow let me select that backend with a script?

OK, but the first part is already there: when litellm is run via opendevin, it already has access to the vars in .env. So if you just create the .env file in opendevin with your settings, litellm should have them.

OpenDevin should be able to call get_secret regularly because the secret might expire?

I don't think so, I think that's what litellm does...

You say you tried to integrate it in config.py, but please help me understand: can you call Azure successfully with this AzureChatOpenAI object (or anything else)

  1. in a standalone script, without litellm?
  2. with litellm, but not opendevin?

If we can make it work with litellm, then we'll know what we need to do to support this use case.
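For step 2, one possible shape (a sketch only, assuming litellm's documented `azure/` model prefix and its `azure_ad_token` completion parameter; the deployment name and endpoint are the ones from this thread):

```python
# Sketch of the kwargs one would pass to litellm.completion() to reach an
# Azure deployment with an AD bearer token instead of an API key. The dict is
# built separately so it can be inspected without a live Azure endpoint.
def azure_litellm_kwargs(deployment, endpoint, token, api_version="2024-02-01"):
    return {
        "model": f"azure/{deployment}",   # litellm's Azure routing prefix
        "api_base": endpoint,
        "api_version": api_version,
        "azure_ad_token": token,          # bearer token in place of api_key
        "messages": [{"role": "user", "content": "Hello"}],
    }


kwargs = azure_litellm_kwargs(
    "gpt-35-turbo-blue",
    "https://cog-sandbox-dev-eastus2-001.openai.azure.com/",
    "<bearer token from token_provider()>",
)
# With real credentials this would be: litellm.completion(**kwargs)
```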

@marcofiocco
Author

@enyst
Yes, I can make it work in a standalone script without litellm. The updated code is:

from openai import AzureOpenAI
from azure.identity import EnvironmentCredential, get_bearer_token_provider

# Provide a .env file in your working directory with the following contents.
# AZURE_TENANT_ID='<TENANT_ID>'
# AZURE_CLIENT_ID='<CLIENT_ID>'
# AZURE_CLIENT_SECRET='<CLIENT_SECRET>'
from dotenv import load_dotenv
load_dotenv()
token_provider = get_bearer_token_provider(EnvironmentCredential(), "https://cognitiveservices.azure.com/.default")

def new_message_object(role, content):
    return {"role": role, "content": content}


def read_txt_file(filename):
    # use a context manager so the file handle is closed
    with open(filename) as file:
        return file.read()


def run_completion(summary, messages, llm: AzureOpenAI):
    print(summary)
    response = llm.chat.completions.create(messages=messages, model="gpt-3.5-turbo")
    print(response.choices[0].message.content)
    input("\nPress Enter to continue...\n\n")


if __name__ == "__main__":
    textfile = read_txt_file("examples/text-example.txt")
    llm = AzureOpenAI(
        azure_endpoint="https://cog-sandbox-dev-eastus2-001.openai.azure.com/",
        api_version="2024-02-01",
        azure_ad_token_provider=token_provider,
        azure_deployment="gpt-35-turbo-blue"
    )

    prompt1 = [
        new_message_object(
            "system",
            """Your Task is to generate a short summary of an article. Summarize the article below,
     delimited by triple backticks, in at most 100 words.""",
        ),
        new_message_object("user", f"Article: ```{textfile}```"),
    ]
    run_completion(summary="Task: Summarize", messages=prompt1, llm=llm)

Separately, I've tried to set up a key vault with litellm:

import os

import litellm
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

### Instantiate Azure Key Vault Client ###

# Set your Azure Key Vault URI
KVUri = os.getenv('AZURE_KEY_VAULT_URI') # using https://cognitiveservices.azure.com/.default

# Set your Azure AD application/client ID, client secret, and tenant ID - create an application with permission to call your key vault
client_id = os.getenv('AZURE_CLIENT_ID')
client_secret = os.getenv('AZURE_CLIENT_SECRET')
tenant_id = os.getenv('AZURE_TENANT_ID')

# Initialize the ClientSecretCredential
credential = ClientSecretCredential(
    client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)

# Create the SecretClient using the credential
client = SecretClient(vault_url=KVUri, credential=credential)

### Connect to LiteLLM ###
litellm.secret_manager = client

test_key = litellm.get_secret('what to put here???')

but test_key comes back empty. I also have no idea what to pass as the argument to litellm.get_secret(); the earlier block of code does not need such an argument.
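For reference, SecretClient.get_secret takes the name of a secret as it was created in the vault, not a scope URL like the cognitiveservices one. A sketch with a hypothetical FakeSecretClient standing in for azure.keyvault.secrets.SecretClient:

```python
# Sketch: get_secret(name) looks up a secret by its vault name, and the value
# lives on the returned object's .value attribute. FakeSecretClient is an
# illustrative stand-in, not the real azure.keyvault.secrets.SecretClient.
def fetch_secret(client, name):
    return client.get_secret(name).value


class FakeSecretClient:
    def __init__(self, store):
        self._store = store

    def get_secret(self, name):
        # the real client raises if the name is absent; mirror that here
        value = self._store[name]
        return type("KeyVaultSecret", (), {"value": value})()


client = FakeSecretClient({"AZURE-OPENAI-KEY": "s3cret"})
secret = fetch_secret(client, "AZURE-OPENAI-KEY")
```

So the argument should be whatever name the secret was given in the Key Vault portal or CLI when it was created.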

@enyst
Collaborator

enyst commented Apr 14, 2024

This parameter, ideally:

test_token = litellm.get_secret("AZURE_AD_TOKEN")

@SmartManoj
Contributor

Directly set AZURE_AD_TOKEN in your env
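One way to mint such a token (a sketch; with azure.identity installed, the credential would be EnvironmentCredential(), driven by the AZURE_TENANT_ID/AZURE_CLIENT_ID/AZURE_CLIENT_SECRET variables, and the scope the cognitiveservices one from earlier in the thread):

```python
# Sketch: a bearer token for AZURE_AD_TOKEN comes from credential.get_token(scope).
# FakeCredential is illustrative; the real call would use
# azure.identity.EnvironmentCredential in its place.
import os


def export_azure_ad_token(credential,
                          scope="https://cognitiveservices.azure.com/.default"):
    token = credential.get_token(scope).token
    # this is what "set AZURE_AD_TOKEN in your env" amounts to
    os.environ["AZURE_AD_TOKEN"] = token
    return token


class FakeCredential:
    def get_token(self, scope):
        return type("AccessToken", (), {"token": "eyJ-fake"})()


token = export_azure_ad_token(FakeCredential())
```

One caveat grounded in the expiry concern raised above: AD bearer tokens are short-lived, so a statically exported AZURE_AD_TOKEN works for a short session but would need re-minting, whereas a token provider refreshes automatically.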

@neubig
Contributor

neubig commented May 10, 2024

Should be fixed by the comment above!
