
[Feature request] Azure API not well supported / documented #245

Closed
TomDarmon opened this issue Aug 26, 2023 · 10 comments

Comments

@TomDarmon

TomDarmon commented Aug 26, 2023

Hello!

I am currently learning how to use pr-agent. I would like to use the Azure OpenAI API instead of an OpenAI key with the GitHub Actions deployment.

There is no documentation for this, but it seems the code can be made to work locally with minor adjustments, and with minimal effort Azure support could also be added for the GitHub Actions deployment method.

I believe two changes are needed to make this possible:

  • The AiHandler needs to dynamically set the azure attribute based on a configured value to specify Azure deployment.
  • The appropriate API variables, such as open_ai.api_base and open_ai.api_version, should be passed as GitHub Actions secrets/configurations.
class AiHandler:
    """
    This class handles interactions with the OpenAI API for chat completions.
    It initializes the API key and other settings from a configuration file,
    and provides a method for performing chat completions using the OpenAI ChatCompletion API.
    """

    def __init__(self):
        """
        Initializes the OpenAI API key and other settings from a configuration file.
        Raises a ValueError if the OpenAI key is missing.
        """
        try:
            openai.api_key = get_settings().openai.key
            litellm.openai_key = get_settings().openai.key
            litellm.debugger = get_settings().config.litellm_debugger
            self.azure = False # <--- HARDCODED HERE
            ...

I'm new to the repository, so I may be missing something. If it's already possible to use the Azure API with GitHub Actions, I would be super grateful if you could document it 🙏

@mrT23 mrT23 added the help wanted Extra attention is needed label Aug 26, 2023
@okotek
Contributor

okotek commented Aug 27, 2023

Hi @TomDarmon
See the discussion here regarding Azure support.
self.azure is not hardcoded to False; it is set to True when the API_TYPE setting is "azure":

            if get_settings().get("OPENAI.API_TYPE", None):
                if get_settings().openai.api_type == "azure":
                    self.azure = True
                    litellm.azure_key = get_settings().openai.key

Fill in the commented fields in the configuration file (.secrets.toml) or through environment variables.

[openai]
key = ""  # Acquire through https://platform.openai.com
#org = "<ORGANIZATION>"  # Optional, may be commented out.
# Uncomment the following for Azure OpenAI
#api_type = "azure"
#api_version = '2023-05-15'  # Check Azure documentation for the current API version
#api_base = ""  # The base URL for your Azure OpenAI resource. e.g. "https://<your resource name>.openai.azure.com"
#deployment_id = ""  # The deployment name you chose when you deployed the engine
#fallback_deployments = []  # For each fallback model specified in configuration.toml in the [config] section, specify the appropriate deployment_id
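As a rough illustration of how the [openai] section above drives the Azure switch (a sketch only, not pr-agent's actual code; `build_openai_config` is a hypothetical helper):

```python
def build_openai_config(settings: dict) -> dict:
    """Hypothetical helper: map the [openai] settings above to client parameters.

    With api_type = "azure", the client additionally needs the resource
    endpoint (api_base), the API version, and the deployment name.
    """
    cfg = {"api_key": settings["key"]}
    if settings.get("api_type") == "azure":
        cfg.update(
            api_type="azure",
            api_base=settings["api_base"],
            api_version=settings["api_version"],
            deployment_id=settings.get("deployment_id"),
        )
    return cfg
```

With only a key set, the plain OpenAI path is used; setting api_type = "azure" pulls in the extra Azure fields.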

@okotek okotek removed the help wanted Extra attention is needed label Aug 27, 2023
@TomDarmon
Author

TomDarmon commented Aug 27, 2023

Thanks for the reply @okotek !

So if I understand correctly, just setting the environment variables like this should work?
It will work even without adding a configuration.toml to the repo?

on:
  pull_request:
jobs:
  pr_agent_job:
    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: write
      contents: write
    name: Run pr agent on every pull request, respond to user comments
    steps:
      - name: PR Agent action step
        id: pragent
        uses: Codium-ai/pr-agent@main
        env:
          API_TYPE: "azure"
          DEPLOYMENT_ID: "gpt-4" # custom name of the deployed model on Azure
          OPENAI_KEY: ${{ secrets.AZURE_API_KEY }}
          API_BASE: ${{ secrets.API_BASE }}
          API_VERSION: ${{ secrets.API_VERSION }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

EDIT:

It looks like this code doesn't work, below is the traceback. The exact same credentials work using litellm.

Traceback (most recent call last):
  File "/app/pr_agent/algo/ai_handler.py", line 92, in chat_completion
    response = await acompletion(
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 54, in acompletion
    return await loop.run_in_executor(None, func)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 449, in wrapper
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 419, in wrapper
    result = original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 42, in wrapper
    result = future.result(timeout=local_timeout_duration)
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 33, in async_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 754, in completion
    raise exception_type(
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 1319, in exception_type
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 1184, in exception_type
    raise original_exception
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 241, in completion
    response = openai.ChatCompletion.create(
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.AuthenticationError: Incorrect API key provided: ****************************. You can find your API key at https://platform.openai.com/account/api-keys.

An interesting observation: if I use model="gpt4" instead of model="azure/gpt4", I get the same error about an incorrect API key. Maybe the environment variable API_TYPE: "azure" is not the right way to select Azure.

Do you have any idea what I am doing wrong?

@okotek
Contributor

okotek commented Aug 28, 2023

We use Dynaconf for configuration; the environment-variable syntax is <SECTION>.<KEY>, so you should add the "OPENAI." prefix for the OpenAI variables, for example:
OPENAI.API_TYPE: "azure"
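A minimal sketch of that naming rule (illustrative only; `env_key` is a hypothetical helper, not a real pr-agent or Dynaconf function):

```python
def env_key(section: str, key: str) -> str:
    # Dynaconf-style name for a nested setting: <SECTION>.<KEY>, upper-cased,
    # so get_settings().openai.api_type is fed by the variable OPENAI.API_TYPE
    return f"{section.upper()}.{key.upper()}"

# In the GitHub Actions env block this would mean, e.g.:
#   OPENAI.API_TYPE: "azure"
#   OPENAI.API_BASE: ${{ secrets.API_BASE }}
#   OPENAI.API_VERSION: ${{ secrets.API_VERSION }}
```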

@ishaan-jaff

@TomDarmon I'm the maintainer of LiteLLM. Was this solved for you?

Here's how azure calls work with litellm

from litellm import completion

messages = [{"role": "user", "content": "Hello"}]
response = completion("your-azure-deployment-name", messages, custom_llm_provider="azure")

Make sure your .env contains the following variables:
os.environ['AZURE_API_KEY'], os.environ['AZURE_API_BASE'], os.environ['AZURE_API_VERSION']

Here's a link to our docs
https://docs.litellm.ai/docs/completion/supported#azure-openai-chat-completion-models

Happy to hop on a call and help out; here's my Calendly: https://calendly.com/ishaan-berri/30min

@mrT23
Collaborator

mrT23 commented Sep 5, 2023

@krrishdholakia
Contributor

@mrT23 @TomDarmon i'll make a PR to improve docs on this

@krrishdholakia
Contributor

Made a PR for this - #276

@krrishdholakia
Contributor

Can we close this now that the PR has been merged? @TomDarmon @mrT23

@mrT23
Collaborator

mrT23 commented Sep 12, 2023

/similar_issue

@github-actions
Contributor
