Azure OpenAI Chat Provider plugin for LLM Workflow Engine
Access to Azure OpenAI Chat models.
You'll need to create an Azure OpenAI service resource and deploy models; see here.
You'll also need a key and endpoint for the resource; see here.
Install the latest version of this software directly from GitHub with pip:
pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat
Or install from a local clone of the repository:
git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat.git
Then install the package in editable (development) mode:
cd lwe-plugin-provider-azure-openai-chat
pip install -e .
Set the following environment variables (or the corresponding provider variables):
export AZURE_OPENAI_API_KEY=[key]
export AZURE_ENDPOINT=[endpoint]
export AZURE_OPENAI_API_VERSION=2024-02-01
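A missing variable typically only surfaces as an authentication error at request time, so a quick pre-flight check can save some debugging. This is an illustrative stdlib-only sketch (the `missing_azure_vars` helper is not part of the plugin), assuming the variable names above:

```python
import os

# Variable names required by this plugin, per the exports above.
REQUIRED_VARS = ("AZURE_OPENAI_API_KEY", "AZURE_ENDPOINT", "AZURE_OPENAI_API_VERSION")

def missing_azure_vars(env=os.environ):
    """Return the names of any required Azure OpenAI variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_azure_vars()
    if missing:
        print(f"Missing environment variables: {', '.join(missing)}")
```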
Enable the plugin in the config.yaml for your profile:
plugins:
  enabled:
    - provider_azure_openai_chat
    # Any other plugins you want enabled...
From a running LWE shell:
/provider azure_openai_chat
/model deployment_name gpt-35-turbo
# Instead of environment variables, these values can also be set directly on the model:
/model openai_api_key [key]
/model openai_endpoint [endpoint]
/model openai_api_version 2023-05-15
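If you switch to this provider often, the same settings can be captured in an LWE preset so they persist across sessions. A sketch, assuming LWE's usual preset layout (a metadata section naming the provider, plus model customizations); the preset name and deployment name here are placeholders, so adjust them to your resource:

```yaml
metadata:
  name: azure-gpt-35-turbo
  provider: azure_openai_chat
model_customizations:
  deployment_name: gpt-35-turbo
```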