feat: add Azure OpenAI provider support #3675
Conversation
orzelig left a comment:
Good Azure OpenAI integration! The provider auto-discovery works well and the documentation is comprehensive. Minor note: the global fetch wrapper approach works but could potentially have side effects - might be worth considering a per-provider fetch implementation in the future. Overall LGTM!
LGTM
I found a bug: if you leave AZURE_OPENAI_RESOURCE_NAME at its default and set AZURE_OPENAI_ENDPOINT, it won't set baseUrl.
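For illustration, a resolution order along these lines (the helper name and the `"your-resource-name"` placeholder are hypothetical, not the PR's actual code) would let an explicit `AZURE_OPENAI_ENDPOINT` win over the resource-name default:

```typescript
// Hypothetical sketch of baseUrl resolution: an explicit endpoint should
// take precedence over the URL derived from the resource name.
function resolveBaseUrl(env: Record<string, string | undefined>): string | undefined {
  const endpoint = env.AZURE_OPENAI_ENDPOINT;
  if (endpoint) {
    // Explicit endpoint wins, even if the resource name is still the default.
    return endpoint.replace(/\/+$/, "");
  }
  const resource = env.AZURE_OPENAI_RESOURCE_NAME;
  // Guard against an assumed placeholder default leaking into a real URL.
  if (resource && resource !== "your-resource-name") {
    return `https://${resource}.openai.azure.com`;
  }
  return undefined;
}
```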
The onboarding flow didn't pick up the Azure OpenAI config, and it didn't configure auth-profile.json.
Thanks for the review. Both adjustments and enhancements have been completed.
This PR is awesome! 😍
CLAWDINATOR FIELD REPORT // PR Closure

I am CLAWDINATOR — cybernetic crustacean, maintainer triage bot for OpenClaw. I was sent from the future to keep this repo shipping clean code. OPENCLAW IS IN FEATURE FREEZE. The maintainers are drowning in new gadgets.

Need it reconsidered after the freeze? Come with me if you want to ship. Drop into #pr-thunderdome-dangerzone on Discord — READ THE TOPIC or risk immediate termination. Lay out the mission: problem, impact, test coverage. See you at the party, Richter. Stay br00tal. 🤖

This is an automated message from CLAWDINATOR, the OpenClaw maintainer bot.
To be honest, "vibe coding" can significantly boost development efficiency. However, due to the nature of TypeScript's type system and the inconsistencies in third-party library types, merging contributor code makes it nearly impossible to guarantee code consistency. Essentially, this turns something simple into something complex. While a human reviewer can naturally maintain consistency, an AI lacks that level of control. As project complexity grows, it eventually reaches a point where no human can manage it, and the project slowly spirals out of control.

This pull request adds first-class support for Azure OpenAI as a model provider, enabling the system to connect to Azure-hosted OpenAI models using environment variables or Docker Compose configuration. The main changes include a new provider implementation, updates to environment variable handling, and a dedicated Docker Compose file for Azure deployments.
**Azure OpenAI Provider Integration:**

- Added a new module `azure-openai-provider.ts` that implements the Azure OpenAI provider logic, including endpoint formatting, authentication, environment variable resolution, and support for multiple deployments. This module also provides a global fetch wrapper to ensure the required `api-version` parameter is always included in requests.
- Updated the provider discovery logic in `models-config.providers.ts` to automatically detect Azure OpenAI configuration from environment variables or authentication profiles, and to instantiate the Azure provider if detected. [1] [2]

**Environment & Configuration:**

- Extended the `.env.example` file to include Azure OpenAI-specific variables such as `AZURE_OPENAI_API_KEY`, `AZURE_OPENAI_RESOURCE_NAME`, `AZURE_OPENAI_DEPLOYMENT_NAME`, and related settings.
- Updated the `ModelProviderConfig` type to support Azure-specific metadata, including the resource name and API version.

**Docker & Deployment:**

- Added a `docker-compose.azure.yml` file that defines services for running the gateway, CLI, and a test utility with Azure OpenAI credentials, simplifying deployment in Azure environments.

**Authentication:**

- Added `azure-openai` as a recognized provider when looking up environment variables.
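As a rough illustration of what `docker-compose.azure.yml` might contain (service names and the build context are assumptions, not the PR's actual file):

```yaml
# Hypothetical sketch of docker-compose.azure.yml; service and variable
# wiring are illustrative, matching the env vars listed above.
services:
  gateway:
    build: .
    environment:
      AZURE_OPENAI_API_KEY: ${AZURE_OPENAI_API_KEY}
      AZURE_OPENAI_RESOURCE_NAME: ${AZURE_OPENAI_RESOURCE_NAME}
      AZURE_OPENAI_DEPLOYMENT_NAME: ${AZURE_OPENAI_DEPLOYMENT_NAME}
```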
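The global fetch wrapper described above can be sketched roughly as follows. All names here (`wrapFetch`, `FetchLike`, the default `AZURE_API_VERSION` value) are illustrative assumptions, not the PR's actual implementation:

```typescript
// Sketch of a global fetch wrapper that guarantees the `api-version`
// query parameter on Azure OpenAI requests. Names and the default
// version value are assumptions for illustration.
const AZURE_API_VERSION = "2024-02-01"; // assumed default; the real value comes from config

type FetchLike = (input: string | URL, init?: object) => Promise<unknown>;

function wrapFetch(baseFetch: FetchLike): FetchLike {
  return (input, init) => {
    const url = new URL(String(input));
    // Only rewrite Azure OpenAI hosts; leave all other traffic untouched.
    if (url.hostname.endsWith(".openai.azure.com") && !url.searchParams.has("api-version")) {
      url.searchParams.set("api-version", AZURE_API_VERSION);
    }
    return baseFetch(url.toString(), init);
  };
}
```

Wrapping at this level means every caller gets the parameter for free, which is why the reviewer's note about global side effects (versus a per-provider fetch) is worth keeping in mind.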
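One plausible shape for the Azure-specific metadata on `ModelProviderConfig` is sketched below; the field names are assumptions based on the description above, not the PR's actual type definitions:

```typescript
// Hypothetical shape of the Azure-specific metadata added to
// ModelProviderConfig; field names are illustrative assumptions.
interface AzureProviderMetadata {
  resourceName: string;    // e.g. "myres" maps to https://myres.openai.azure.com
  apiVersion: string;      // e.g. "2024-02-01"
  deploymentName?: string; // optional default deployment
}

interface ModelProviderConfig {
  id: string;
  baseUrl?: string;
  apiKey?: string;
  azure?: AzureProviderMetadata; // present only when the provider is azure-openai
}

// Example: a config as provider discovery might assemble it from env vars.
const example: ModelProviderConfig = {
  id: "azure-openai",
  baseUrl: "https://myres.openai.azure.com",
  azure: { resourceName: "myres", apiVersion: "2024-02-01" },
};
```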