
feat(azure): forward credential_scopes to Azure AI Inference client#5661

Merged
mattatcha merged 2 commits into main from matcha/azure-credential-scopes
Apr 29, 2026

Conversation

@mattatcha (Collaborator) commented Apr 29, 2026

Summary

Adds a credential_scopes field to the native Azure AI Inference provider plus a matching AZURE_CREDENTIAL_SCOPES env var (comma-separated). The value is forwarded to ChatCompletionsClient / AsyncChatCompletionsClient when set, letting keyless / Entra-based callers target a specific Azure AD audience (e.g. https://cognitiveservices.azure.com/.default) without subclassing the provider. Matches the upstream azure.ai.inference SDK kwarg of the same name.

Key Changes

  • New credential_scopes: list[str] | None field on AzureCompletion
  • _normalize_azure_fields reads AZURE_CREDENTIAL_SCOPES (comma-separated) when not provided explicitly
  • _make_client_kwargs re-reads the env on lazy build (matching the existing AZURE_API_KEY / AZURE_ENDPOINT lazy pattern) and forwards credential_scopes to the SDK only when set
  • to_config_dict round-trips the field
  • 6 new unit tests cover explicit arg, default omission, env var, env-overridden-by-arg, lazy env read, and to_config_dict round-trip; full Azure module: 68 passed, 0 failed

Note

Medium Risk
Touches Azure authentication/client initialization by altering token audience configuration, which can affect connectivity for keyless Entra flows. Changes are gated (only applied when scopes are provided) and covered by targeted tests.

Overview
Adds a new optional credential_scopes field to AzureCompletion, populated from either a constructor arg or the comma-separated AZURE_CREDENTIAL_SCOPES env var.

Updates lazy client construction to re-read scopes at build time and only forward credential_scopes into the Azure AI Inference client kwargs when set; to_config_dict now round-trips the value. Includes new unit tests covering explicit vs default behavior, env parsing/precedence, lazy env reads, and config serialization.

Reviewed by Cursor Bugbot for commit 68a64e9.


Lazy build re-reads the env var so an LLM constructed at module
import (before deployment env vars are set) still picks up scopes —
same pattern as the existing AZURE_API_KEY / AZURE_ENDPOINT lazy
reads. to_config_dict round-trips the field.

Address review feedback:
- Move os.getenv into the helper so AZURE_CREDENTIAL_SCOPES appears once
- Match the surrounding api_key/endpoint `or` style in the validator
- Drop the list() defensive copy in to_config_dict — every other field
  in that method (and the base class's `stop`) is assigned by reference
@mattatcha mattatcha marked this pull request as ready for review April 29, 2026 21:11
@crewAIInc crewAIInc deleted a comment from linear Bot Apr 29, 2026
@mattatcha mattatcha merged commit c7f0104 into main Apr 29, 2026
55 checks passed
@mattatcha mattatcha deleted the matcha/azure-credential-scopes branch April 29, 2026 21:52
