
feat(dotAI): Dot AI LangChain4J - Azure OpenAI #35243

Draft

ihoffmann-dot wants to merge 12 commits into dot-ai-langchain-integration from dot-ai-langchain-azure-openai

Conversation

@ihoffmann-dot
Member

Summary

Adds Azure OpenAI as a supported provider in the LangChain4J integration layer.
Enterprise customers using Azure-hosted OpenAI deployments can now configure
dotAI without any code changes — only a providerConfig JSON update is required.

  • Add langchain4j-azure-open-ai dependency
  • Add azure_openai case to LangChain4jModelFactory switch
  • Implement buildAzureOpenAiChatModel, buildAzureOpenAiEmbeddingModel, buildAzureOpenAiImageModel
  • Pin Netty (4.1.118.Final) and Reactor (3.4.41) in BOM to resolve transitive version conflicts introduced by Azure SDK
  • Add 3 unit tests in LangChain4jModelFactoryTest
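The dispatch described in the bullets above can be sketched as follows. This is an illustrative stub, not the actual factory code: `ProviderConfig` is reduced to a local record, and the builders return strings instead of LangChain4j model objects.

```java
// Sketch of the provider dispatch added by this PR. All types here are
// local stubs; the real factory uses LangChain4j model builders.
public class ProviderDispatchSketch {
    // Stub standing in for the real ProviderConfig (fields are hypothetical).
    record ProviderConfig(String provider, String endpoint, String deploymentName) {}

    static String buildChatModel(ProviderConfig config) {
        return switch (config.provider()) {
            case "openai" -> "OpenAiChatModel@" + config.endpoint();
            // New case added by this PR: route azure_openai to its own builder.
            case "azure_openai" -> buildAzureOpenAiChatModel(config);
            default -> throw new IllegalArgumentException(
                "Unknown provider: " + config.provider());
        };
    }

    static String buildAzureOpenAiChatModel(ProviderConfig config) {
        // The real implementation would configure the Azure builder with
        // endpoint, apiKey, deploymentName, apiVersion, and so on.
        return "AzureOpenAiChatModel@" + config.endpoint() + config.deploymentName();
    }

    public static void main(String[] args) {
        ProviderConfig azure = new ProviderConfig("azure_openai",
                "https://my-company.openai.azure.com/", "gpt-4o");
        System.out.println(buildChatModel(azure));
    }
}
```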

Configuration

{
  "chat": {
    "provider": "azure_openai",
    "apiKey": "...",
    "endpoint": "https://my-company.openai.azure.com/",
    "deploymentName": "gpt-4o",
    "apiVersion": "2024-02-01",
    "maxTokens": 16384,
    "temperature": 1.0
  },
  "embeddings": {
    "provider": "azure_openai",
    "apiKey": "...",
    "endpoint": "https://my-company.openai.azure.com/",
    "deploymentName": "text-embedding-ada-002",
    "apiVersion": "2024-02-01"
  },
  "image": {
    "provider": "azure_openai",
    "apiKey": "...",
    "endpoint": "https://my-company.openai.azure.com/",
    "deploymentName": "dall-e-3",
    "apiVersion": "2024-02-01"
  }
}

Notes

  • deploymentName identifies the Azure deployment (distinct from the model name) and is required for Azure; if it is not set, the factory falls back to model.
  • apiVersion maps to Azure's serviceVersion. Recommended: 2024-02-01.
  • Netty/Reactor version pins are required for Azure SDK compatibility and do not affect existing behavior.
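The deploymentName/model fallback described in the first note can be sketched as below. The helper name and signature are hypothetical; the real factory wires this into the Azure builders.

```java
// Sketch of the fallback described in the notes: prefer the Azure
// deploymentName, and fall back to the generic model name when unset.
// Helper and parameter names are illustrative, not the actual factory code.
public class DeploymentNameFallback {
    static String resolveDeploymentName(String deploymentName, String model) {
        if (deploymentName != null && !deploymentName.isBlank()) {
            return deploymentName;
        }
        return model; // fallback when deploymentName is not configured
    }

    public static void main(String[] args) {
        System.out.println(resolveDeploymentName(null, "gpt-4o"));       // falls back to model
        System.out.println(resolveDeploymentName("my-gpt4o", "gpt-4o")); // uses deployment name
    }
}
```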

Related Issue

This PR fixes #35183
EPIC: dotAI Multi-Provider Support #33970

@github-actions
Contributor

github-actions bot commented Apr 7, 2026

❌ Issue Linking Required

This PR could not be linked to an issue. All PRs must be linked to an issue for tracking purposes.

How to fix this:

Option 1: Add keyword to PR body (Recommended - auto-removes this comment)
Edit this PR description and add one of these lines:

  • This PR fixes #123 or Fixes: #123

  • This PR closes #123 or Closes: #123

  • This PR resolves #123 or Resolves: #123

  • Other supported keywords: fix, fixed, close, closed, resolve, resolved

Option 2: Link via GitHub UI (Note: won't clear the failed check)

  1. Go to the PR → Development section (right sidebar)

  2. Click "Link issue" and select an existing issue

  3. Push a new commit or re-run the workflow to clear the failed check

Option 3: Use branch naming
Create a new branch with one of these patterns:

  • 123-feature-description (number at start)

  • issue-123-feature-description (issue-number at start)

  • feature-issue-123 (issue-number anywhere)

Why is this required?

Issue linking ensures proper tracking and documentation and helps maintain project history. It connects your code changes to the problem they solve.

This comment was automatically generated by the issue linking workflow

@claude
Contributor

claude bot commented Apr 7, 2026

Claude finished @ihoffmann-dot's task in 1m 37s


Code Review


Issues

1. model globally required for Azure, but semantics say deploymentName is primary

LangChain4jModelFactory.java:93 calls requireNonBlank(config.model(), "model", modelType) before the switch — for all providers. For Azure, a user who provides only deploymentName (no model) gets a misleading validation failure. The PR description says "deploymentName is required for Azure… falls back to model if not set", but the code enforces the opposite: model is always required, and deploymentName is the optional override. The validation in validateAzureOpenAi should require deploymentName OR drop the global model check for Azure, and the builder's fallback direction should match.
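One way to express the validation direction the review suggests (accept deploymentName alone for Azure, with model as the fallback) is sketched below. Method and message text are illustrative; the real method is validateAzureOpenAi in LangChain4jModelFactory.

```java
// Sketch of the suggested Azure validation: a config with only
// deploymentName should pass, instead of failing the global model check.
// Names and message text are illustrative, not the actual patch.
public class AzureValidationSketch {
    static void validateAzureOpenAi(String deploymentName, String model) {
        boolean hasDeployment = deploymentName != null && !deploymentName.isBlank();
        boolean hasModel = model != null && !model.isBlank();
        if (!hasDeployment && !hasModel) {
            throw new IllegalArgumentException(
                "azure_openai requires deploymentName (or model as a fallback)");
        }
    }

    public static void main(String[] args) {
        // Passes: deploymentName alone is sufficient for Azure.
        validateAzureOpenAi("my-gpt4o-deployment", null);
        try {
            validateAzureOpenAi(null, null); // neither set: rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```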


2. maxCompletionTokens silently ignored for Azure chat/streaming models

The OpenAI builders at lines 142-146 and 156-160 check maxCompletionTokens first, then fall back to maxTokens. The Azure builders at lines 197 and 210 only check maxTokens. A user migrating a config from openai to azure_openai who uses maxCompletionTokens will have the setting silently dropped. Check whether AzureOpenAiChatModel.Builder supports maxCompletionTokens; if it does, add the same fallback chain; if not, document the limitation.
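The fallback chain the OpenAI builders already use, and which the review asks to mirror for Azure, amounts to the following. A minimal sketch with boxed Integer fields as in the description; the helper name is hypothetical.

```java
// Sketch of the token-limit fallback: prefer maxCompletionTokens when
// set, otherwise fall back to maxTokens. Field names follow the PR
// description; the helper itself is illustrative.
public class TokenLimitFallback {
    static Integer resolveTokenLimit(Integer maxCompletionTokens, Integer maxTokens) {
        return maxCompletionTokens != null ? maxCompletionTokens : maxTokens;
    }

    public static void main(String[] args) {
        System.out.println(resolveTokenLimit(8192, 16384)); // 8192: completion cap wins
        System.out.println(resolveTokenLimit(null, 16384)); // 16384: falls back to maxTokens
    }
}
```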


3. No test for buildStreamingChatModel with Azure

buildChatModel, buildEmbeddingModel, and buildImageModel all have Azure happy-path tests. buildStreamingChatModel does not. Given streaming is where connection failures and partial-response issues surface, this gap is worth closing.

4. build() signature will bloat with Phase 2 providers

Each new provider (Bedrock, Vertex AI) requires adding another Function<ProviderConfig, T> parameter to build() and touching all 4 public factory methods. A Map<String, Function<ProviderConfig, T>> dispatched by provider name would avoid this. Worth addressing before Phase 2 merges.
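The registry shape suggested above could look like the sketch below: providers register a builder function once, and build() looks it up by name instead of growing a parameter per provider. ProviderConfig is again a stub and the builders return strings purely for illustration.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of the registry-based dispatch suggested for Phase 2. All types
// are local stubs; real builders would return LangChain4j model objects.
public class ProviderRegistrySketch {
    record ProviderConfig(String provider, String endpoint) {}

    private static final Map<String, Function<ProviderConfig, String>> CHAT_BUILDERS =
            new HashMap<>();
    static {
        CHAT_BUILDERS.put("openai", c -> "OpenAiChatModel@" + c.endpoint());
        CHAT_BUILDERS.put("azure_openai", c -> "AzureOpenAiChatModel@" + c.endpoint());
        // A Phase 2 provider becomes a one-line registration,
        // not a new Function<ProviderConfig, T> parameter on build():
        CHAT_BUILDERS.put("bedrock", c -> "BedrockChatModel@" + c.endpoint());
    }

    static String buildChatModel(ProviderConfig config) {
        Function<ProviderConfig, String> builder = CHAT_BUILDERS.get(config.provider());
        if (builder == null) {
            throw new IllegalArgumentException("Unknown provider: " + config.provider());
        }
        return builder.apply(config);
    }

    public static void main(String[] args) {
        System.out.println(buildChatModel(
                new ProviderConfig("azure_openai", "https://my-company.openai.azure.com/")));
    }
}
```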


Minor

  • dotCMS/pom.xml: the inline comment `` restates the artifact ID. Drop it.

The BOM Netty/Reactor pins and the overall structure are fine.
