@junjiem junjiem commented Nov 14, 2025

Summary by CodeRabbit

  • New Features
    • Added Azure OpenAI as a supported language model provider. Users can now configure and use Azure OpenAI for chat operations, with customizable parameters including endpoint, API key authentication, temperature, token limits, and response format. Both standard and streaming chat modes are supported.


coderabbitai bot commented Nov 14, 2025

Walkthrough

A new Azure OpenAI LLM provider module is being added to the dat-llms project. The changes include creating a new Maven module with a factory class that implements ChatModelFactory to handle Azure OpenAI configuration, model creation, and streaming variants. The factory is registered as a service provider and integrated into the CLI dependencies.

Changes

  • Module Setup — dat-llms/pom.xml, dat-llms/dat-llm-azure-openai/pom.xml, dat-cli/pom.xml
    Added the new dat-llm-azure-openai module entry to the parent POM, created the module descriptor with dependencies on dat-core and langchain4j-azure-open-ai, and added the module dependency to the CLI.
  • Factory Implementation — dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java
    New factory class implementing ChatModelFactory with 16 configuration options (endpoint, deployment-id, api-key, temperature, etc.) and creation logic for both synchronous and streaming chat models; includes validation for numeric parameters.
  • Service Registration — dat-llms/dat-llm-azure-openai/src/main/resources/META-INF/services/ai.dat.core.factories.ChatModelFactory
    Registered AzureOpenAiChatModelFactory as a service provider for chat model factory discovery.

Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant CLI
    participant Factory as AzureOpenAiChatModelFactory
    participant Config
    participant Model as ChatModel/StreamingChatModel

    User->>CLI: Request Azure OpenAI model
    CLI->>Factory: create() or createStream()
    Factory->>Config: Extract configuration
    Config-->>Factory: endpoint, api-key, deployment-id, etc.
    Factory->>Factory: validateConfigOptions()
    alt Validation Success
        Factory->>Model: Build AzureOpenAiChatModel or AzureOpenAiStreamingChatModel
        Model-->>Factory: Model instance
        Factory-->>CLI: Return ChatModel/StreamingChatModel
        CLI-->>User: Ready to use
    else Validation Failure
        Factory-->>CLI: Throw exception
        CLI-->>User: Configuration error
    end
```

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • AzureOpenAiChatModelFactory.java: Review the factory class logic for configuration validation, parameter mapping (especially response-format to ResponseFormat), and proper builder construction for both sync and streaming paths. Ensure all 16 config options are correctly handled and validated.
  • pom.xml files: Verify module hierarchy, parent reference, dependency versions, and that langchain4j-azure-open-ai version is correctly sourced from the ${langchain4j.vserion} property.
  • Service registration: Confirm the service provider file is correctly placed and contains the correct factory class name for SPI discovery.

Poem

A rabbit hops with Azure dreams,
New LLMs now stream and gleam,
OpenAI joins the family fold,
Factory magic, configuration told,
🐰 Chat models bloom, futures unfold! ✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 0.00%, which is below the required threshold of 80.00%. Resolution: run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check ✅ Passed — The pull request title 'feat: add Azure OpenAi LLM module' accurately reflects the main change: introducing a new Azure OpenAI LLM module with factory implementation, configuration options, and Maven module setup across multiple files.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 0

🧹 Nitpick comments (1)
dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (1)

115-120: Tighten response-format handling to fail fast on invalid values

Right now any value other than "JSON" (case-insensitive) is silently treated as TEXT. That means typos like "jsno" or unsupported formats will degrade to text without any signal.

Consider validating explicitly for text and json and throwing on anything else, e.g.:

```diff
-        config.getOptional(RESPONSE_FORMAT).ifPresent(format -> {
-            ResponseFormat responseFormat = format.equalsIgnoreCase("JSON")
-                    ? ResponseFormat.JSON : ResponseFormat.TEXT;
-            builder.responseFormat(responseFormat);
-        });
+        config.getOptional(RESPONSE_FORMAT).ifPresent(format -> {
+            if ("json".equalsIgnoreCase(format)) {
+                builder.responseFormat(ResponseFormat.JSON);
+            } else if ("text".equalsIgnoreCase(format)) {
+                builder.responseFormat(ResponseFormat.TEXT);
+            } else {
+                throw new IllegalArgumentException(
+                        "Unsupported value for '" + RESPONSE_FORMAT.key()
+                                + "'. Supported values: 'text', 'json'");
+            }
+        });
```

Apply the same pattern in createStream(...) to keep behavior consistent.

Also applies to: 173-177, 209-213
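The suggested fail-fast mapping can be exercised in isolation. The ResponseFormat enum below is a local stand-in for langchain4j's type, so the sketch compiles on its own:

```java
// Self-contained illustration of the fail-fast mapping the review suggests.
// ResponseFormat is a local stand-in for langchain4j's enum of the same name.
public class ResponseFormatMapping {

    enum ResponseFormat { TEXT, JSON }

    static ResponseFormat parseResponseFormat(String format) {
        if ("json".equalsIgnoreCase(format)) {
            return ResponseFormat.JSON;
        } else if ("text".equalsIgnoreCase(format)) {
            return ResponseFormat.TEXT;
        }
        // A typo like "jsno" now fails loudly instead of degrading to TEXT.
        throw new IllegalArgumentException(
                "Unsupported value for 'response-format': '" + format
                        + "'. Supported values: 'text', 'json'");
    }

    public static void main(String[] args) {
        System.out.println(parseResponseFormat("JSON")); // prints "JSON"
        try {
            parseResponseFormat("jsno");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```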

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 7325899 and fda9ea3.

📒 Files selected for processing (5)
  • dat-cli/pom.xml (1 hunks)
  • dat-llms/dat-llm-azure-openai/pom.xml (1 hunks)
  • dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (1 hunks)
  • dat-llms/dat-llm-azure-openai/src/main/resources/META-INF/services/ai.dat.core.factories.ChatModelFactory (1 hunks)
  • dat-llms/pom.xml (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (3)
dat-core/src/main/java/ai/dat/core/configuration/ConfigOption.java (1)
  • ConfigOption (18-174)
dat-core/src/main/java/ai/dat/core/configuration/ConfigOptions.java (1)
  • ConfigOptions (42-237)
dat-core/src/main/java/ai/dat/core/utils/FactoryUtil.java (1)
  • FactoryUtil (29-376)
🪛 OSV Scanner (2.2.4)
dat-llms/dat-llm-azure-openai/pom.xml

[HIGH] 1-1: io.netty:netty-codec-http2 4.1.118.Final: Netty affected by MadeYouReset HTTP/2 DDoS vulnerability

(GHSA-prj3-ccx8-p6x4)


[HIGH] 1-1: net.minidev:json-smart 2.5.0: Netplex Json-smart Uncontrolled Recursion vulnerability

(GHSA-pq2g-wx69-c263)


[HIGH] 1-1: org.apache.httpcomponents.client5:httpclient5 5.4.1: Apache HttpClient disables domain checks

(GHSA-73m2-qfq3-56cx)


[HIGH] 1-1: org.yaml:snakeyaml 1.33: SnakeYaml Constructor Deserialization Remote Code Execution

(GHSA-mjmj-j48q-9wg2)

🔇 Additional comments (6)
dat-llms/pom.xml (1)

21-21: Azure LLM module registration looks consistent

Adding dat-llm-azure-openai to the <modules> list matches the existing pattern for other LLM modules and should integrate cleanly into the multi-module build.

dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (2)

24-24: Factory wiring and option exposure look solid

The factory identifier, required/optional option sets, and use of FactoryUtil.validateFactoryOptions plus validateConfigOptions are consistent with the core factory pattern and give good coverage over Azure OpenAI configuration.

Also applies to: 135-151


224-237: Numeric option validation is appropriate

Bounds checks for temperature, top-p, max-retries, and max-tokens are reasonable and will surface bad configs early via IllegalArgumentException, which will be wrapped by FactoryUtil into a clear error for callers.
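The general shape of such bounds checks can be sketched as follows. The exact ranges in validateConfigOptions() are not shown on this page, so the values below (0–2 for temperature, 0–1 for top-p, positive max-tokens) are assumptions based on Azure OpenAI's documented parameter ranges:

```java
// Sketch of numeric option validation; the ranges are assumptions, not
// copied from the PR's validateConfigOptions() implementation.
public class NumericOptionValidation {

    static void checkRange(String key, double value, double min, double max) {
        if (value < min || value > max) {
            throw new IllegalArgumentException(
                    "'" + key + "' must be in [" + min + ", " + max + "], got " + value);
        }
    }

    static void checkPositive(String key, int value) {
        if (value <= 0) {
            throw new IllegalArgumentException(
                    "'" + key + "' must be > 0, got " + value);
        }
    }

    public static void main(String[] args) {
        checkRange("temperature", 0.2, 0.0, 2.0);
        checkRange("top-p", 0.9, 0.0, 1.0);
        checkPositive("max-tokens", 1024);
        checkPositive("max-retries", 3);
        System.out.println("all numeric options valid");
    }
}
```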

dat-llms/dat-llm-azure-openai/src/main/resources/META-INF/services/ai.dat.core.factories.ChatModelFactory (1)

1-1: ServiceLoader registration matches factory class

The service entry correctly points to ai.dat.llm.azure.AzureOpenAiChatModelFactory, so ChatModelFactoryManager should be able to discover the Azure provider without additional wiring.

dat-cli/pom.xml (1)

128-132: CLI dependency on Azure LLM is aligned with existing pattern

Adding dat-llm-azure-openai next to the other dat-llm-* dependencies with ${project.version} keeps the CLI packaging consistent and ensures the new factory is available at runtime.

dat-llms/dat-llm-azure-openai/pom.xml (1)

21-24: Build will not fail – property is defined in parent POM

The original concern is incorrect. The property ${langchain4j.vserion} is defined in the parent POM at line 69 as <langchain4j.vserion>1.4.0</langchain4j.vserion>, so Maven will successfully resolve the dependency. The build will not break.

However, the typo in the property name ("vserion" instead of "version") is real and exists consistently across the parent POM definition and this file's usage. While this won't cause a build failure, it remains a code quality issue worth considering for future cleanup (renaming the property to the standard "version" suffix across all references).

Likely an incorrect or invalid review comment.
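If that cleanup is ever done, it amounts to renaming the property at its definition and at every usage site. A hypothetical fragment (the groupId and version are taken from the review context, not verified against the repo):

```xml
<!-- Parent POM: rename the typo'd property -->
<properties>
    <langchain4j.version>1.4.0</langchain4j.version>
</properties>

<!-- dat-llm-azure-openai/pom.xml: update the reference to match -->
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-azure-open-ai</artifactId>
    <version>${langchain4j.version}</version>
</dependency>
```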

@junjiem junjiem merged commit 001f4e9 into main Nov 14, 2025
1 check passed
@junjiem junjiem deleted the feat/azure_openai_llm branch November 14, 2025 05:55