feat: add Azure OpenAi LLM module #134
Walkthrough

A new Azure OpenAI LLM provider module is being added to the dat-llms project. The changes include creating a new Maven module with a factory class that implements ChatModelFactory to handle Azure OpenAI configuration, model creation, and streaming variants. The factory is registered as a service provider and integrated into the CLI dependencies.
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant CLI
    participant Factory as AzureOpenAiChatModelFactory
    participant Config
    participant Model as ChatModel/StreamingChatModel
    User->>CLI: Request Azure OpenAI model
    CLI->>Factory: create() or createStream()
    Factory->>Config: Extract configuration
    Config-->>Factory: endpoint, api-key, deployment-id, etc.
    Factory->>Factory: validateConfigOptions()
    alt Validation Success
        Factory->>Model: Build AzureOpenAiChatModel or AzureOpenAiStreamingChatModel
        Model-->>Factory: Model instance
        Factory-->>CLI: Return ChatModel/StreamingChatModel
        CLI-->>User: Ready to use
    else Validation Failure
        Factory-->>CLI: Throw exception
        CLI-->>User: Configuration error
    end
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
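For orientation, here is a minimal sketch of the `create()` path the diagram describes. This is not the project's actual code: the `Map`-based config access, the `require` helper, and the option keys are assumptions taken from the diagram, while the builder methods come from langchain4j's Azure OpenAI module.

```java
import java.util.Map;

import dev.langchain4j.model.azure.AzureOpenAiChatModel;
import dev.langchain4j.model.chat.ChatModel;

// Hypothetical sketch of the create() flow; the real factory resolves
// typed ConfigOptions and validates them via FactoryUtil before building.
public final class AzureOpenAiFactorySketch {

    public ChatModel create(Map<String, String> config) {
        // Configuration values named in the sequence diagram above
        String endpoint = require(config, "endpoint");
        String apiKey = require(config, "api-key");
        String deploymentId = require(config, "deployment-id");

        // Delegate to the langchain4j Azure OpenAI builder
        return AzureOpenAiChatModel.builder()
                .endpoint(endpoint)
                .apiKey(apiKey)
                .deploymentName(deploymentId)
                .build();
    }

    // Hypothetical helper standing in for the factory's real validation
    private static String require(Map<String, String> config, String key) {
        String value = config.get(key);
        if (value == null || value.isBlank()) {
            throw new IllegalArgumentException("Missing required option: " + key);
        }
        return value;
    }
}
```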
Pre-merge checks
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 0
🧹 Nitpick comments (1)
dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (1)
115-120: Tighten `response-format` handling to fail fast on invalid values

Right now any value other than "JSON" (case-insensitive) is silently treated as TEXT. That means typos like "jsno" or unsupported formats will degrade to text without any signal. Consider validating explicitly for `text` and `json` and throwing on anything else, e.g.:

```diff
-        config.getOptional(RESPONSE_FORMAT).ifPresent(format -> {
-            ResponseFormat responseFormat = format.equalsIgnoreCase("JSON")
-                    ? ResponseFormat.JSON : ResponseFormat.TEXT;
-            builder.responseFormat(responseFormat);
-        });
+        config.getOptional(RESPONSE_FORMAT).ifPresent(format -> {
+            if ("json".equalsIgnoreCase(format)) {
+                builder.responseFormat(ResponseFormat.JSON);
+            } else if ("text".equalsIgnoreCase(format)) {
+                builder.responseFormat(ResponseFormat.TEXT);
+            } else {
+                throw new IllegalArgumentException(
+                        "Unsupported value for '" + RESPONSE_FORMAT.key() +
+                                "'. Supported values: 'text', 'json'");
+            }
+        });
```

Apply the same pattern in `createStream(...)` to keep behavior consistent.

Also applies to: 173-177, 209-213
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- dat-cli/pom.xml (1 hunks)
- dat-llms/dat-llm-azure-openai/pom.xml (1 hunks)
- dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (1 hunks)
- dat-llms/dat-llm-azure-openai/src/main/resources/META-INF/services/ai.dat.core.factories.ChatModelFactory (1 hunks)
- dat-llms/pom.xml (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (3)
- dat-core/src/main/java/ai/dat/core/configuration/ConfigOption.java (1): ConfigOption (18-174)
- dat-core/src/main/java/ai/dat/core/configuration/ConfigOptions.java (1): ConfigOptions (42-237)
- dat-core/src/main/java/ai/dat/core/utils/FactoryUtil.java (1): FactoryUtil (29-376)
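The `ConfigOption`/`ConfigOptions` pairing referenced above suggests a typed, builder-style option-definition pattern. Purely as an illustration of that style — the method names below are hypothetical and not verified against dat-core:

```java
// Hypothetical illustration only: a typed option definition in the style
// implied by dat-core's ConfigOption/ConfigOptions; the builder method
// names are assumptions, not the actual dat-core API.
public static final ConfigOption<String> ENDPOINT =
        ConfigOptions.key("endpoint")
                .stringType()
                .noDefaultValue()
                .withDescription("Azure OpenAI service endpoint URL");
```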
🪛 OSV Scanner (2.2.4)
dat-llms/dat-llm-azure-openai/pom.xml
[HIGH] 1-1: io.netty:netty-codec-http2 4.1.118.Final: Netty affected by MadeYouReset HTTP/2 DDoS vulnerability
[HIGH] 1-1: net.minidev:json-smart 2.5.0: Netplex Json-smart Uncontrolled Recursion vulnerability
[HIGH] 1-1: org.apache.httpcomponents.client5:httpclient5 5.4.1: Apache HttpClient disables domain checks
[HIGH] 1-1: org.yaml:snakeyaml 1.33: SnakeYaml Constructor Deserialization Remote Code Execution
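If these flagged versions arrive as transitive dependencies of the langchain4j Azure artifact, one common mitigation is to pin patched versions via `dependencyManagement`. This is a sketch, not a verified fix for this project; the exact fixed versions must be checked against the corresponding advisories:

```xml
<!-- Sketch only: pin a patched transitive version in the parent or module
     POM; verify the fixed version against the advisory before adopting. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.yaml</groupId>
      <artifactId>snakeyaml</artifactId>
      <!-- 2.x addresses the Constructor deserialization RCE class of issues -->
      <version>2.2</version>
    </dependency>
    <!-- analogous overrides for netty-codec-http2, json-smart, httpclient5 -->
  </dependencies>
</dependencyManagement>
```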
🔇 Additional comments (6)
dat-llms/pom.xml (1)
21-21: Azure LLM module registration looks consistent

Adding `dat-llm-azure-openai` to the `<modules>` list matches the existing pattern for other LLM modules and should integrate cleanly into the multi-module build.
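For reference, the registration being described is the standard Maven aggregator entry; a sketch of its expected shape:

```xml
<!-- dat-llms/pom.xml: aggregator entry for the new module (sketch) -->
<modules>
  <!-- ...existing dat-llm-* modules... -->
  <module>dat-llm-azure-openai</module>
</modules>
```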
dat-llms/dat-llm-azure-openai/src/main/java/ai/dat/llm/azure/AzureOpenAiChatModelFactory.java (2)

24-24: Factory wiring and option exposure look solid

The factory identifier, required/optional option sets, and use of `FactoryUtil.validateFactoryOptions` plus `validateConfigOptions` are consistent with the core factory pattern and give good coverage over Azure OpenAI configuration.

Also applies to: 135-151
224-237: Numeric option validation is appropriate

Bounds checks for `temperature`, `top-p`, `max-retries`, and `max-tokens` are reasonable and will surface bad configs early via `IllegalArgumentException`, which will be wrapped by `FactoryUtil` into a clear error for callers.
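As a plain-Java illustration of the fail-fast pattern praised here — the specific ranges are assumptions based on typical OpenAI-style limits, not copied from the factory:

```java
// Sketch of fail-fast numeric validation; the bounds are assumptions,
// not the factory's actual limits.
static void validateNumericOptions(double temperature, double topP,
                                   int maxRetries, Integer maxTokens) {
    if (temperature < 0.0 || temperature > 2.0) {
        throw new IllegalArgumentException("temperature must be in [0.0, 2.0]");
    }
    if (topP < 0.0 || topP > 1.0) {
        throw new IllegalArgumentException("top-p must be in [0.0, 1.0]");
    }
    if (maxRetries < 0) {
        throw new IllegalArgumentException("max-retries must be >= 0");
    }
    if (maxTokens != null && maxTokens <= 0) {
        throw new IllegalArgumentException("max-tokens must be positive");
    }
}
```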
dat-llms/dat-llm-azure-openai/src/main/resources/META-INF/services/ai.dat.core.factories.ChatModelFactory (1)

1-1: ServiceLoader registration matches factory class

The service entry correctly points to `ai.dat.llm.azure.AzureOpenAiChatModelFactory`, so `ChatModelFactoryManager` should be able to discover the Azure provider without additional wiring.
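The discovery mechanism here is the standard JDK `ServiceLoader`. Roughly how a manager class could locate the provider — note that `factoryIdentifier()` and the `"azure-openai"` identifier value are assumptions inferred from this review, not verified dat-core API:

```java
import java.util.ServiceLoader;

// Sketch of ServiceLoader-based discovery as ChatModelFactoryManager
// might perform it; factoryIdentifier() is an assumed accessor.
static ChatModelFactory discover(String identifier) {
    return ServiceLoader.load(ChatModelFactory.class).stream()
            .map(ServiceLoader.Provider::get)
            .filter(f -> identifier.equals(f.factoryIdentifier()))
            .findFirst()
            .orElseThrow(() -> new IllegalStateException(
                    "No ChatModelFactory found for identifier: " + identifier));
}
```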
dat-cli/pom.xml (1)

128-132: CLI dependency on Azure LLM is aligned with existing pattern

Adding `dat-llm-azure-openai` next to the other `dat-llm-*` dependencies with `${project.version}` keeps the CLI packaging consistent and ensures the new factory is available at runtime.
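The dependency shape being described is presumably the usual intra-project one; a sketch, with the groupId inferred from the `ai.dat` package prefix rather than confirmed:

```xml
<!-- dat-cli/pom.xml: sketch of the added dependency; groupId assumed -->
<dependency>
  <groupId>ai.dat</groupId>
  <artifactId>dat-llm-azure-openai</artifactId>
  <version>${project.version}</version>
</dependency>
```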
dat-llms/dat-llm-azure-openai/pom.xml (1)

21-24: Build will not fail – property is defined in parent POM

The original concern is incorrect. The property `${langchain4j.vserion}` is defined in the parent POM at line 69 as `<langchain4j.vserion>1.4.0</langchain4j.vserion>`, so Maven will successfully resolve the dependency. The build will not break.

However, the typo in the property name ("vserion" instead of "version") is real and appears consistently in both the parent POM definition and this file's usage. While it won't cause a build failure, it remains a code quality issue worth cleaning up later by renaming the property to the standard "version" suffix across all references, as sketched below.
Likely an incorrect or invalid review comment.