[feature][java][chatmodels] Add Azure AI chat model integration #290
Conversation
wenjin272
left a comment
Hi, @tsaiggo, thanks for your contribution. I left some minor comments.
...main/java/org/apache/flink/agents/integrations/chatmodels/azureai/AzureAIChatModelSetup.java
```java
this.prompt = descriptor.getArgument("prompt");
this.tools = (java.util.List<String>) descriptor.getArgument("tools");
this.key = descriptor.getArgument("key");
this.endpoint = descriptor.getArgument("endpoint");
```
`key` and `endpoint` should be arguments of `AzureAIChatModelConnection`; different `AzureAIChatModelSetup` instances can then share the same `AzureAIChatModelConnection`.
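The suggested split can be sketched framework-free. The class names below are simplified stand-ins for the PR's `AzureAIChatModelConnection` and `AzureAIChatModelSetup`, and the descriptor is modeled as a plain `Map` purely for illustration:

```java
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for AzureAIChatModelConnection: the credential and
// endpoint live here, on the shared connection object.
class AzureConnectionSketch {
    final String key;      // credential belongs to the connection
    final String endpoint; // service URL belongs to the connection

    AzureConnectionSketch(Map<String, Object> descriptor) {
        this.key = (String) descriptor.get("key");
        this.endpoint = (String) descriptor.get("endpoint");
    }
}

// Hypothetical stand-in for AzureAIChatModelSetup: per-model arguments only;
// several setups can reference one shared connection.
class AzureSetupSketch {
    final AzureConnectionSketch connection;
    final String model;
    final String prompt;
    final List<String> tools;

    @SuppressWarnings("unchecked")
    AzureSetupSketch(AzureConnectionSketch connection, Map<String, Object> descriptor) {
        this.connection = connection;
        this.model = (String) descriptor.get("model");
        this.prompt = (String) descriptor.get("prompt");
        this.tools = (List<String>) descriptor.get("tools");
    }
}
```

With this split, two setups configured for different models reuse a single connection, so credentials are declared once.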
```java
java.util.Map<String, Object> params = new java.util.HashMap<>();
params.put("model", model);
params.put("prompt", prompt);
params.put("tools", tools);
```
The base class `BaseChatModelSetup` will process the prompt and tools directly; there is no need to pass them as parameters here.
```java
case USER:
    return new ChatRequestUserMessage(content);
case ASSISTANT:
    return new ChatRequestAssistantMessage(content);
```
Here, only the content of ChatRequestAssistantMessage is set, but no toolCalls are set.
```java
case ASSISTANT:
    return new ChatRequestAssistantMessage(content);
case TOOL:
    return new ChatRequestToolMessage(content);
```
Here, the content will be set as the `toolCallId` of `ChatRequestToolMessage`, so we should instead pass the function call id generated by the Azure LLM:
```java
ChatRequestToolMessage msg =
        new ChatRequestToolMessage(
                (String) message.getExtraArgs().get("externalId"));
msg.setContent(content);
return msg;
```
and the externalId should be set in convertToAgentsTools:
```java
private List<Map<String, Object>> convertToAgentsTools(
        List<ChatCompletionsToolCall> azureToolCalls) {
    ...
    final Map<String, Object> call =
            Map.of(
                    "id", functionCall.getId(),
                    "original_id", functionCall.getId(),
                    "type", "function",
                    "function",
                    Map.of(
                            "name", functionCall.getFunction().getName(),
                            "arguments",
                            functionCall.getFunction().getArguments()));
    toolCalls.add(call);
    ...
}
```
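The id round-trip the reviewer describes can be sketched without the Azure SDK. `FunctionCall` below is a hypothetical stand-in for the SDK's tool-call type; the point is that the model-generated id is preserved in the converted map so the later tool-response message can echo it back:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Framework-free sketch of preserving the model-generated tool-call id.
// FunctionCall is a hypothetical stand-in for the Azure SDK's tool-call type.
class ToolCallIdSketch {
    record FunctionCall(String id, String name, String arguments) {}

    static List<Map<String, Object>> convertToAgentsTools(List<FunctionCall> azureToolCalls) {
        List<Map<String, Object>> toolCalls = new ArrayList<>();
        for (FunctionCall fc : azureToolCalls) {
            toolCalls.add(Map.of(
                    "id", fc.id(),
                    // kept so the tool message can later use it as its toolCallId
                    "original_id", fc.id(),
                    "type", "function",
                    "function", Map.of("name", fc.name(), "arguments", fc.arguments())));
        }
        return toolCalls;
    }
}
```

Dropping the id at this conversion step would make it impossible to correlate a tool's response with the call the model issued.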
...java/org/apache/flink/agents/integrations/chatmodels/azureai/AzureAIChatModelConnection.java
Pushed new changes, please re-review this PR. Thanks @wenjin272
```java
    super(descriptor, getResource);
}

// For any other specific parameters, please refer to ChatCompletionsOptions @Override
```
The @Override annotation is commented out.
Oops, good catch! 😂 Blame it on the late-night brain fog. Will fix it
resolved
Looks like the issue with the Python-related test case has resolved itself.

Hi, @tsaiggo, this PR looks good to me. Since merging a PR requires Apache committer privileges, @xintongsong, could you take a look at this PR at your convenience?
xintongsong
left a comment
@tsaiggo, thanks for your contribution.
The PR overall looks good. I just have one minor comment.
Additionally, I think this new support for the Azure chat model needs to be documented. We can either do that in this PR or create a separate issue for it.
examples/pom.xml
```xml
<groupId>org.apache.flink</groupId>
<artifactId>flink-agents-integrations-chat-models-azureai</artifactId>
<version>${project.version}</version>
</dependency>
```
Why do we need this dependency in examples/?
I prefer to keep documentation updates in a separate PR.
I added the azureaichatmodel to the examples module because I intended it to be our first example of a cloud-hosted model. My original plan was to add specific use cases for cloud-hosted models, but I haven't come up with any good ideas for that yet, so it's temporarily on hold. If you think this addition should be removed for now, I'm happy to take care of that. @xintongsong
removed
Yes, I think the dependency should be introduced only when it's needed. Thanks for addressing my comments.
FYI, I just created #305 for the documentation work.
xintongsong
left a comment
LGTM. Merging.
Linked issue: #286
Purpose of change
This PR adds support for the Azure AI chat model as a foundational model for the agent.
Tests
I added one test to verify this change, but I ran into some issues while running the test cases:
- When running AgentWithOllamaExample, the output stream didn't produce any data. Consequently, the same thing happened when I tested my new AgentWithAzureAIExample. My assumption is that this isn't related to the chat model implementation itself, but is more likely a higher-level issue (perhaps in the core agent logic or streaming setup).
- There is an existing ReActAgentExample. I modified it locally to use my Azure AI implementation, and it worked perfectly. However, I did not commit this change, as my PR should not introduce breaking changes to existing examples.
- I added AgentWithAzureAIExample and leveraged the MockChatModel (from AgentWithResource) to add mock tests. The "happy path" for this mock test passes successfully.
API
No
Documentation
Yes, exactly. It's because we now offer more model options.