fix: Multi-turn tools use error when using streaming output #1990
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Conversation
Reviewer's Guide

Refactors streaming chat methods to properly support multi-turn tool calls in streaming output and cleans up formatting.

Sequence diagram for multi-turn tool calls in streaming chat

sequenceDiagram
participant User as actor User
participant ChatSource as Chat Source (Gemini/Anthropic)
participant ToolCallsResult as ToolCallsResult
participant LLM as LLM Model
User->>ChatSource: text_chat_stream(..., tool_calls_result)
alt tool_calls_result is not None
ChatSource->>ToolCallsResult: to_openai_messages()
ChatSource->>ChatSource: context_query.extend(...)
end
ChatSource->>LLM: Send context_query (with tool call messages)
LLM-->>ChatSource: Streaming response
ChatSource-->>User: Streamed output
Class diagram for ToolCallsResult handling in streaming chat

classDiagram
class ToolCallsResult {
+to_openai_messages()
}
class ChatSource {
+text_chat_stream(..., tool_calls_result)
}
ChatSource --> ToolCallsResult : uses
class LLMResponse
ChatSource ..> LLMResponse : yields
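For readers who want to see what the `to_openai_messages()` step in the diagrams amounts to, below is a rough sketch of the message shape it is assumed to produce (OpenAI-style tool messages). The field names and class body are illustrative assumptions, not the actual astrbot implementation.

```python
from dataclasses import dataclass, field


@dataclass
class ToolCallsResult:
    """Illustrative stand-in for astrbot's ToolCallsResult (shape is assumed)."""

    tool_calls: list[dict] = field(default_factory=list)    # calls the assistant issued
    tool_outputs: list[dict] = field(default_factory=list)  # {"id": ..., "content": ...}

    def to_openai_messages(self) -> list[dict]:
        # One assistant message carrying the tool calls, then one "tool" message
        # per result, so extending context_query replays the whole tool turn.
        messages = [{"role": "assistant", "content": None, "tool_calls": self.tool_calls}]
        for output in self.tool_outputs:
            messages.append(
                {
                    "role": "tool",
                    "tool_call_id": output["id"],
                    "content": output["content"],
                }
            )
        return messages
```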
Hey @Soulter - I've reviewed your changes - here's some feedback:
- The removal of explicit type hints in `text_chat_stream` reduces type clarity; consider reintroducing annotations or verifying that the new signature still aligns with all callers.
- You're duplicating the same `tool_calls_result` logic in both Gemini and Anthropic sources; extract it into a shared helper to reduce repetition (see the sketch below).
- There are several purely stylistic formatting tweaks (e.g. wrapping multi-line literals) that bloat the diff; consider limiting formatting changes so the functional fix stands out more clearly.
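As a concrete illustration of the shared-helper suggestion above, something along these lines could be called from both sources. The helper name, its location, and the exact `context_query` shape are assumptions for the sketch, not code from this PR.

```python
def extend_with_tool_calls_results(context_query: list[dict], tool_calls_result) -> None:
    """Append OpenAI-style messages for one ToolCallsResult or a list of them.

    Hypothetical shared helper; both the Gemini and Anthropic sources could call
    it instead of duplicating the isinstance branch.
    """
    if not tool_calls_result:
        return
    items = tool_calls_result if isinstance(tool_calls_result, list) else [tool_calls_result]
    for item in items:
        context_query.extend(item.to_openai_messages())
```

Each `text_chat_stream` would then shrink its branch to a single `extend_with_tool_calls_results(context_query, tool_calls_result)` call.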
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- The removal of explicit type hints in `text_chat_stream` reduces type clarity—consider reintroducing annotations or verifying that the new signature still aligns with all callers.
- You’re duplicating the same `tool_calls_result` logic in both Gemini and Anthropic sources—extract it into a shared helper to reduce repetition.
- There are several purely stylistic formatting tweaks (e.g. wrapping multi-line literals) that bloat the diff—consider limiting formatting changes so the functional fix stands out more clearly.
## Individual Comments
### Comment 1
<location> `astrbot/core/provider/sources/gemini_source.py:570` </location>
<code_context>
# tool calls result
if tool_calls_result:
- context_query.extend(tool_calls_result.to_openai_messages())
+ if not isinstance(tool_calls_result, list):
</code_context>
<issue_to_address>
Consider normalizing tool_calls_result to a list before looping to avoid duplicated logic.
```python
# Replace the nested isinstance branch with a single normalized loop
if tool_calls_result:
# Normalize to list so we only have one loop
tcr_items = (
tool_calls_result
if isinstance(tool_calls_result, list)
else [tool_calls_result]
)
for tcr in tcr_items:
context_query.extend(tcr.to_openai_messages())
```
This removes the duplicated `extend` logic while preserving the behavior for both single and multiple ToolCallsResult.
</issue_to_address>
@@ -566,7 +568,11 @@ async def text_chat_stream(
        # tool calls result
        if tool_calls_result:
issue (complexity): Consider normalizing tool_calls_result to a list before looping to avoid duplicated logic.
# Replace the nested isinstance branch with a single normalized loop
if tool_calls_result:
# Normalize to list so we only have one loop
tcr_items = (
tool_calls_result
if isinstance(tool_calls_result, list)
else [tool_calls_result]
)
for tcr in tcr_items:
context_query.extend(tcr.to_openai_messages())
This removes the duplicated `extend` logic while preserving the behavior for both single and multiple ToolCallsResult.
Pull Request Overview
This PR fixes errors in multi-turn tool usage during streaming outputs by updating how tool call results are handled and streamlining function signatures and data formatting.
- Allow `tool_calls_result` to be a list of results and handle each appropriately
- Simplify and reformat the `text_chat_stream` method signature and message assembly
- Remove an unused import and tidy up append formatting
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| astrbot/core/provider/sources/gemini_source.py | Adjusted the `text_chat_stream` signature, added list handling for `tool_calls_result`, removed an unused import, and reformatted append calls |
| astrbot/core/provider/sources/anthropic_source.py | Added list handling for `tool_calls_result` in the streaming chat method |
Comments suppressed due to low confidence (2)
astrbot/core/provider/sources/gemini_source.py:570 (`if tool_calls_result:`)
- Change the conditional to `if tool_calls_result is not None:` so that an empty list (`[]`) is handled correctly instead of being skipped.

astrbot/core/provider/sources/anthropic_source.py:302 (`if tool_calls_result:`)
- Use `if tool_calls_result is not None:` to ensure empty lists are processed and not skipped.
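A small standalone example of the difference these two suppressed comments point at; `handle` is a placeholder function, not project code.

```python
def handle(tool_calls_result=None):
    # An empty list is falsy, so the truthiness check skips it even though it
    # was explicitly passed; the None check treats [] as "provided but empty".
    if tool_calls_result:
        print("truthy check ran")
    if tool_calls_result is not None:
        print("is-not-None check ran")


handle([])    # only "is-not-None check ran" is printed
handle(None)  # nothing is printed
```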
prompt,
session_id=None,
image_urls=None,
func_tool=None,
contexts=None,
system_prompt=None,
tool_calls_result=None,
[nitpick] Consider reintroducing type annotations (e.g., `prompt: str`, `session_id: Optional[str]`, `image_urls: Optional[list[str]]`, `tool_calls_result: ToolCallsResult | list[ToolCallsResult]`) to improve readability and type safety.
Suggested change:

prompt: str,
session_id: Optional[str] = None,
image_urls: Optional[list[str]] = None,
func_tool: Optional[FuncCall] = None,
contexts: Optional[list[dict]] = None,
system_prompt: Optional[str] = None,
tool_calls_result: ToolCallsResult | list[ToolCallsResult] = None,
Resolves #1989

Check the corresponding entries in the `requirements.txt` and `pyproject.toml` files.

Summary by Sourcery

Fix streaming output for multi-turn tool calls by handling lists of tool call results in the Gemini and Anthropic chat sources.

Bug Fixes:
- Handle `tool_calls_result` as a list so that context messages can be extended in streaming mode to support multi-turn tools.

Enhancements:
- Remove static type hints from the `text_chat_stream` signature to improve flexibility.
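To make the multi-turn flow concrete, here is a caller-side sketch of how a list of tool-call results might be accumulated across turns and passed back into the streaming call. The `text_chat_stream` parameter names mirror this PR, but the loop, the `wants_tools` predicate, and the `execute_tools` helper are illustrative placeholders rather than astrbot's actual agent runner.

```python
async def run_multi_turn(provider, prompt, execute_tools, wants_tools, max_rounds=3):
    """Illustrative loop: feed every prior ToolCallsResult back on the next turn."""
    tool_calls_results = []                      # one entry per completed tool round
    for _ in range(max_rounds):                  # bound the rounds in this sketch
        last_chunk = None
        async for chunk in provider.text_chat_stream(
            prompt,
            tool_calls_result=tool_calls_results or None,
        ):
            last_chunk = chunk                   # forward chunks to the user as they stream
        if last_chunk is None or not wants_tools(last_chunk):
            break                                # plain answer: no further tool turns
        tool_calls_results.append(execute_tools(last_chunk))
```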