Propagate reasoning_content through LLM pipeline (fixes #563) #565
Conversation
DeepSeek v4-pro with thinking mode rejects requests when `reasoning_content` from prior assistant turns is not echoed back. This change:

- Adds `ReasoningContent` to `ChatMessage`, `LLMResponse`, and `LLMStreamChunk`
- Propagates reasoning content through `ChatRuntime` streaming rounds
- Appends `reasoning_content` to conversation history for multi-turn conversations
- Implements `ExtractReasoningContent` in `MEAILLMProvider`
- Wires reasoning content into non-streaming `ConvertResponse`

Closes #563
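The three model additions can be sketched roughly as follows. This is an illustrative shape only, inferred from the file list in this PR; surrounding members are elided and property placement may differ in the actual code:

```csharp
// Sketch of the new fields (names from this PR; classes abbreviated).
public class ChatMessage
{
    // Thinking-mode reasoning attached to an assistant turn; null otherwise.
    public string? ReasoningContent { get; set; }
}

public class LLMResponse
{
    // Full reasoning trace from a non-streaming reply.
    public string? ReasoningContent { get; set; }
}

public class LLMStreamChunk
{
    // Incremental reasoning delta emitted while streaming.
    public string? DeltaReasoningContent { get; set; }
}
```

Keeping the fields nullable lets providers that emit no reasoning leave them unset without changing any existing call sites.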
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: b8128cb977
```csharp
// ChatResponse.Messages contains all reply messages
var lastMessage = response.Messages.LastOrDefault();
var content = ExtractMessageText(lastMessage);
var reasoningContent = ExtractReasoningContent(lastMessage);
```
Include reasoning content in outbound MEAI chat messages
This change extracts and stores ReasoningContent from responses, but it does not add the corresponding outbound serialization in ConvertMessages (which still only emits text/content parts/tool calls). In multi-turn thinking-mode sessions, assistant turns are sent back without reasoning_content, so providers that require replaying reasoning (the DeepSeek 400 case described in this commit) will still fail on follow-up requests.
Fixed in 2690a61: ConvertMessages() now emits TextReasoningContent for assistant messages with ReasoningContent, including the tool-calls branch.
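The fix described in this thread can be sketched as below. This is a hedged illustration, not the repo's exact code: the loop shape, `ToChatRole`, and `meaiMessages` are assumptions, while `TextReasoningContent` is the Microsoft.Extensions.AI content-part type named in the resolution comment:

```csharp
// Inside ConvertMessages(): build the outbound MEAI message list,
// replaying stored reasoning on assistant turns.
foreach (var msg in request.Messages)
{
    var meaiMsg = new Microsoft.Extensions.AI.ChatMessage(ToChatRole(msg.Role), msg.Content);

    // Providers like DeepSeek thinking mode require prior reasoning to be
    // echoed back; emit it in both the plain and tool-calls branches.
    if (msg.Role == "assistant" && !string.IsNullOrEmpty(msg.ReasoningContent))
    {
        meaiMsg.Contents.Add(new TextReasoningContent(msg.ReasoningContent));
    }

    meaiMessages.Add(meaiMsg);
}
```

The key point is that the outbound path must mirror the inbound extraction: capturing reasoning on responses is not enough if `ConvertMessages` never serializes it back out.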
```diff
- AppendAssistantMessage(messages, pendingHistoryMessages, parsed.CleanedContent, toolCalls: null);
+ AppendAssistantMessage(messages, pendingHistoryMessages, parsed.CleanedContent, reasoningContent: null, toolCalls: null);
```
Preserve reasoning in parsed tool-call fallback history
In the DSML/XML fallback path, assistant history is appended with reasoningContent: null even when the streamed round produced reasoning. That drops the reasoning trace before the next LLM round, so conversations that hit this fallback can still violate providers that require prior reasoning_content to be echoed back, causing the same follow-up request failures this patch is trying to prevent.
Fixed in 2690a61: DSML fallback path now passes roundResult.ReasoningContent instead of null to AppendAssistantMessage.
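The corrected fallback call would look roughly like this (a sketch based on the resolution comment; `roundResult` is the name used in this thread for the streamed round's accumulated result, and the full parameter list is assumed from the diff above):

```csharp
// DSML/XML fallback path: preserve the round's reasoning trace in history
// instead of dropping it before the next LLM round.
AppendAssistantMessage(
    messages,
    pendingHistoryMessages,
    parsed.CleanedContent,
    reasoningContent: roundResult.ReasoningContent, // was: null
    toolCalls: null);
```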
**eanzhao** left a comment
Review
🔴 Blocking: `ConvertMessages()` does not send `ReasoningContent` back to DeepSeek

The PR correctly captures `ReasoningContent` and stores it in `ChatMessage`, but `MEAILLMProvider.ConvertMessages()` (MEAILLMProvider.cs:229) never reads `msg.ReasoningContent` when building the outgoing request.

- Inbound: DeepSeek returns reasoning → captured ✓ → stored ✓ → appended to history ✓
- Outbound: next request → `ConvertMessages()` builds the request → `ReasoningContent` not included → never sent back to DeepSeek

The error reported in issue #563 says verbatim that reasoning_content must be passed back to the API. Without outbound replay, the HTTP 400 will still occur.

Fix path: in `ConvertMessages()`, when `msg.Role == "assistant"` and `msg.ReasoningContent` is non-empty, append the reasoning to `meaiMsg.Contents` as a `TextReasoningContent`.
🟡 Other points to fix

- `ChatHistory.Export/Import` + `SerializableMessage` — serialization does not include `ReasoningContent`, so persisted history loses reasoning across sessions/reloads.
- `ChatRuntime.BuildSyntheticChunks()` (L843) — converting a non-streaming response into synthetic chunks never emits `DeltaReasoningContent`, so reasoning from Terminate/fallback paths never reaches streaming consumers.
- `ToolCallLoop.cs` — several `ChatMessage.Assistant(...)` call sites (L114/147/153/171/189/225/244/248) do not pass reasoning, so it is lost in multi-round tool-call conversations.
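The serialization gap in the first bullet could be closed along these lines. A minimal sketch only: the actual `SerializableMessage` shape in this repo may differ, and the field name is taken from the model change in this PR:

```csharp
// History-persistence DTO with the reasoning field added, so Export/Import
// round-trips no longer drop thinking-mode reasoning.
public sealed class SerializableMessage
{
    public string Role { get; set; } = string.Empty;
    public string Content { get; set; } = string.Empty;

    // Newly persisted. Nullable so exports from providers without
    // reasoning (and old exports) still deserialize cleanly.
    public string? ReasoningContent { get; set; }
}
```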
✅ Correct parts

- The `ChatMessage.ReasoningContent` and `LLMResponse.ReasoningContent` model changes are correct.
- Streaming accumulation in `StreamLlmRoundCoreAsync` and propagation via `AppendAssistantMessage` are complete.
- Non-streaming extraction in `MEAILLMProvider.ConvertResponse` via `ExtractReasoningContent` is correct.
Codecov Report

❌ Patch coverage is

```diff
@@             Coverage Diff              @@
##        feature/lark-bot     #565    +/-  ##
====================================================
+ Coverage          72.01%   72.08%   +0.07%
====================================================
  Files               1255     1255
  Lines              90723    90759      +36
  Branches           11877    11891      +14
====================================================
+ Hits               65331    65428      +97
+ Misses             20706    20646      -60
+ Partials            4686     4685       -1
```
Flags with carried forward coverage won't be shown.

... and 5 files with indirect coverage changes
…ths, history persistence
Fix Check Verdict — 2/2 inline comments resolved

Remaining gap (not covered by the original comments): the `AppendAssistantMessage(messages, pendingHistoryMessages, finalParsed.CleanedContent, reasoningContent: null, toolCalls: null)` call still passes `reasoningContent: null`. If this path is reachable in thinking-mode sessions, it could cause the same DeepSeek 400 error. Worth a follow-up.

Evaluated by 6 reviewers (deepseek-v4-pro, glm-5.1, mimo-v2.5-pro, kimi, gemini, codex).
Summary

- Adds a `ReasoningContent` field to `ChatMessage`, `LLMResponse`, and `LLMStreamChunk` to carry thinking-mode reasoning content through the pipeline
- Propagates reasoning through `ChatRuntime` streaming rounds and appends it to conversation history for multi-turn conversations
- Implements `ExtractReasoningContent` in `MEAILLMProvider` for both streaming and non-streaming paths
- Addresses the provider error "reasoning_content in the thinking mode must be passed back to the API"

Impact
- `src/Aevatar.AI.Abstractions/LLMProviders/LLMRequest.cs` — `ChatMessage.ReasoningContent`
- `src/Aevatar.AI.Abstractions/LLMProviders/LLMResponse.cs` — `LLMResponse.ReasoningContent`, `LLMStreamChunk.DeltaReasoningContent`
- `src/Aevatar.AI.Core/Chat/ChatRuntime.cs` — streaming round propagation + history append
- `src/Aevatar.AI.LLMProviders.MEAI/MEAILLMProvider.cs` — `ExtractReasoningContent` + `ConvertResponse` wiring

Verification
- `dotnet build` — 0 errors
- `dotnet test test/Aevatar.AI.Tests` — 500 passed, 0 failed
- `bash tools/ci/architecture_guards.sh` — all passed

Closes #563