Description
Problem (one or two sentences)
OpenAiHandler only checks for the `reasoning_content` field in responses (both streaming and non-streaming), but some OpenAI-compatible providers return reasoning in the `reasoning` field instead (following OpenAI's recommendations). This causes reasoning/thinking content not to be displayed when using these providers.
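For concreteness, a sketch of the two chunk shapes involved (illustrative only; neither field is part of the official OpenAI SDK types, and exact payloads vary by provider):

```typescript
// Illustrative only: both fields are provider-specific extensions on the
// chat.completion.chunk delta, not part of the official OpenAI SDK types.

// Provider using the older field (the one OpenAiHandler currently reads):
const deltaWithReasoningContent = {
  role: "assistant",
  reasoning_content: "Let me work through this step by step...",
  content: "",
}

// Provider following the newer convention (e.g. vLLM after 0.15.1):
const deltaWithReasoning = {
  role: "assistant",
  reasoning: "Let me work through this step by step...",
  content: "",
}
```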
Context (who is affected and when)
Who is affected:
- Users of OpenAI-compatible providers that return reasoning in the `reasoning` field
- Users of vLLM versions after 0.15.1 (which removed the `reasoning_content` field)
- Anyone using models that return reasoning in the `reasoning` field instead of `reasoning_content`
When this happens:
- When streaming responses from OpenAI-compatible providers that use the `reasoning` field
- When using any provider that returns `reasoning` instead of `reasoning_content`
Background:
OpenAI recommends using the `reasoning` field for chain-of-thought content in the Chat Completions API. Some providers (like vLLM) have transitioned from `reasoning_content` to `reasoning` to align with these recommendations. vLLM project RFC #27755 documents this transition.
Reproduction steps
- Set up an OpenAI-compatible provider that returns reasoning in the `reasoning` field (e.g., vLLM after version 0.15.1)
- Configure Roo Code to use the provider:

  ```json
  {
    "apiProvider": "openai",
    "openAiBaseUrl": "http://your-vllm-server:port/v1",
    "openAiModelId": "your-model like glm 4.7 or minimax 2.5"
  }
  ```

- Send a message to a reasoning-capable model
- Observe that the "Thinking" section does not appear in the UI (a stream-inspection sketch follows this list)
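To confirm which field the provider actually emits, the raw stream can be inspected directly. Below is a minimal sketch using the `openai` npm client; the base URL, API key, and model name are placeholders:

```typescript
import OpenAI from "openai"

// Placeholders: point these at the OpenAI-compatible server under test.
const client = new OpenAI({
  baseURL: "http://your-vllm-server:port/v1",
  apiKey: "not-needed-for-local-servers",
})

async function inspectReasoningFields() {
  const stream = await client.chat.completions.create({
    model: "your-model",
    messages: [{ role: "user", content: "Briefly explain why the sky is blue." }],
    stream: true,
  })

  for await (const chunk of stream) {
    // Reasoning fields are provider extensions, so treat the delta as untyped.
    const delta = chunk.choices[0]?.delta as unknown as Record<string, unknown> | undefined
    if (!delta) continue
    if ("reasoning_content" in delta) console.log("reasoning_content:", delta["reasoning_content"])
    if ("reasoning" in delta) console.log("reasoning:", delta["reasoning"])
  }
}

inspectReasoningFields().catch(console.error)
```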
Expected result
The "Thinking" section should display the reasoning content from the reasoning field, just as it does with the reasoning_content field.
Actual result
The "Thinking" section does not appear at all. The reasoning content is silently ignored because OpenAiHandler only checks for reasoning_content field, not reasoning. This affects both streaming and non-streaming responses.
Variations tried (optional)
No response
App Version
v3.47.3
API Provider (optional)
OpenAI Compatible
Model Used (optional)
glm 4.7
Roo Code Task Links (optional)
No response