Fix OpenAI reasoning item error in Responses API #59
Problem
The OpenAI Responses API was intermittently failing with an error.
This occurred when using reasoning models (gpt-5, o3, o4-mini) with tool calls in multi-turn conversations.
Root Cause
From the Vercel AI SDK documentation:
Even though we're using the default `store: true` and `previousResponseId` for persistence, we still need to explicitly include reasoning encrypted content when tool calls are involved. The reasoning items have IDs (like `rs_*`) that must be properly linked to their following items.
Solution
Added `include: ['reasoning.encrypted_content']` to the OpenAI provider options when `reasoningEffort` is configured (meaning reasoning is enabled). This ensures reasoning context is preserved across multi-turn conversations with tool calls.
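A minimal sketch of the resulting builder (the `ModelConfig` shape and the function signature here are assumptions for illustration, not the repository's exact code):

```typescript
// Hypothetical config shape; the real one in
// src/utils/ai/providerOptions.ts may differ.
interface ModelConfig {
  reasoningEffort?: 'low' | 'medium' | 'high';
}

function buildProviderOptions(config: ModelConfig) {
  const openai: Record<string, unknown> = {};
  if (config.reasoningEffort) {
    openai.reasoningEffort = config.reasoningEffort;
    // Ask the API to return encrypted reasoning content so rs_* items
    // can be linked to their following items on the next turn of a
    // tool-calling conversation.
    openai.include = ['reasoning.encrypted_content'];
  }
  return { openai };
}
```

Gating `include` on `reasoningEffort` keeps the extra payload out of requests to non-reasoning models.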
Changes
Updated `buildProviderOptions()` in `src/utils/ai/providerOptions.ts` to add `include` when reasoning is actually enabled.
Testing
Not adding automated tests as the error was intermittent and difficult to reliably reproduce. The fix is based directly on OpenAI/Vercel SDK documentation requirements.