Missing previous_response_id support breaks multi-turn conversations with Azure OpenAI Responses API #3841

@TyroneXie

Description

What version of Codex is running?

OpenAI Codex v0.38.0 (research preview)

Which model were you using?

gpt-5-2025-08-07 via Azure OpenAI

What platform is your computer?

Linux n37-127-010 5.4.0-150-generic #167-Ubuntu SMP Mon May 15 17:35:05 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

What steps can reproduce the bug?

Configuration:

codex \
  -c model_provider=azure \
  -c model="gpt-5-2025-08-07" \
  -c model_providers.azure.name="Azure" \
  -c model_providers.azure.base_url="http://your-azure-openai-endpoint.com/path" \
  -c model_providers.azure.wire_api="responses" \
  -c 'model_providers.azure.query_params={ak="your-api-key"}' \
  -c model_reasoning_effort="minimal" \
  -c model_reasoning_summary="detailed" \
  exec "analyze this repository structure"

Steps to reproduce:

  1. Configure Codex CLI to use Azure OpenAI with Responses API (as shown above)
  2. Run any command that would trigger tool execution (like exec "analyze this directory")
  3. The first command executes successfully
  4. The second request in the conversation fails with the error below

Root Cause: Codex CLI's ResponsesApiRequest struct in codex-rs/core/src/client_common.rs is missing the previous_response_id
field required for proper multi-turn conversation management per the OpenAI Responses API specification.

What is the expected behavior?

For multi-turn conversations using the Responses API:

  1. First request: send the full input array with store: true and no previous_response_id
  2. Subsequent requests: send only the new user input plus previous_response_id referencing the last response
  3. Server: automatically retrieves and manages the conversation context based on previous_response_id

This is the standard flow documented for OpenAI's Responses API context management and should work the same way with Azure OpenAI and other compatible implementations; illustrative request payloads follow below.
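
To make the flow concrete, here is a rough sketch of the two request shapes described above. The field names follow the public Responses API parameters; the IDs and prompts are placeholders rather than captured Codex traffic.

First turn (full input, stored server-side, no previous_response_id):

{
  "model": "gpt-5-2025-08-07",
  "store": true,
  "input": [
    { "role": "user", "content": "analyze this repository structure" }
  ]
}

Second turn (only the new user input, chained via previous_response_id):

{
  "model": "gpt-5-2025-08-07",
  "store": true,
  "previous_response_id": "resp_abc123",
  "input": [
    { "role": "user", "content": "now summarize the findings" }
  ]
}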

What do you see instead?

Error on second request in conversation:

stream error: unexpected status 400 Bad Request: {
"error": {
"message": "code: missing_required_parameter; message: Missing required parameter: 'input[2].id'.",
"type": "invalid_request_error",
"param": "input[2].id",
"code": "-4003"
}
}

Sometimes also:

stream error: unexpected status 400 Bad Request: {
"error": {
"message": "code: ; message: The encrypted content for item rs_68cbab0bc380819589107d8ecf87f5f00bc9dfe08376e305 could
not be verified.",
"type": "invalid_request_error",
"code": "-4003"
}
}

Current problematic behavior:

• Single-turn conversations work perfectly
• Multi-turn conversations fail because Codex sends the entire conversation history in each request instead of using
previous_response_id for context reference
• Azure OpenAI correctly rejects the malformed requests

Additional information

Code Analysis: The issue is in codex-rs/core/src/client_common.rs, where the ResponsesApiRequest struct lacks the
previous_response_id field:

// Current implementation - MISSING previous_response_id
#[derive(Debug, Serialize)]
pub(crate) struct ResponsesApiRequest<'a> {
    pub(crate) model: &'a str,
    pub(crate) instructions: &'a str,
    pub(crate) input: &'a Vec<ResponseItem>,  // Always sends full history
    pub(crate) tools: &'a [serde_json::Value],
    // ... other fields ...
    // MISSING: previous_response_id: Option<String>,
}

Expected fix:

#[derive(Debug, Serialize)]
pub(crate) struct ResponsesApiRequest<'a> {
    // ... existing fields ...
    #[serde(skip_serializing_if = "Option::is_none")]
    pub(crate) previous_response_id: Option<String>,  // Add this field
}
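
As a sanity check, here is a minimal stand-alone sketch of that serialization behavior. It uses a simplified stand-in struct rather than the real ResponsesApiRequest (the field set and types are assumptions for illustration) and depends only on serde (with derive) and serde_json:

use serde::Serialize;
use serde_json::json;

// Simplified stand-in for ResponsesApiRequest; illustration only, not the
// actual codex-rs definition.
#[derive(Debug, Serialize)]
struct DemoRequest<'a> {
    model: &'a str,
    input: Vec<serde_json::Value>,
    store: bool,
    #[serde(skip_serializing_if = "Option::is_none")]
    previous_response_id: Option<String>,
}

fn main() {
    // First turn: None, so the "previous_response_id" key is omitted entirely
    // and providers or turns that never set it see no new field.
    let first = DemoRequest {
        model: "gpt-5-2025-08-07",
        input: vec![json!({ "role": "user", "content": "analyze this repository structure" })],
        store: true,
        previous_response_id: None,
    };
    println!("{}", serde_json::to_string_pretty(&first).unwrap());

    // Later turns: Some(id), so the key is serialized and the server can
    // resolve the stored conversation context instead of receiving the
    // full history again.
    let second = DemoRequest {
        model: "gpt-5-2025-08-07",
        input: vec![json!({ "role": "user", "content": "now summarize the findings" })],
        store: true,
        previous_response_id: Some("resp_abc123".to_string()),
    };
    println!("{}", serde_json::to_string_pretty(&second).unwrap());
}

Because of skip_serializing_if, the first-turn payload is byte-for-byte what is sent today, so the field could be added without changing behavior for callers that never set it.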

Impact:

• Prevents using Codex CLI with Azure OpenAI for complex multi-step analysis tasks
• Forces users to either use standard OpenAI API or break complex tasks into separate single-turn calls
• Affects any Responses API implementation that properly validates multi-turn conversation format

Workaround: For now, either keep to single-turn conversations or switch to the standard OpenAI API for multi-turn tasks.

This appears to be a missing feature rather than a regression, as the Responses API multi-turn conversation management was
likely never fully implemented in Codex CLI.

Metadata

Labels: azure, bug
