
Facing frequent stream disconnection issue while using Codex CLI #8865

@D951516

Description


What version of Codex is running?

codex-cli 0.77.0

What subscription do you have?

Using OpenAI models hosted on Azure AI Foundry

Which model were you using?

gpt-5-codex, gpt-5.1-codex

What platform is your computer?

Microsoft Windows NT 10.0.26100.0 x64

What issue are you seeing?

We are repeatedly experiencing streaming failures while executing commands with OpenAI models. Users consistently encounter the following error:

stream disconnected before completion: response.failed event received

This occurs even for simple prompts, and the response stream fails before completion.

We also intermittently receive the following error:

"Item with id 'rs_06b3ecdc79f7f90b006957c907543438197a56565d3b3ff2278' not found."
"type": "invalid_request_error"
"param": "input"

Actions Taken So Far

To mitigate the streaming issue, we have already:

Increased the quota limits

Added stream retry logic/parameters to allow session continuation
→ This prevents the session from dropping immediately, but causes extreme slowness during execution

Even after these changes, the stream continues to disconnect intermittently.
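For reference, the retry tuning mentioned above was done through the Codex CLI config file. This is a sketch of the relevant settings in `~/.codex/config.toml`; the exact key names and defaults may differ between Codex versions, so treat them as assumptions and check against the config documentation for your installed version:

```toml
# ~/.codex/config.toml — sketch, key names assumed from Codex CLI config docs

# How many times Codex retries a dropped response stream before
# surfacing "stream disconnected before completion".
stream_max_retries = 10

# How many times Codex retries the initial HTTP request to the model.
request_max_retries = 8

# How long (ms) an idle stream is allowed to stall before Codex
# treats it as disconnected and retries.
stream_idle_timeout_ms = 300000
```

Raising these values is what keeps the session from dropping immediately, but each retry re-establishes the stream, which accounts for the extreme slowness we observe during execution.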

What steps can reproduce the bug?

It occurs randomly, even for a simple "Hello" prompt.

What is the expected behavior?

No response

Additional information

No response

Labels

azure (Issues related to the Azure-hosted OpenAI models), bug (Something isn't working)
