LLM Responses Not Revealed Unless Switching Between Chats #13133
Description
What version of the Codex App are you using (From “About Codex” dialog)?
26.227.1448 (747)
What subscription do you have?
Pro
What platform is your computer?
Darwin 25.3.0 arm64 arm
What issue are you seeing?
When I start a prompt, Codex does not always reveal the LLM's responses. This often appears as "Thinking" for longer than normal. Eventually it may list its final response when the work is done, but it does not show the steps it took along the way.
If I switch to another chat and come back, the responses that were invisible earlier appear as expected and typically play out in full. Because of this, I'm often forced to switch back and forth between chats to see the LLM's progression through code changes and the LLM's thoughts in the Codex app. Occasionally, LLM-invoked builds do not start until I switch back and forth between chats. (Upon returning to the relevant chat, a build starts with a 1s timer, as if it had just begun.)
What steps can reproduce the bug?
I have yet to find reliable steps to reproduce this bug. I cannot tell whether it started after updating to 26.227.1448 (747), after I added an MCP server via Terminal, or after a very long chat to create an extensive feature in the app (Session# 019c9f22-1ee2-7042-adac-19ce34920e57).
I first noticed it towards the end of my long chat (within the last 3-5 responses), but the bug has since spread to all my other chats. I tried archiving the chat to see if that would resolve the issue, but to no avail.
What is the expected behavior?
The expected behavior is to continuously see the LLM's responses as it works through the project files and adjusts code. I should not have to leave the chat and come back for them to be revealed each time.
Additional information
No response