When using `OpenAIConversationsSession` with `gpt-5.4` or `gpt-5.5`, prior assistant messages get unlinked from the conversation as new turns are added. After N turns of `[user → assistant]`, only the most recent assistant message survives — the conversation ends up shaped like `[u, u, u, u, a]` instead of `[u, a, u, a, u, a, u, a]`.

The same code works correctly on `gpt-4.1` and `gpt-5.2`.

The Response objects still exist in the platform dashboard under Logs → Responses (verified visually). They're just no longer returned by `conversations.items.list`. So this is server-side unlinking, not deletion — but from the SDK user's perspective, `session.get_items()` silently drops them, which breaks the documented Sessions contract: "Sessions stores conversation history for a specific session, allowing agents to maintain context without requiring explicit manual memory management."
```
Turn 1  before: (empty)             after: [u, a]
Turn 2  before: [u, a]              after: [u, a, u, a]
Turn 3  before: [u, a, u, a]        after: [u, a, u, a, u, a]
Turn 4  before: [u, a, u, a, u, a]  after: [u, a, u, a, u, a, u, a]

Assistant items surviving: 4 / 4
```
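The `[u, a]` shape strings in this dump can be derived mechanically from an item listing. A small sketch for reproducing the comparison (the `summarize` helper and the plain role dicts are illustrative, not SDK types):

```python
def summarize(items):
    """Collapse conversation items into the compact [u, a, ...] shape."""
    short = {"user": "u", "assistant": "a"}
    return [short.get(item["role"], "?") for item in items]

def surviving_assistants(items):
    """Count assistant items still returned by the listing."""
    return sum(1 for item in items if item["role"] == "assistant")

# Healthy 4-turn conversation (what gpt-4.1 / gpt-5.2 return):
healthy = [{"role": r} for _ in range(4) for r in ("user", "assistant")]
# Unlinked shape reported on gpt-5.4 / gpt-5.5:
buggy = [{"role": "user"}] * 4 + [{"role": "assistant"}]

print(summarize(healthy))  # ['u', 'a', 'u', 'a', 'u', 'a', 'u', 'a']
print(summarize(buggy))    # ['u', 'u', 'u', 'u', 'a']
```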
Expected behavior
`OpenAIConversationsSession.get_items()` should return all items added by prior `add_items()` calls within the same session, regardless of model. This is what the SDK promises in the Sessions guide and what `gpt-4.1`/`gpt-5.2` deliver in practice.
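That contract boils down to: `get_items()` returns, in order, everything `add_items()` stored. A toy in-memory session that satisfies it, as a sketch of the documented Session protocol (not SDK source; the real `OpenAIConversationsSession` delegates to the Conversations API rather than a local list):

```python
import asyncio

class InMemorySession:
    """Toy session honoring the documented contract."""

    def __init__(self, session_id):
        self.session_id = session_id
        self._items = []

    async def add_items(self, items):
        self._items.extend(items)

    async def get_items(self, limit=None):
        # Everything ever added, oldest first; `limit` trims to the newest N.
        items = list(self._items)
        return items if limit is None else items[-limit:]

    async def pop_item(self):
        return self._items.pop() if self._items else None

    async def clear_session(self):
        self._items.clear()

async def demo():
    session = InMemorySession("demo")
    for n in range(4):
        await session.add_items([
            {"role": "user", "content": f"q{n}"},
            {"role": "assistant", "content": f"a{n}"},
        ])
    return await session.get_items()

items = asyncio.run(demo())
print(len(items))  # 8 -- no model-dependent drops possible here
```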
- `OpenAIConversationsSession.add_items()` is a thin wrapper over `conversations.items.create`. Items are persisted correctly — confirmed by the after-turn dump returning the assistant message immediately after the turn.
- `OpenAIConversationsSession.get_items()` is a thin wrapper over `conversations.items.list`. It returns whatever the API returns.
- Between turns the SDK does not call `items.delete` or any destructive operation. The unlinking happens server-side, and `get_items()` faithfully reflects the new state.
- The behavior is 100% model-dependent on the same SDK + openai client + transport.
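The pass-through point can be made concrete with a stub in place of the real client. Everything below is hypothetical illustration: `StubItemsAPI` fakes the server's post-unlinking view, and `get_items` mirrors the thin-wrapper behavior described above without being SDK source:

```python
class StubItemsAPI:
    """Fakes conversations.items after server-side unlinking has happened."""

    def __init__(self, linked_items):
        self._linked = linked_items

    def list(self, conversation_id):
        # The server alone decides what is still linked.
        return list(self._linked)

def get_items(items_api, conversation_id):
    # A faithful thin wrapper: no caching, no filtering, no repair.
    return items_api.list(conversation_id)

# Simulate the 5.4/5.5 state: only the newest assistant message linked.
server_view = (
    [{"role": "user", "content": f"q{n}"} for n in range(4)]
    + [{"role": "assistant", "content": "a3"}]
)
items = get_items(StubItemsAPI(server_view), "conv_stub")
print([i["role"][0] for i in items])  # ['u', 'u', 'u', 'u', 'a']
```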
So this is structurally upstream of the SDK — the Conversations API on `5.4`/`5.5` is dropping prior assistant linkages. But from the user's side, the documented `OpenAIConversationsSession` class is the surface that breaks, and right now there's no documented workaround within the Sessions abstraction (sessions can't be combined with `conversation_id` / `previous_response_id` / `auto_previous_response_id` per the docs).
Any guidance for users currently on `OpenAIConversationsSession` who need to keep using `5.4`/`5.5`? Pinning to `5.2` works as a stopgap but isn't viable long-term.
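One stopgap within the Sessions abstraction, offered strictly as a hypothetical sketch: mirror every item locally and serve `get_items()` from the mirror, treating the server-side conversation as best-effort. `MirroredSession` and `LossyInner` are illustrative names, not SDK classes; the wrapper assumes the inner session exposes the async `add_items`/`get_items`/`clear_session` protocol:

```python
import asyncio

class MirroredSession:
    """Wraps a session, keeping a local mirror of every item added so
    get_items() survives server-side unlinking. Hypothetical sketch."""

    def __init__(self, inner):
        self.inner = inner
        self.session_id = getattr(inner, "session_id", "mirrored")
        self._mirror = []

    async def add_items(self, items):
        self._mirror.extend(items)
        try:
            await self.inner.add_items(items)  # best-effort server copy
        except Exception:
            pass  # the local mirror stays authoritative

    async def get_items(self, limit=None):
        items = list(self._mirror)
        return items if limit is None else items[-limit:]

    async def pop_item(self):
        return self._mirror.pop() if self._mirror else None

    async def clear_session(self):
        self._mirror.clear()
        await self.inner.clear_session()

# Demo against a lossy inner session that mimics the reported behavior
# by unlinking prior assistant items on every new turn:
class LossyInner:
    def __init__(self):
        self.items, self.session_id = [], "conv_demo"

    async def add_items(self, items):
        self.items = [i for i in self.items if i["role"] != "assistant"]
        self.items.extend(items)

    async def clear_session(self):
        self.items.clear()

async def demo():
    session = MirroredSession(LossyInner())
    for n in range(4):
        await session.add_items([
            {"role": "user", "content": f"q{n}"},
            {"role": "assistant", "content": f"a{n}"},
        ])
    return await session.get_items(), session.inner.items

mirrored, server = asyncio.run(demo())
print(len(mirrored), len(server))  # 8 5 -- mirror intact, server unlinked
```

The trade-off: history lives in the process, so cross-process resumption needs the mirror persisted somewhere (a local `SQLiteSession` could serve as the mirror store), and the server-side conversation can no longer be treated as the source of truth.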
Actual output

- `gpt-5.4` — conv `conv_69f98929db2c819682c44bcf7033a55c03376a9838e32d28`
- `gpt-5.5` — conv `conv_69f989ac3290819498ee8ae0db928cad022f412c8f450f99`
- `gpt-5.2` (control) — conv `conv_69f989e61e148197b1c114069be28d8602c54a0e013c1129`
I've also filed a parallel report on the platform forum since the fix likely needs to happen on the Conversations API side: https://community.openai.com/t/openaiconversationssession-loses-prior-assistant-items-on-gpt-5-4-gpt-5-5/1380324
Debug information

- Agents SDK version: `0.15.1`
- `openai` client version: `2.34.0`
- Python version: `3.12`
- Models affected: `gpt-5.4`, `gpt-5.5`
- Models unaffected: `gpt-4.1`, `gpt-5.2`