Description
OpenAI Codex subscription streams can return transient overload errors in this shape:
{"type":"error","sequence_number":2,"error":{"type":"service_unavailable_error","code":"server_is_overloaded","message":"Our servers are currently overloaded. Please try again later.","param":null}}
OpenCode should treat this as a retryable provider overload. Today this shape can escape retry classification because the overload code and type are nested under the error object rather than exposed at the top level.
Related broader reports: #16214 and #21893.
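A minimal sketch of the kind of classification check that would catch this shape. The type and function names here are illustrative, not OpenCode's actual internals; the retryable code/type values are taken from the sample payload above.

```typescript
// Shape of the streamed error event shown in the report.
type StreamErrorEvent = {
  type: string;
  sequence_number?: number;
  error?: {
    type?: string;
    code?: string;
    message?: string;
    param?: unknown;
  };
};

// Values seen in the overload payload; extend as other retryable codes surface.
const RETRYABLE_CODES = new Set(["server_is_overloaded"]);
const RETRYABLE_TYPES = new Set(["service_unavailable_error"]);

// Hypothetical classifier: inspects the NESTED error object, not just
// top-level fields, so the overload shape above enters the retry path.
function isRetryableOverload(event: StreamErrorEvent): boolean {
  if (event.type !== "error" || !event.error) return false;
  return (
    RETRYABLE_CODES.has(event.error.code ?? "") ||
    RETRYABLE_TYPES.has(event.error.type ?? "")
  );
}
```

Feeding the sample payload from the report through this check returns true, while ordinary stream events (or errors with other codes) return false.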
Plugins
None
OpenCode version
Latest dev
Steps to reproduce
- Use an OpenAI Codex subscription model.
- Hit a transient streamed overload response with error.code: server_is_overloaded.
- Observe that the error can terminate instead of entering the existing retry path.
Screenshot and/or share link
No response
Operating System
No response
Terminal
No response