Describe the bug
I get a server error when using the Codex 5.3 model, tested with GitHub Copilot CLI v0.0.420. Other models work fine.
Affected version
GitHub Copilot CLI 0.0.420.
Steps to reproduce the behavior
- Start GitHub Copilot CLI
- Select model Codex 5.3
- Enter any prompt
Expected behavior
The model should write a response. Instead, I get "server error".
Additional context
Linux x86