What version of Codex CLI is running?
0.99
What subscription do you have?
None
Which model were you using?
gpt-5-mini in Azure Open AI
What platform is your computer?
Linux
What terminal emulator and version are you using (if applicable)?
No response
What issue are you seeing?
The Codex CLI appears to default to the gpt-5 model instead of the gpt-5-mini model specified in the config.toml below.
sandbox_mode = "workspace-write"
approval_policy = "on-failure"
model = "gpt-5-mini" # Replace with your actual model deployment name
model_provider = "azure"
model_reasoning_effort = "medium"
[model_providers.azure]
name = "Azure OpenAI"
base_url = "http://<custom Azure URL>"
env_key = "OPENAI_API_KEY"
wire_api = "responses"
■ unexpected status 401 Unauthorized: key not allowed to access model. This key
can only access models=['gpt-5-mini']. Tried to access gpt-5, url: https:///responses
What steps can reproduce the bug?
npm i -g @openai/codex@0.99
Set up config.toml to use a provider other than OpenAI (e.g. Azure, as above), start codex, and ask it a question.
What is the expected behavior?
No response
Additional information
No response