
Codex CLI 0.99 uses the default model instead of the one specified in config.toml #11592

@sameetn

Description


What version of Codex CLI is running?

0.99

What subscription do you have?

None

Which model were you using?

gpt-5-mini in Azure OpenAI

What platform is your computer?

Linux

What terminal emulator and version are you using (if applicable)?

No response

What issue are you seeing?

Looks like the Codex CLI is defaulting to the gpt-5 model instead of gpt-5-mini as specified in the config.toml below:

```toml
sandbox_mode = "workspace-write"
approval_policy = "on-failure"

model = "gpt-5-mini"  # Replace with your actual model deployment name
model_provider = "azure"
model_reasoning_effort = "medium"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "http://<custom Azure URL>"
env_key = "OPENAI_API_KEY"
wire_api = "responses"
```

```
■ unexpected status 401 Unauthorized: key not allowed to access model. This key
can only access models=['gpt-5-mini']. Tried to access gpt-5, url: https:///responses
```
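The `Tried to access gpt-5` in the 401 above means the request body named `gpt-5` instead of the configured deployment. As a hedged sketch (the `/responses` path and payload shape are assumptions for illustration, and the base URL stays a placeholder), a request that honored the config would be built roughly like this:

```python
# Hypothetical sketch of the request Codex should construct when the
# config is honored; nothing is actually sent over the network.
import json

configured_model = "gpt-5-mini"          # model from config.toml
base_url = "http://<custom Azure URL>"   # placeholder, as in the report

url = f"{base_url}/responses"            # wire_api = "responses"
payload = {"model": configured_model, "input": "hello"}

print(url)                 # base_url + /responses
print(json.dumps(payload)) # body naming the configured model
```

The report shows the opposite: the wire payload carried `gpt-5`, so the `model` value from config.toml was dropped somewhere before the request was built.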

What steps can reproduce the bug?

```shell
npm i -g @openai/codex@0.99
```

Set up config.toml to use a provider other than OpenAI, then start Codex and ask it a question.

What is the expected behavior?

No response

Additional information

No response

Metadata


Assignees

No one assigned

    Labels

    CLI (Issues related to the Codex CLI), azure (Issues related to the Azure-hosted OpenAI models), bug (Something isn't working)
