PR #20098 breaks project configs for model_providers #22222

@Austinhs

Description

What version of Codex CLI is running?

codex-cli 0.130.0

What subscription do you have?

API & Pro

Which model were you using?

5.4

What platform is your computer?

Linux 6.6.114.1-microsoft-standard-WSL2 x86_64 x86_64 (Using WSL)

What terminal emulator and version are you using (if applicable)?

Windows Terminal

What issue are you seeing?

⚠ Ignored unsupported project-local config keys in /home/.../.../.../{repo}/.codex/config.toml: model_provider, model_providers. If you want these settings to apply, manually set them in your user-level config.toml.

Caused by PR: #20098

What steps can reproduce the bug?

Run codex within a repo with local .codex/config.toml setup.
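For reference, a minimal project-local config of the kind that now triggers the warning might look like this. This is an illustrative sketch: the provider id `litellm`, the base URL, and the env var name are placeholder values, not taken from the report.

```toml
# {repo}/.codex/config.toml — project-local config (illustrative sketch)
# Select a custom provider instead of the default OpenAI route.
model_provider = "litellm"

# Placeholder provider definition; point base_url at your LiteLLM proxy.
[model_providers.litellm]
name = "LiteLLM"
base_url = "http://localhost:4000"  # example proxy endpoint
env_key = "LITELLM_API_KEY"         # env var holding the API key
```

After #20098, the `model_provider` and `model_providers` keys in this file are ignored with the warning shown above.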

What is the expected behavior?

The repo-specific LiteLLM model should be used. Instead, Codex emits the warning above and falls back to the default OpenAI route.

Additional information

This affects many local repo configurations. Provider settings should remain available as project configuration, especially for people who work at multiple companies and maintain separate configurations per repo. At a bare minimum, we should be able to specify which provider to use in the project; otherwise this configuration is useless and should be removed from the documentation.
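Until the regression is fixed, the only workaround is the one the warning itself suggests: move the provider keys into the user-level config. A sketch with placeholder values, assuming a single LiteLLM proxy:

```toml
# ~/.codex/config.toml — user-level config (workaround sketch; values are placeholders)
model_provider = "litellm"

[model_providers.litellm]
name = "LiteLLM"
base_url = "http://localhost:4000"
env_key = "LITELLM_API_KEY"
```

This defeats the purpose of per-project configuration, since the same provider then applies to every repo on the machine.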

Metadata


    Labels

    - bug: Something isn't working
    - config: Issues involving config.toml, config keys, config merging, or config updates
    - custom-model: Issues related to custom model providers (including local models)
