find_model_by_namespaced_suffix rejects hyphens in namespace, breaking custom provider slugs like llm-gateway/gpt-5.4 #14276

@kapral18

Description

What version of Codex CLI is running?

0.113.0

What subscription do you have?

Plus

Which model were you using?

llm-gateway/gpt-5.4 (via custom model_providers.litellm)

What platform is your computer?

Darwin 25.3.0 arm64 arm

What terminal emulator and version are you using (if applicable)?

Ghostty 1.6.2 + tmux 3.5a

What issue are you seeing?

find_model_by_namespaced_suffix in codex-rs/core/src/models_manager/manager.rs rejects any namespace containing a hyphen (-). Model slugs like llm-gateway/gpt-5.4 cannot resolve to the bundled gpt-5.4 metadata, even though the suffix matches a known model.

The check at L195-L199:

if !namespace
    .chars()
    .all(|c| c.is_ascii_alphanumeric() || c == '_')
{
    return None;
}

This only allows [a-zA-Z0-9_] in the namespace. Hyphens are common in provider/proxy prefixes (e.g. llm-gateway/, my-proxy/).
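A standalone sketch of the rejected check (the character test is quoted from the issue; the wrapping function name `namespace_is_valid` is illustrative, not the actual manager.rs code) makes the failure mode concrete:

```rust
// Hypothetical extraction of the namespace character check from
// find_model_by_namespaced_suffix. Only the predicate body is taken
// from the report; the function wrapper is a sketch for illustration.
fn namespace_is_valid(namespace: &str) -> bool {
    namespace
        .chars()
        .all(|c| c.is_ascii_alphanumeric() || c == '_')
}

fn main() {
    // Underscore namespaces pass the check...
    assert!(namespace_is_valid("custom_provider"));
    // ...but any hyphen causes the whole lookup to bail out early.
    assert!(!namespace_is_valid("llm-gateway"));
}
```

So the suffix lookup is never reached for any slug whose provider prefix contains a hyphen.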

Codex falls back to degraded metadata and emits:

⚠ Model metadata for llm-gateway/gpt-5.4 not found. Defaulting to fallback metadata; this can degrade performance and cause issues.

The only workaround is providing a full model_catalog_json that clones the entire bundled catalog (~230 KB) just to add one alias entry.

What steps can reproduce the bug?

  1. Configure a custom model provider in config.toml:

model = "llm-gateway/gpt-5.4"
model_provider = "litellm"

[model_providers.litellm]
name = "LiteLLM"
base_url = "https://your-litellm-proxy/v1"
env_key = "LITELLM_PROXY_KEY"
wire_api = "responses"

  2. Start Codex.
  3. Observe the fallback metadata warning.

The namespace llm-gateway contains a hyphen which fails the is_ascii_alphanumeric() || c == '_' check in find_model_by_namespaced_suffix, so the suffix gpt-5.4 is never looked up in the bundled catalog.

What is the expected behavior?

llm-gateway/gpt-5.4 should resolve to the bundled gpt-5.4 model metadata via namespace stripping, the same way custom_provider/gpt-5.4 (underscore namespace) already works.

Suggested fix: allow - in the namespace character set:

.all(|c| c.is_ascii_alphanumeric() || c == '_' || c == '-')
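A sketch of the fix in context (the `strip_known_namespace` helper and its use of `split_once` are assumptions for illustration; the real function in manager.rs may split differently):

```rust
// Sketch of the proposed behavior: with '-' added to the allowed
// namespace characters, "llm-gateway/gpt-5.4" strips down to the
// bundled-catalog suffix "gpt-5.4". Illustrative only.
fn strip_known_namespace(slug: &str) -> Option<&str> {
    let (namespace, suffix) = slug.split_once('/')?;
    if !namespace
        .chars()
        .all(|c| c.is_ascii_alphanumeric() || c == '_' || c == '-')
    {
        return None;
    }
    Some(suffix)
}

fn main() {
    // Hyphenated provider prefixes now resolve like underscore ones.
    assert_eq!(strip_known_namespace("llm-gateway/gpt-5.4"), Some("gpt-5.4"));
    assert_eq!(strip_known_namespace("custom_provider/gpt-5.4"), Some("gpt-5.4"));
    // Other punctuation in the namespace is still rejected.
    assert_eq!(strip_known_namespace("bad~ns/gpt-5.4"), None);
}
```

This keeps the existing guard against arbitrary punctuation while admitting the common hyphenated proxy convention.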

Additional information

  • Related: Support for overriding Model metada #12380 (overriding model metadata) — different ask but same pain point area.
  • The llm-gateway/ prefix is a common LiteLLM proxy convention and cannot be changed on the proxy side.
  • Current workaround: provide a full model_catalog_json that clones the entire bundled catalog (~230 KB, 12 models) just to add one alias entry — fragile and depends on models_cache.json (a runtime artifact).

Metadata

Assignees

No one assigned

    Labels

    agent — Issues related to the core agent loop
    bug — Something isn't working
    custom-model — Issues related to custom model providers (including local models)
