## Summary

Add native GPT-5.1 model awareness and `reasoningEffort: "none"` support to opencode core, so providers like the Codex OAuth plugin can match the latest Codex CLI behavior without fighting the framework.
## Background

In `openai/codex` (the official Codex CLI), GPT-5.1 is now the primary family, with:

- `gpt-5.1` general models supporting a new `reasoningEffort: "none"` mode.
- `gpt-5.1-codex` and `gpt-5.1-codex-mini` for code-heavy workloads.
- Model presets wired into `model_presets.rs` with specific reasoning defaults.

In opencode today (based on `gh search code`):

- There are no direct references to GPT-5.1 model IDs or `reasoningEffort: "none"`.
- Providers can approximate GPT-5.1 behavior via their own config, but the core schema and model picker aren't aware of these new variants.

In the Codex OAuth plugin (`@promethean-os/opencode-openai-codex-auth`):

- We've already added GPT-5.1 presets (e.g. `gpt-5.1-codex-low`, `gpt-5.1-none`) and adjusted plugin-side normalization and reasoning heuristics to match the Codex CLI.
- However, these show up as "just more models" from opencode's perspective; the core doesn't know they are the new default family.
## Proposed Changes

### Extend the model configuration schema

- Allow `reasoningEffort` to include `"none"` in addition to `"minimal" | "low" | "medium" | "high"`.
- Ensure this is reflected consistently in:
  - the TypeScript/JSON schema used by opencode config;
  - any validation logic around model options in core.
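As a rough sketch of the schema change (type and helper names here are illustrative, not opencode's actual internals), the effort union would simply gain a `"none"` member, with the runtime validation widened to match:

```typescript
// Illustrative sketch only: opencode's real schema types live elsewhere and
// may use a different shape (e.g. JSON Schema or a validation library).
type ReasoningEffort = "none" | "minimal" | "low" | "medium" | "high";

interface ModelOptions {
  reasoningEffort?: ReasoningEffort;
}

const REASONING_EFFORTS: readonly string[] = ["none", "minimal", "low", "medium", "high"];

// Runtime guard mirroring the widened union, for config-validation paths.
function isReasoningEffort(value: unknown): value is ReasoningEffort {
  return typeof value === "string" && REASONING_EFFORTS.includes(value);
}
```

Keeping the literal union and the runtime list in one place avoids the type and the validator drifting apart when the next effort level is added.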
### Add GPT-5.1 model awareness to core

- Introduce GPT-5.1 model IDs at the provider level (for `provider.openai`): `gpt-5.1`, `gpt-5.1-codex`, `gpt-5.1-codex-mini`.
- Mark GPT-5.1 as the preferred family for new configs, while keeping GPT-5 as a legacy/compatibility option.
- Ensure the model picker / TUI shows GPT-5.1 variants with friendly names and correct reasoning defaults.
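A registry along these lines could back both the picker and the defaults (the field names are hypothetical, and `"medium"` for the Codex variants is an assumed placeholder, not a confirmed Codex CLI default):

```typescript
// Hypothetical shape for provider-level model metadata; opencode's actual
// registry format will differ.
interface OpenAIModelEntry {
  id: string;
  displayName: string;
  defaultReasoningEffort: "none" | "minimal" | "low" | "medium" | "high";
  legacy?: boolean;
}

const OPENAI_MODELS: OpenAIModelEntry[] = [
  { id: "gpt-5.1", displayName: "GPT-5.1", defaultReasoningEffort: "none" },
  { id: "gpt-5.1-codex", displayName: "GPT-5.1 Codex", defaultReasoningEffort: "medium" },
  { id: "gpt-5.1-codex-mini", displayName: "GPT-5.1 Codex Mini", defaultReasoningEffort: "medium" },
  { id: "gpt-5", displayName: "GPT-5 (legacy)", defaultReasoningEffort: "medium", legacy: true },
];

function findModel(id: string): OpenAIModelEntry | undefined {
  return OPENAI_MODELS.find((m) => m.id === id);
}
```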
### Align reasoning defaults with the Codex CLI

- For GPT-5.1 general models, default to `reasoningEffort: "none"` (the new no-reasoning mode) when not overridden.
- For GPT-5.1 Codex / Codex Mini, keep effort defaults aligned with the Codex CLI presets (e.g. low/medium/high only, with no `none` where it is unsupported).
- For legacy GPT-5, maintain current opencode defaults to avoid breaking existing configs, or provide a migration path.
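One way the defaulting and clamping rules above could compose (the fallback values and the clamp target are assumptions mirroring the plugin-side heuristics mentioned earlier, not confirmed Codex CLI behavior):

```typescript
// Models assumed to accept "none"; the Codex variants are deliberately excluded.
const SUPPORTS_NONE = new Set(["gpt-5.1"]);

// Assumed fallback defaults: "none" for general GPT-5.1, "medium" otherwise.
function defaultEffort(modelId: string): string {
  return SUPPORTS_NONE.has(modelId) ? "none" : "medium";
}

// Resolve a requested effort, clamping an unsupported "none" up to "low".
// (Clamping vs. rejecting is itself a design choice to settle in core.)
function resolveReasoningEffort(modelId: string, requested?: string): string {
  const effort = requested ?? defaultEffort(modelId);
  if (effort === "none" && !SUPPORTS_NONE.has(modelId)) {
    return "low";
  }
  return effort;
}
```

Centralizing this in core means every provider gets the same resolution order (explicit config, then model default, then clamp) instead of each plugin re-implementing it.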
### Update configuration helpers and docs

- Update any core docs/config generators that currently mention GPT-5-only presets to include GPT-5.1 examples.
- Add guidance for plugin authors on when to use GPT-5.1 vs GPT-5, and on how `reasoningEffort: "none"` should behave.
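A docs example might look roughly like this (the key names are illustrative; the exact nesting depends on opencode's real config schema):

```jsonc
// Illustrative opencode config snippet, not a confirmed schema.
{
  "provider": {
    "openai": {
      "models": {
        "gpt-5.1": {
          "options": { "reasoningEffort": "none" }
        }
      }
    }
  }
}
```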
## Why This Belongs in Core (not just plugins)

Plugins can normalize model names and clamp reasoning values, but:

- The opencode model picker, config wizards, and validation logic remain unaware of GPT-5.1.
- Users configuring GPT-5.1 manually get no guidance or type safety from opencode itself.

Making GPT-5.1 first-class in core ensures:

- Consistent UX across all OpenAI-based providers (not just the Codex plugin).
- An easier future migration if OpenAI deprecates parts of the GPT-5 family.
## Acceptance Criteria

- Core config/schema supports `reasoningEffort: "none"` for the appropriate models.
- GPT-5.1 models appear in the opencode model picker with correct names and defaults.
- Providers like the Codex OAuth plugin can reference GPT-5.1 model IDs without triggering validation errors or confusing UX.
- Existing GPT-5-only configurations continue to work unchanged.