Description
opencode uses hard-coded context limit values for gpt-5.5, ignoring the values set in opencode.jsonc.
I suspect it's due to PR #24212, which added this to codex.ts:
```ts
// gpt-5.5 models temporarily have restricted context window size for codex plans
if (model.id.includes("gpt-5.5")) {
  model.limit = {
    context: 400_000,
    //@ts-expect-error incorrect type for v1 sdk but works
    input: 272_000,
    output: 128_000,
  }
}
```
It should probably either apply these values only when no limits are specified in the config, or use `min(400_000, model.limit.context)` (and likewise for `input`/`output`) so that users can still set lower values (which is useful because gpt-5.5 is very expensive).
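A minimal sketch of the cap-instead-of-overwrite idea. The `limit` field names are taken from the snippet above; the helper itself is hypothetical and not opencode's actual code:

```typescript
// Hypothetical helper: cap the codex defaults at the user-configured limits
// instead of overwriting them. Field names follow the codex.ts snippet above.
interface ModelLimit {
  context: number
  input?: number
  output?: number
}

function applyCodexCap(limit: ModelLimit | undefined): ModelLimit {
  const cap = { context: 400_000, input: 272_000, output: 128_000 }
  // Nothing configured: fall back to the codex-plan defaults.
  if (!limit) return { ...cap }
  return {
    // Keep the smaller of the configured value and the codex cap, so a user
    // who sets e.g. 200k (to control cost) is not silently bumped to 400k.
    context: Math.min(limit.context ?? cap.context, cap.context),
    input: Math.min(limit.input ?? cap.input, cap.input),
    output: Math.min(limit.output ?? cap.output, cap.output),
  }
}
```

With this shape, a configured 200k context stays at 200k, while models with no override still get the 400k/272k/128k defaults.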
Plugins
None
OpenCode version
1.14.28
Steps to reproduce
- Add overrides for gpt-5.4 and gpt-5.5 (e.g. a 200k context limit) to `~/.config/opencode/opencode.jsonc` under `provider.openai.models`:
- Run `opencode models openai --verbose` and check the context limits for both models. gpt-5.4 will respect the overrides; gpt-5.5 will not. Alternatively, start a conversation with gpt-5.5: you'll see that the context window stats are relative to a 400k context window, not the configured 200k.
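The original report's config snippet was not captured above. A hypothetical fragment consistent with the 200k limit described in these steps might look like the following (the exact schema under `provider.openai.models` is assumed):

```jsonc
// Hypothetical opencode.jsonc fragment; key layout assumed from the
// provider.openai.models path, 200000 matches the 200k limit described above.
{
  "provider": {
    "openai": {
      "models": {
        "gpt-5.4": { "limit": { "context": 200000 } },
        "gpt-5.5": { "limit": { "context": 200000 } }
      }
    }
  }
}
```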
Screenshot and/or share link
No response
Operating System
No response
Terminal
No response