Closed as duplicate of #11086
Description
The GPT-5.2 context limit for the GitHub Copilot provider seems to be too low.
https://platform.openai.com/docs/models/gpt-5.2
400,000 context window
128,000 max output tokens
However, in opencode (default build mode), with roughly 90k tokens in the context, GPT-5.2 already shows as 70% full, which suggests the context window is capped at 128k.
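For reference, 90,000 / 128,000 ≈ 70%, which matches the reported usage; against a 272,000-token window the same conversation would only be about 33% full, and against the full 400,000-token window about 22%.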
In contrast, the OpenAI provider's GPT-5.2 definition is correct:
"models": {
"gpt-5.2": {
"name": "GPT 5.2 (OAuth)",
"limit": {
"context": 272000,
"output": 128000
},
"modalities": {
"input": [
"text",
"image"
],
"output": [
"text"
]
},
"variants": {
"none": {
"reasoningEffort": "none",
"reasoningSummary": "auto",
"textVerbosity": "medium"
},
"low": {
"reasoningEffort": "low",
"reasoningSummary": "auto",
"textVerbosity": "medium"
},
"medium": {
"reasoningEffort": "medium",
"reasoningSummary": "auto",
"textVerbosity": "medium"
},
"high": {
"reasoningEffort": "high",
"reasoningSummary": "detailed",
"textVerbosity": "medium"
},
"xhigh": {
"reasoningEffort": "xhigh",
"reasoningSummary": "detailed",
"textVerbosity": "medium"
}
}
},
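As a possible temporary workaround, a per-model override in opencode.json following the same shape as the definition above might raise the limit locally. This is a minimal sketch only: the provider key "github-copilot" and the exact limit values are assumptions and have not been verified against the opencode config schema.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.2": {
          "limit": {
            "context": 400000,
            "output": 128000
          }
        }
      }
    }
  }
}

The proper fix would still be updating the GitHub Copilot provider definition itself so the limit matches the documented 400,000-token context window.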
Plugins
No response
OpenCode version
v1.1.51
Steps to reproduce
No response
Screenshot and/or share link
Operating System
No response
Terminal
No response