ci(opencode): register openai/gpt-5.5 under cloudflare-ai-gateway#899
Conversation
Updates the bonk workflow to use cloudflare-ai-gateway/openai/gpt-5.5 with reasoning effort variant xhigh, replacing the previous claude-opus-4-7 model. https://claude.ai/code/session_012n6RFuqfQGQY7AaayqLcaG
- bigbonk workflow: cloudflare-ai-gateway/openai/gpt-5.5 with variant xhigh (was claude-opus-4-7 + max)
- viguy and reviewer OpenCode agent definitions: openai/gpt-5.5 (was anthropic/claude-opus-4-7)
- CONTRIBUTING.md: update recommended setup and BigBonk description to GPT-5.5 xhigh

The Next.js tracker workflow and agent are intentionally left on Claude Opus 4.7.

https://claude.ai/code/session_012n6RFuqfQGQY7AaayqLcaG
ci(bonk): switch bonk + bigbonk to GPT-5.5 xhigh
bonk and bigbonk currently throw ProviderModelNotFoundError when invoked because opencode's bundled models snapshot predates GPT-5.5 (released 2026-04-23) and the cloudflare-ai-gateway entry on models.dev does not yet list it. opencode resolves provider.models[modelID] before any network call, so the lookup miss fires synchronously and the workflow exits non-zero before reaching the gateway.

opencode merges user-supplied provider.<id>.models into the resolved registry dict, so registering the entry locally unblocks workflow invocations without waiting on either an upstream registry merge or an opencode release that rebundles the snapshot.

Pricing, limits, modalities, and reasoning capability mirror the direct openai/gpt-5.5 entry from models.dev.
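Concretely, a local override of the shape this commit describes might look like the following sketch of `opencode.json`. The `provider.<id>.models` key path comes from the commit message above; the model fields mirror the ones visible in this PR's diff, and any fields not shown there are omitted rather than guessed:

```json
{
  "provider": {
    "cloudflare-ai-gateway": {
      "models": {
        "openai/gpt-5.5": {
          "release_date": "2026-04-23",
          "attachment": true,
          "reasoning": true,
          "temperature": false
        }
      }
    }
  }
}
```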
Pull request overview
This PR updates the repo’s OpenCode/ask-bonk configuration to use openai/gpt-5.5 via the cloudflare-ai-gateway provider, and adds a local opencode.json override to unblock model lookup failures caused by an outdated bundled models snapshot.
Changes:
- Add `opencode.json` registering `openai/gpt-5.5` under `cloudflare-ai-gateway`.
- Switch OpenCode agents (`viguy`, `reviewer`) and Bonk/BigBonk workflows to `openai/gpt-5.5` with `variant: xhigh`.
- Update contributor guidance to reflect the GPT-5.5-based setup.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 4 comments.
Show a summary per file
| File | Description |
|---|---|
| opencode.json | Adds a local provider-model registration for cloudflare-ai-gateway/openai/gpt-5.5 to avoid ProviderModelNotFoundError. |
| CONTRIBUTING.md | Updates recommended OpenCode model and AI review model references to GPT-5.5/xhigh. |
| .opencode/agents/viguy.md | Switches agent model to openai/gpt-5.5. |
| .opencode/agents/reviewer.md | Switches agent model to openai/gpt-5.5. |
| .github/workflows/bonk.yml | Updates Bonk workflow to use GPT-5.5 and sets variant: xhigh. |
| .github/workflows/bigbonk.yml | Updates BigBonk workflow to use GPT-5.5 and switches variant to xhigh. |
```json
"release_date": "2026-04-23",
"attachment": true,
"reasoning": true,
"temperature": false,
```
In this model registration, temperature is set to false (i.e., the model does not support a temperature parameter), but the repo’s OpenCode agents still declare temperature values. If OpenCode validates capabilities, this can cause a config/runtime error or silently ignore agent settings. Consider either marking temperature as supported here (if GPT-5.5 accepts it via the gateway) or removing/omitting temperature from agents that run on openai/gpt-5.5.
Suggested change:

```diff
-"temperature": false,
+"temperature": true,
```
```diff
 mode: primary
-model: anthropic/claude-opus-4-7
+model: openai/gpt-5.5
 temperature: 0.2
```
The agent now targets openai/gpt-5.5, but still sets temperature: 0.2. In this PR’s opencode.json registration for openai/gpt-5.5, temperature is declared as unsupported (false). If OpenCode enforces that capability, the agent config may error or have its temperature silently dropped—please align these settings.
Suggested change:

```diff
-temperature: 0.2
```
```diff
 mode: subagent
-model: anthropic/claude-opus-4-7
+model: openai/gpt-5.5
 temperature: 0.1
```
The agent now targets openai/gpt-5.5, but still sets temperature: 0.1. This PR’s opencode.json entry for openai/gpt-5.5 declares temperature as unsupported (false), which may cause config validation failures or ignored settings. Please align agent frontmatter with the model capability you’re registering.
Suggested change:

```diff
-temperature: 0.1
```
```diff
 ## Recommended setup

-We use [OpenCode](https://opencode.ai) with Claude Opus 4.6, set to max thinking. This is the same setup that built the project.
+We use [OpenCode](https://opencode.ai) with GPT-5.5, set to xhigh reasoning effort. This is the same setup that built the project.
```
This sentence claims the current OpenCode+GPT-5.5 setup is “the same setup that built the project,” but the README still states the repo was largely written with Claude Code. To avoid contradictory contributor guidance, consider rephrasing this to “the current recommended setup” (or update the other doc to match).
Suggested change:

```diff
-We use [OpenCode](https://opencode.ai) with GPT-5.5, set to xhigh reasoning effort. This is the same setup that built the project.
+We use [OpenCode](https://opencode.ai) with GPT-5.5, set to xhigh reasoning effort. This is our current recommended setup.
```
bonk and bigbonk currently throw ProviderInitError on every invocation after the gpt-5.5 switch in #898. The lookup-side regression that #899 patched was masked by the opencode bump (1.14.22 -> 1.14.25), but a second failure now surfaces during SDK init for cloudflare-ai-gateway. opencode swallows the underlying cause, and the gpt-5.5 path through ai-gateway-provider has no working precedent in this repo, so iterating on it would block PR review indefinitely.

Roll the model back to gpt-5.4, which has a known-good path through opencode + ai-gateway-provider + Cloudflare AI Gateway, and remove the opencode.json workaround introduced in #899. Reasoning effort stays at "xhigh" per James's preference; opencode will fall back to its default effort if gpt-5.4 doesn't accept it.

Once gpt-5.5 SDK init is fixed upstream (opencode + ai-gateway-provider + models.dev cloudflare-ai-gateway entry), we can roll forward again.
What

Add `opencode.json` registering `openai/gpt-5.5` under the built-in `cloudflare-ai-gateway` provider, mirroring the upstream models.dev spec.

Why

After #898, bonk/bigbonk throw `ProviderModelNotFoundError`. opencode's bundled models snapshot predates GPT-5.5 (2026-04-23) and the `cloudflare-ai-gateway` entry on models.dev does not list it yet, so the `provider.models["openai/gpt-5.5"]` lookup fails before any network call. Failing run: https://github.com/cloudflare/vinext/actions/runs/24938461196. opencode merges user-supplied provider models into the resolved dict, so this entry unblocks invocations without an opencode release.

Tradeoffs

- If `/bigbonk` still fails after merge, revert + downgrade to `gpt-5.4`.
- The `variant: xhigh` reasoning effort mapping for a custom-registered model is untested; it may fall back to the default effort.

Removal plan

@james-elicx fyi. I'm tracking anomalyco/models.dev#1596 (cloudflare-ai-gateway/openai/gpt-5.5 entry). Once that lands AND opencode publishes a release with a refreshed snapshot, I'll delete `opencode.json` in a follow-up.

Validation

- Checked the lookup-and-merge behavior against opencode's provider resolution code (`packages/opencode/src/provider/provider.ts`).
- Will run `/bigbonk review` before merging to main.
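The lookup-then-merge behavior this PR relies on can be sketched as a toy model. This is hedged, illustrative TypeScript, not opencode's actual implementation: `resolveModel`, the type names, and the data shapes are all assumptions modeling the description above.

```typescript
// Toy model of "resolve bundled snapshot, merge user models, then look up".
// NOT opencode's real code -- every name here is illustrative.

type ModelSpec = { reasoning?: boolean };
type Registry = Record<string, Record<string, ModelSpec>>;
type UserConfig = {
  provider?: Record<string, { models?: Record<string, ModelSpec> }>;
};

class ProviderModelNotFoundError extends Error {}

function resolveModel(
  bundled: Registry,
  userConfig: UserConfig,
  providerID: string,
  modelID: string,
): ModelSpec {
  // Start from the bundled models.dev snapshot for this provider...
  const models: Record<string, ModelSpec> = { ...(bundled[providerID] ?? {}) };
  // ...then merge user-supplied provider.<id>.models on top (user wins).
  Object.assign(models, userConfig.provider?.[providerID]?.models ?? {});
  // The lookup runs before any network call, so a miss fails fast.
  const spec = models[modelID];
  if (spec === undefined) {
    throw new ProviderModelNotFoundError(`${providerID}/${modelID}`);
  }
  return spec;
}

// A stale snapshot without gpt-5.5 fails until a local override is merged in.
const bundled: Registry = {
  "cloudflare-ai-gateway": { "openai/gpt-5.4": { reasoning: true } },
};
const override: UserConfig = {
  provider: {
    "cloudflare-ai-gateway": {
      models: { "openai/gpt-5.5": { reasoning: true } },
    },
  },
};
```

Under this toy model, looking up `openai/gpt-5.5` with an empty user config throws, while passing the override resolves it, which matches why a purely local `opencode.json` entry unblocks the workflows.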