ci(bonk): downgrade bots from gpt-5.5 to gpt-5.4 #900
bonk and bigbonk currently throw ProviderInitError on every invocation after the gpt-5.5 switch in cloudflare#898. The lookup-side regression that cloudflare#899 patched was masked by the opencode bump (1.14.22 -> 1.14.25), but a second failure now surfaces during SDK init for cloudflare-ai-gateway. opencode swallows the underlying cause, and the gpt-5.5 path through ai-gateway-provider has no working precedent in this repo, so iterating on it would block PR review indefinitely.

Roll the model back to gpt-5.4, which has a known-good path through opencode + ai-gateway-provider + Cloudflare AI Gateway, and remove the opencode.json workaround introduced in cloudflare#899. Reasoning effort stays at "xhigh" per James's preference; opencode will fall back to its default effort if gpt-5.4 doesn't accept it.

Once gpt-5.5 SDK init is fixed upstream (opencode + ai-gateway-provider + models.dev cloudflare-ai-gateway entry), we can roll forward again.
Pull request overview
This PR restores Bonk/BigBonk functionality by rolling the bots back from gpt-5.5 to the known-working gpt-5.4 on Cloudflare AI Gateway, and removes the temporary opencode.json model-registry workaround that was added to unblock gpt-5.5.
Changes:
- Downgrade the Bonk and BigBonk workflows to `cloudflare-ai-gateway/openai/gpt-5.4` (keeping `variant: xhigh`).
- Update the OpenCode agent configs to `openai/gpt-5.4`.
- Delete `opencode.json` and update the CONTRIBUTING docs to match the new baseline.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| `opencode.json` | Removes the custom model registry entry for `openai/gpt-5.5` (no longer needed after rollback). |
| `CONTRIBUTING.md` | Updates contributor guidance to GPT-5.4 + xhigh reasoning effort. |
| `.opencode/agents/viguy.md` | Pins the agent model to `openai/gpt-5.4`. |
| `.opencode/agents/reviewer.md` | Pins the reviewer subagent model to `openai/gpt-5.4`. |
| `.github/workflows/bonk.yml` | Uses `cloudflare-ai-gateway/openai/gpt-5.4` for `/bonk`. |
| `.github/workflows/bigbonk.yml` | Uses `cloudflare-ai-gateway/openai/gpt-5.4` for `/bigbonk`. |
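The per-file changes above are essentially one-line model swaps. As a concrete illustration (the actual workflow contents aren't reproduced in this thread, so the action reference and the `model`/`variant` input names below are assumptions), the workflow side of the downgrade would look something like:

```yaml
# .github/workflows/bonk.yml — hypothetical sketch; the action path and
# input names are assumptions, not the repo's actual workflow.
jobs:
  bonk:
    runs-on: ubuntu-latest
    steps:
      - uses: ./.github/actions/run-opencode   # placeholder; actual action not shown in thread
        with:
          model: cloudflare-ai-gateway/openai/gpt-5.4   # was .../openai/gpt-5.5
          variant: xhigh                                # reasoning effort kept per this PR
```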
Every bonk and bigbonk run on opencode 1.14.25 throws UnknownError: ProviderInitError during cf-ai-gateway SDK init, regardless of the model selected (verified across gpt-5.5, gpt-5.4, and earlier opus runs in cloudflare#898/cloudflare#900). The throw fires at the provider-init layer, so it isn't a model-specific or registry issue.

workers-sdk hit the same regression and pinned opencode_version to 1.4.6 with an inline comment naming ProviderInitError as the cause. Several other Cloudflare repos pin 1.2.27 / 1.4.6 for the same reason. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, so the OAI-specific path stayed unobserved until cloudflare#898 re-tried it on 1.14.25.

Pinning to 1.4.6 here reuses the workers-sdk known-good version. The gpt-5.4 model and xhigh variant from the prior commit stay; gpt-5.4 predates 1.4.6's bundled snapshot by over a month, so registry resolution is fine.
@james-elicx update — pushed an opencode pin on top of the gpt-5.4 downgrade. Root cause confirmed: the regression is opencode 1.14.25, not the model. After tracing through every CF repo using cf-ai-gateway, workers-sdk has an inline comment pinning `opencode_version` to 1.4.6 for exactly this ProviderInitError. This PR now does:

- roll bonk + bigbonk back to `cloudflare-ai-gateway/openai/gpt-5.4` (unchanged from the earlier downgrade commit), and
- pin opencode to workers-sdk's known-good 1.4.6.

Treat this as the last attempt at OAI models on cf-ai-gateway. Tracking upstream: anomalyco/models.dev#1596 (registry entry), and the still-undiagnosed cf-ai-gateway init regression in opencode 1.14.25.
…ror (#902)

Every bonk and bigbonk run since opencode 1.14.25 throws UnknownError: ProviderInitError during cf-ai-gateway SDK init, regardless of model. Verified across gpt-5.5 (#898) and gpt-5.4 (#900); the throw fires before any per-model code path runs.

workers-sdk hit the same regression and pinned opencode_version to 1.4.6 with an inline comment naming ProviderInitError as the cause. Several other Cloudflare repos pin 1.2.27 / 1.4.6 for the same reason. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, so the OAI-specific path stayed unobserved until #898 re-tried it on 1.14.25.

Pinning to 1.4.6 here reuses workers-sdk's known-good version. The gpt-5.4 model from #900 stays; gpt-5.4 predates 1.4.6's bundled snapshot by over a month, so registry resolution is fine.
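workers-sdk's fix, which this commit mirrors, is a version pin next to the model input. A sketch of what that looks like in the workflow — the `opencode_version` input name comes from the commit message above, but the surrounding structure is an assumption:

```yaml
# Sketch only; surrounding keys are illustrative, not the repo's actual workflow.
with:
  opencode_version: 1.4.6   # pinned: 1.14.25 throws ProviderInitError during cf-ai-gateway SDK init
  model: cloudflare-ai-gateway/openai/gpt-5.4
```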
What

Roll bonk + bigbonk back to `cloudflare-ai-gateway/openai/gpt-5.4`. Remove the `opencode.json` workaround introduced in #899.

Why

Every `/bonk` and `/bigbonk` invocation since #898 fails, in two stages:

1. `ProviderModelNotFoundError`, because models.dev's `cloudflare-ai-gateway` entry doesn't list gpt-5.5. "ci(opencode): register openai/gpt-5.5 under cloudflare-ai-gateway" (#899) patched that by registering the model in `opencode.json`.
2. `UnknownError: ProviderInitError`. opencode swallows the `cause`, so the underlying reason isn't visible in the run log.

Failing run after both #898 and #899 landed: https://github.com/cloudflare/vinext/actions/runs/24938979889.
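For context, the `opencode.json` workaround that stage 1 required (and that this PR deletes) registered gpt-5.5 under the gateway provider. The file isn't reproduced in this thread; what follows is a minimal sketch assuming opencode's `provider.<id>.models` config shape — the exact schema and keys are assumptions:

```json
{
  "provider": {
    "cloudflare-ai-gateway": {
      "models": {
        "openai/gpt-5.5": {}
      }
    }
  }
}
```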
gpt-5.4 has a known-good path through opencode + `ai-gateway-provider` + Cloudflare AI Gateway, so this restores the bots immediately while the gpt-5.5 SDK-init issue gets diagnosed upstream.

Approach

- `openai/gpt-5.5` → `openai/gpt-5.4` in the workflows and agent configs.
- Keep reasoning effort at `xhigh`. opencode falls back to its default effort if gpt-5.4 rejects it; if that turns out to be a problem we can drop to `high` in a follow-up.
- Delete `opencode.json` (no longer needed).

Validation

Run `/bigbonk review` on this PR before merging to confirm the bots come back up.