ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway #903
Merged
james-elicx merged 1 commit into cloudflare:main on Apr 25, 2026
Conversation
Bots throw "max_tokens not supported, use max_completion_tokens" on 1.4.6 with gpt-5.x reasoning models. opencode 1.4.6 predates the cf-ai-gateway plugin's `chat.params` hook (PR #22864, landed in 1.4.7), which strips `maxOutputTokens` for OpenAI reasoning models so the `@ai-sdk/openai-compatible` default no longer leaks the rejected field.

The earlier ProviderInitError on 1.14.x was a separate regression in opencode's npm install path: the major bump introduced `loadOptions(dir)` via `@npmcli/config` inside `Npm.add`, which fails on CI runners and bubbles up as InstallFailedError, then gets re-wrapped at provider.ts:1522 as ProviderInitError with the original cause swallowed by the session error path. Other Cloudflare repos (workers-sdk, workerd, kumo) hit the same regression on anthropic and pinned 1.4.6 / 1.2.27 to dodge it.

1.4.11 is the last 1.4.x release: it has the `chat.params` max_tokens patch and predates the `loadOptions` install regression. Same ai-gateway-provider 3.1.2, same @ai-sdk pins as 1.4.6, so the install path keeps the known-good `arborist.reify` call shape that works for workers-sdk + anthropic on 1.4.6.

Verified locally with opencode 1.14.25 + cf-ai-gateway/openai/gpt-5.4 + fake creds: the empty-cache short-circuit and `loadOptions` both reproduce the silent ProviderInitError. 1.4.11 lacks both code paths.
Pull request overview
Updates the Bonk and Big Bonk GitHub Actions workflows to use a newer opencode 1.4.x release, intended to restore OpenAI (via Cloudflare AI Gateway) compatibility without taking the problematic 1.14.x install-path regression.
Changes:
- Bump `opencode_version` from `1.4.6` to `1.4.11` in `bonk.yml`.
- Bump `opencode_version` from `1.4.6` to `1.4.11` in `bigbonk.yml`.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| `.github/workflows/bonk.yml` | Updates `opencode_version` to 1.4.11 for `/bonk` runs. |
| `.github/workflows/bigbonk.yml` | Updates `opencode_version` to 1.4.11 for `/bigbonk` runs. |
What
Bump `opencode_version` from `1.4.6` to `1.4.11` in both `bonk.yml` and `bigbonk.yml`. Model + variant unchanged (`cloudflare-ai-gateway/openai/gpt-5.4`, `xhigh`).

Why
Two stacked regressions in opencode have made bonk on OAI through cf-ai-gateway uniquely broken on this repo. #902 dodged one and reintroduced the other.
Current state (1.4.6 + gpt-5.4):
Failing run after #902: https://github.com/cloudflare/vinext/actions/runs/24939826275
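For context, the quoted error comes from newer OpenAI reasoning endpoints rejecting the legacy `max_tokens` field in favor of `max_completion_tokens`. A minimal illustrative shim showing the rename the OAI edge expects (hypothetical helper, not part of opencode or the plugin):

```typescript
// Hypothetical shim (not real opencode/plugin code): rename the deprecated
// max_tokens field to max_completion_tokens, which OpenAI reasoning models
// require in chat-completion request bodies.
type ChatBody = Record<string, unknown>;

function renameMaxTokens(body: ChatBody): ChatBody {
  if (!("max_tokens" in body)) return body;
  const out: ChatBody = { ...body };
  out["max_completion_tokens"] = out["max_tokens"];
  delete out["max_tokens"]; // the field the OAI edge rejects
  return out;
}
```

The actual 1.4.7+ fix takes the other option (dropping the field entirely), but the rename shows why the 1.4.6 payload is rejected.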
opencode 1.4.6 predates the cf-ai-gateway plugin's `chat.params` hook (anomalyco/opencode#22864, merged 2026-04-16, first release 1.4.7). The hook strips `maxOutputTokens` for OpenAI reasoning models so `@ai-sdk/openai-compatible` no longer emits the rejected `max_tokens` field. Without it, every gpt-5.x request through the unified gateway fails at the OAI edge.
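The hook's effect can be sketched as follows. The types and the `isReasoningModel` predicate are assumptions for illustration, not the plugin's actual source:

```typescript
// Sketch of what the 1.4.7+ chat.params hook accomplishes (assumed shapes,
// not the real opencode plugin API): for OpenAI reasoning models, drop
// maxOutputTokens before @ai-sdk/openai-compatible serializes it as the
// rejected max_tokens field.
type ChatParams = {
  modelID: string;
  options: Record<string, unknown>; // maxOutputTokens is set by default
};

// Hypothetical predicate: treat gpt-5.x as reasoning models.
const isReasoningModel = (modelID: string): boolean => /^gpt-5/.test(modelID);

function stripMaxOutputTokens(params: ChatParams): ChatParams {
  if (!isReasoningModel(params.modelID)) return params;
  const options = { ...params.options };
  delete options.maxOutputTokens; // field never reaches the serializer
  return { ...params, options };
}
```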
Why not 1.14.x:

ProviderInitError on 1.14.22 / 1.14.25 (#898, #900) is a separate regression in opencode's npm install path. The 1.14 major bump added `loadOptions(dir)` via `@npmcli/config` inside `Npm.add`. When that throws on a CI runner, it surfaces as InstallFailedError, gets re-wrapped at `packages/opencode/src/provider/provider.ts:1522` as `InitError({providerID}, {cause: e})`, and the session error path drops the original `cause` when emitting `NamedError.Unknown({message: err.message})`. Result: bots see `UnknownError: ProviderInitError` with no actionable detail. The same issue hit workers-sdk, workerd, kumo, etc., which all pin `1.4.6` / `1.2.27` for that reason.

Why 1.4.11:

Last release on the 1.4 line. Has the `chat.params` max_tokens patch (1.4.7+). Predates the 1.14.x `loadOptions` install regression. `ai-gateway-provider@3.1.2` and the `@ai-sdk/*` pins are identical to 1.4.6, so the install path stays the same `arborist.reify` shape that works for workers-sdk + anthropic on 1.4.6. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, which is why we need the gpt-5.x plugin patch and they don't.
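The cause-swallowing described in "Why not 1.14.x" can be reproduced in miniature. This is an illustrative sketch of the masking pattern, not opencode's actual source:

```typescript
// Illustrative sketch (not opencode's real code): the install failure is
// wrapped with its cause attached, but the session error path rebuilds the
// error from `message` alone, so the actionable detail is lost.
class InstallFailedError extends Error {}
class ProviderInitError extends Error {}

function providerInit(): never {
  const install = new InstallFailedError("npm loadOptions failed on CI runner");
  const wrapped = new ProviderInitError("ProviderInitError");
  (wrapped as Error & { cause?: unknown }).cause = install; // cause kept here
  throw wrapped;
}

// Analogous to emitting NamedError.Unknown({ message: err.message }):
function toSessionError(err: Error): Error {
  return new Error(`UnknownError: ${err.message}`); // cause dropped here
}

function maskedError(): Error {
  try {
    providerInit();
  } catch (e) {
    return toSessionError(e as Error);
  }
  return new Error("unreachable"); // never hit; satisfies all compilers
}
```

Running `maskedError()` yields an error whose message names ProviderInitError but whose `cause` is gone, matching what the bots report.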
Approach

- `bonk.yml`: `opencode_version: 1.4.6` -> `1.4.11`
- `bigbonk.yml`: `opencode_version: 1.4.6` -> `1.4.11`
Validation

Reproduced both failure modes locally with opencode 1.14.25 + `cloudflare-ai-gateway/openai/gpt-5.4` + fake creds. Confirmed:

- `Npm.add` short-circuits on a stale empty cache dir (one path) and `loadOptions` fails on fresh install (other path); both surface as a silent ProviderInitError.
- 1.4.11 no longer emits the rejected `max_tokens`: it carries the `chat.params` hook that drops `maxOutputTokens` for OAI reasoning models.

Will comment `/bigbonk review` on this PR before merging to confirm bots come back up. If 1.4.11 still throws ProviderInitError for any other reason, fall back to anthropic + 1.4.6 (matches workers-sdk known-good).
Risks / follow-ups

- Stay pinned to the 1.4 line until the 1.14.x `loadOptions` regression is fixed (no upstream issue tracking it yet, since it surfaces as the masked ProviderInitError).
- The `cause`-swallowing in opencode's session error path is a real diagnostic bug worth filing upstream.
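The upstream diagnostic fix is small: keep the `cause` chain and walk it when rendering errors. A hedged sketch using the standard `Error.cause` property (plain JS semantics, not opencode-specific code):

```typescript
// Sketch: recover the root cause from a wrapped error chain by walking the
// standard Error.cause property, so logs would show "loadOptions failed"
// instead of an opaque ProviderInitError.
function rootCause(err: unknown): unknown {
  let cur = err;
  while (cur instanceof Error) {
    const cause = (cur as Error & { cause?: unknown }).cause;
    if (cause === undefined) break;
    cur = cause;
  }
  return cur;
}
```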