ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway#903

Merged
james-elicx merged 1 commit into cloudflare:main from NathanDrake2406:nathan/bump-opencode-1.4.11
Apr 25, 2026

Conversation

@NathanDrake2406
Contributor

What

Bump opencode_version from 1.4.6 to 1.4.11 in both bonk.yml and bigbonk.yml. Model + variant unchanged (cloudflare-ai-gateway/openai/gpt-5.4, xhigh).

Why

Two stacked regressions in opencode have made bonk on OAI through cf-ai-gateway uniquely broken on this repo. #902 dodged one and reintroduced the other.

Current state (1.4.6 + gpt-5.4):

```
APIError: Unsupported parameter: 'max_tokens' is not supported with this model.
Use 'max_completion_tokens' instead.
URL: https://gateway.ai.cloudflare.com/v1/compat/chat/completions
```

Failing run after #902: https://github.com/cloudflare/vinext/actions/runs/24939826275

opencode 1.4.6 predates the cf-ai-gateway plugin's chat.params hook (anomalyco/opencode#22864, merged 2026-04-16, first release 1.4.7). The hook strips maxOutputTokens for OpenAI reasoning models so @ai-sdk/openai-compatible no longer emits the rejected max_tokens field. Without it, every gpt-5.x request through the unified gateway fails at the OAI edge.
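For context, the effect of that hook can be sketched as follows. This is an illustrative reduction, not the actual plugin source: the real hook lives in the cf-ai-gateway plugin and its exact signature and model-matching logic may differ.

```typescript
// Illustrative sketch of what the chat.params fix does (not the actual
// cf-ai-gateway plugin code): for OpenAI reasoning models, drop
// maxOutputTokens before @ai-sdk/openai-compatible translates it into
// the `max_tokens` field that the OAI edge rejects.

interface ChatParams {
  maxOutputTokens?: number;
  temperature?: number;
  [key: string]: unknown;
}

// Hypothetical matcher; the real plugin's model detection may differ.
const REASONING_PREFIXES = ["o1", "o3", "gpt-5"];

function isReasoningModel(modelID: string): boolean {
  return REASONING_PREFIXES.some((prefix) => modelID.startsWith(prefix));
}

function stripMaxTokens(modelID: string, params: ChatParams): ChatParams {
  if (!isReasoningModel(modelID)) return params;
  const patched = { ...params };
  delete patched.maxOutputTokens; // nothing left to become max_tokens
  return patched;
}
```

With the field gone, requests for `gpt-5.x` models carry no `max_tokens` at all, so the gateway's `/v1/compat/chat/completions` endpoint stops rejecting them.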

Why not 1.14.x: ProviderInitError on 1.14.22 / 1.14.25 (#898, #900) is a separate regression in opencode's npm install path. The 1.14 major bump added loadOptions(dir) via @npmcli/config inside Npm.add. When that throws on a CI runner, it surfaces as InstallFailedError, gets re-wrapped at packages/opencode/src/provider/provider.ts:1522 as InitError({providerID}, {cause: e}), and the session error path drops the original cause when emitting NamedError.Unknown({message: err.message}). Result: bots see UnknownError: ProviderInitError with no actionable detail. Same issue hit workers-sdk, workerd, kumo, etc., which all pin 1.4.6 / 1.2.27 for that reason.
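The cause-swallowing chain described above reduces to a small sketch. Class and function names here are illustrative stand-ins, not opencode's actual code; the point is only that the detail survives the re-wrap but dies at serialization.

```typescript
// Illustrative reduction of the 1.14.x error chain (names are stand-ins,
// not opencode's real classes). The cause survives the re-wrap but is
// dropped when the session error path serializes only err.message.

class InstallFailedError extends Error {}

class ProviderInitError extends Error {
  cause?: unknown;
  constructor(message: string, cause?: unknown) {
    super(message);
    this.cause = cause; // detail is still attached at this point...
  }
}

function initProvider(providerID: string): void {
  // Stand-in for loadOptions(dir) throwing inside Npm.add on a CI runner.
  const inner = new InstallFailedError(
    "loadOptions failed: cannot read config dir"
  );
  throw new ProviderInitError(`provider init failed: ${providerID}`, inner);
}

// Stand-in for emitting NamedError.Unknown({message: err.message}):
// only the outer message survives, so bots get no actionable detail.
function sessionErrorMessage(fn: () => void): string {
  try {
    fn();
    return "ok";
  } catch (err) {
    return (err as Error).message;
  }
}
```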

Why 1.4.11: last release on the 1.4 line. Has the chat.params max_tokens patch (1.4.7+). Pre-dates the 1.14.x loadOptions install regression. ai-gateway-provider@3.1.2 and the @ai-sdk/* pins are identical to 1.4.6, so the install path stays the same arborist.reify shape that works for workers-sdk + anthropic on 1.4.6. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, which is why we need the gpt-5.x plugin patch and they don't.

Approach

  • bonk.yml: opencode_version: 1.4.6 -> 1.4.11
  • bigbonk.yml: opencode_version: 1.4.6 -> 1.4.11
  • No model, variant, or env changes
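For reference, the resulting workflow input looks roughly like this. Only `opencode_version` changes; the surrounding keys are shown for context and are assumptions about the file layout, not part of the diff.

```yaml
# .github/workflows/bonk.yml — bigbonk.yml gets the identical one-line bump.
# Surrounding keys are illustrative; only opencode_version changes.
with:
  opencode_version: 1.4.11  # was 1.4.6
  model: cloudflare-ai-gateway/openai/gpt-5.4
  variant: xhigh
```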

Validation

Reproduced both failure modes locally with opencode 1.14.25 + cloudflare-ai-gateway/openai/gpt-5.4 + fake creds. Confirmed:

  • 1.14.x: Npm.add short-circuits on a stale empty cache dir (one path) and loadOptions fails on a fresh install (the other path); both surface as a silent ProviderInitError.
  • 1.4.6 with gpt-5.4: install succeeds, request reaches OAI, OAI rejects max_tokens.
  • 1.4.7+ adds the chat.params hook that drops maxOutputTokens for OAI reasoning models.

Will comment /bigbonk review on this PR before merging to confirm bots come back up. If 1.4.11 still throws ProviderInitError for any other reason, fall back to anthropic + 1.4.6 (matches workers-sdk known-good).

Risks / follow-ups

  • 1.4 line is no longer maintained upstream. We should roll forward once loadOptions is fixed (no upstream issue tracking it yet, since it surfaces as the masked ProviderInitError).
  • The cause-swallowing in opencode's session error path is a real diagnostic bug worth filing upstream.

Bots throw "max_tokens not supported, use max_completion_tokens" on
1.4.6 with gpt-5.x reasoning models. opencode 1.4.6 predates the
cf-ai-gateway plugin's chat.params hook (PR #22864, landed in 1.4.7),
which strips maxOutputTokens for OpenAI reasoning models so the
@ai-sdk/openai-compatible default no longer leaks the rejected field.

The earlier ProviderInitError on 1.14.x was a separate regression in
opencode's npm install path: the major bump introduced loadOptions(dir)
via @npmcli/config inside Npm.add, which fails on CI runners and bubbles
up as InstallFailedError, then gets re-wrapped at provider.ts:1522 as
ProviderInitError with the original cause swallowed by the session
error path. Other Cloudflare repos (workers-sdk, workerd, kumo) hit the
same regression on anthropic and pinned 1.4.6 / 1.2.27 to dodge it.

1.4.11 is the last 1.4.x release: it has the chat.params max_tokens
patch and predates the loadOptions install regression. Same
ai-gateway-provider 3.1.2, same @ai-sdk pins as 1.4.6, so the install
path stays the known-good arborist.reify call shape that works for
workers-sdk + anthropic on 1.4.6.

Verified locally with opencode 1.14.25 + cf-ai-gateway/openai/gpt-5.4 +
fake creds: empty-cache short-circuit and loadOptions both reproduce
the silent ProviderInitError. 1.4.11 lacks both code paths.
Copilot AI review requested due to automatic review settings April 25, 2026 20:45
@pkg-pr-new

pkg-pr-new Bot commented Apr 25, 2026


```
npm i https://pkg.pr.new/vinext@903
```

commit: 7f8c91e


Copilot AI left a comment


Pull request overview

Updates the Bonk and Big Bonk GitHub Actions workflows to use a newer opencode 1.4.x release, intended to restore OpenAI (via Cloudflare AI Gateway) compatibility without picking up the problematic 1.14.x install-path regression.

Changes:

  • Bump opencode_version from 1.4.6 to 1.4.11 in bonk.yml.
  • Bump opencode_version from 1.4.6 to 1.4.11 in bigbonk.yml.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| `.github/workflows/bonk.yml` | Updates `opencode_version` to 1.4.11 for `/bonk` runs. |
| `.github/workflows/bigbonk.yml` | Updates `opencode_version` to 1.4.11 for `/bigbonk` runs. |


@james-elicx james-elicx enabled auto-merge (squash) April 25, 2026 20:46
@james-elicx james-elicx merged commit f8e9d3a into cloudflare:main Apr 25, 2026
26 of 27 checks passed