mirror of
https://github.com/cloudflare/vinext.git
synced 2026-05-09 00:09:23 +02:00
[PR #903] [MERGED] ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway #935
📋 Pull Request Information
Original PR: https://github.com/cloudflare/vinext/pull/903
Author: @NathanDrake2406
Created: 4/25/2026
Status: ✅ Merged
Merged: 4/25/2026
Merged by: @james-elicx
Base: main ← Head: nathan/bump-opencode-1.4.11
📝 Commits (1)
7f8c91e ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway
📊 Changes
2 files changed (+2 additions, -2 deletions)
📝 .github/workflows/bigbonk.yml (+1 -1)
📝 .github/workflows/bonk.yml (+1 -1)
📄 Description
What
Bump `opencode_version` from `1.4.6` to `1.4.11` in both `bonk.yml` and `bigbonk.yml`. Model + variant unchanged (`cloudflare-ai-gateway/openai/gpt-5.4`, `xhigh`).
Why
Two stacked regressions in opencode have made bonk on OAI through cf-ai-gateway uniquely broken on this repo. #902 dodged one and reintroduced the other.
Current state (1.4.6 + gpt-5.4):
Failing run after #902: https://github.com/cloudflare/vinext/actions/runs/24939826275
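The first of the two regressions comes from opencode forwarding `maxOutputTokens`, which `@ai-sdk/openai-compatible` serializes as the `max_tokens` field that OpenAI's reasoning models reject; the 1.4.7+ cf-ai-gateway plugin fix strips that parameter before serialization. A minimal sketch of the idea in TypeScript — the types, function names, and model-prefix list are assumptions for illustration, not opencode's actual plugin API:

```typescript
// Illustrative sketch only: the real fix is the cf-ai-gateway plugin's
// chat.params hook (anomalyco/opencode#22864); all names here are assumed.
type ChatParams = {
  model: string;
  maxOutputTokens?: number;
  temperature?: number;
};

// Assumed prefixes for OpenAI reasoning models (gpt-5.x in this repo's case).
const REASONING_PREFIXES = ["gpt-5", "o1", "o3"];

function isReasoningModel(model: string): boolean {
  return REASONING_PREFIXES.some((p) => model.startsWith(p));
}

// Drop maxOutputTokens so the downstream @ai-sdk/openai-compatible layer
// never serializes it as the `max_tokens` field the OAI edge rejects.
function stripMaxOutputTokens(params: ChatParams): ChatParams {
  if (!isReasoningModel(params.model)) return params;
  const { maxOutputTokens: _dropped, ...rest } = params;
  return rest;
}
```

On 1.4.6 no such hook runs, so every gpt-5.x request carries `max_tokens` and fails at the edge; non-reasoning models are left untouched.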
opencode 1.4.6 predates the cf-ai-gateway plugin's `chat.params` hook (anomalyco/opencode#22864, merged 2026-04-16, first release 1.4.7). The hook strips `maxOutputTokens` for OpenAI reasoning models so `@ai-sdk/openai-compatible` no longer emits the rejected `max_tokens` field. Without it, every gpt-5.x request through the unified gateway fails at the OAI edge.
Why not 1.14.x:
`ProviderInitError` on 1.14.22 / 1.14.25 (#898, #900) is a separate regression in opencode's npm install path. The 1.14 major bump added `loadOptions(dir)` via `@npmcli/config` inside `Npm.add`. When that throws on a CI runner, it surfaces as `InstallFailedError`, gets re-wrapped at `packages/opencode/src/provider/provider.ts:1522` as `InitError({providerID}, {cause: e})`, and the session error path drops the original `cause` when emitting `NamedError.Unknown({message: err.message})`. Result: bots see `UnknownError: ProviderInitError` with no actionable detail. The same issue hit workers-sdk, workerd, kumo, etc., which all pin `1.4.6`/`1.2.27` for that reason.
Why 1.4.11: it is the last release on the 1.4 line. It has the `chat.params` max_tokens patch (1.4.7+) and pre-dates the 1.14.x `loadOptions` install regression. `ai-gateway-provider@3.1.2` and the `@ai-sdk/*` pins are identical to 1.4.6, so the install path keeps the same `arborist.reify` shape that works for workers-sdk + anthropic on 1.4.6. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, which is why we need the gpt-5.x plugin patch and they don't.
Approach
- bonk.yml: `opencode_version: 1.4.6` -> `1.4.11`
- bigbonk.yml: `opencode_version: 1.4.6` -> `1.4.11`
Validation
Reproduced both failure modes locally with opencode 1.14.25 + `cloudflare-ai-gateway/openai/gpt-5.4` + fake creds. Confirmed:
- `Npm.add` short-circuits on a stale empty cache dir (one path) and `loadOptions` fails on fresh install (other path); both surface as silent `ProviderInitError`.
- `max_tokens`.
- `chat.params` hook that drops `maxOutputTokens` for OAI reasoning models.
Will comment `/bigbonk review` on this PR before merging to confirm bots come back up. If 1.4.11 still throws `ProviderInitError` for any other reason, fall back to anthropic + 1.4.6 (matches workers-sdk known-good).
Risks / follow-ups
- `loadOptions` is fixed (no upstream issue tracking it yet, since it surfaces as the masked `ProviderInitError`).
- `cause`-swallowing in opencode's session error path is a real diagnostic bug worth filing upstream.
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
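The `cause`-swallowing called out in the follow-ups can be made concrete with a small TypeScript sketch — the class, functions, and message strings below are illustrative, not opencode's actual error types:

```typescript
// Illustrative only: mirrors the described wrap-then-drop pattern, not
// opencode's real InitError / NamedError.Unknown implementation.
class InitError extends Error {
  cause?: unknown;
  constructor(message: string, options?: { cause?: unknown }) {
    super(message);
    this.name = "ProviderInitError";
    this.cause = options?.cause;
  }
}

// Lossy path (the reported bug): re-emitting only err.message discards
// the wrapped root cause, so CI logs show a bare ProviderInitError.
function emitLossy(err: Error): { message: string; cause?: string } {
  return { message: err.message };
}

// Preserving path: carry the cause chain forward so the underlying
// install failure stays visible in the session error.
function emitWithCause(err: InitError): { message: string; cause?: string } {
  const c = err.cause;
  return {
    message: err.message,
    cause: c instanceof Error ? c.message : undefined,
  };
}
```

Filing this upstream would amount to asking the session error path to behave like `emitWithCause` rather than `emitLossy`.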