[PR #903] [MERGED] ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway #935

opened 2026-05-06 13:10:55 +02:00 by BreizhHardware · 0 comments

📋 Pull Request Information

Original PR: https://github.com/cloudflare/vinext/pull/903
Author: @NathanDrake2406
Created: 4/25/2026
Status: Merged
Merged: 4/25/2026
Merged by: @james-elicx

Base: main ← Head: nathan/bump-opencode-1.4.11


📝 Commits (1)

  • 7f8c91e ci(bonk): bump opencode to 1.4.11 to unblock OAI through cf-ai-gateway

📊 Changes

2 files changed (+2 additions, -2 deletions)


📝 .github/workflows/bigbonk.yml (+1 -1)
📝 .github/workflows/bonk.yml (+1 -1)

📄 Description

What

Bump opencode_version from 1.4.6 to 1.4.11 in both bonk.yml and bigbonk.yml. Model + variant unchanged (cloudflare-ai-gateway/openai/gpt-5.4, xhigh).

Why

Two stacked regressions in opencode have made bonk on OAI through cf-ai-gateway uniquely broken on this repo. #902 dodged one and reintroduced the other.

Current state (1.4.6 + gpt-5.4):

APIError: Unsupported parameter: 'max_tokens' is not supported with this model.
Use 'max_completion_tokens' instead.
URL: https://gateway.ai.cloudflare.com/v1/compat/chat/completions

Failing run after #902: https://github.com/cloudflare/vinext/actions/runs/24939826275

opencode 1.4.6 predates the cf-ai-gateway plugin's chat.params hook (anomalyco/opencode#22864, merged 2026-04-16, first release 1.4.7). The hook strips maxOutputTokens for OpenAI reasoning models so @ai-sdk/openai-compatible no longer emits the rejected max_tokens field. Without it, every gpt-5.x request through the unified gateway fails at the OAI edge.
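The fix lands in a params hook like the one described above. As a minimal sketch (illustrative names and shape only, not the actual cf-ai-gateway plugin API), the idea is: before the request is serialized, delete `maxOutputTokens` for OpenAI reasoning models so the downstream SDK never emits `max_tokens` at all:

```typescript
// Sketch of a chat.params-style hook. The param shape and the gpt-5 model
// check are assumptions for illustration, not the real plugin internals.
type ChatParams = { model: string; maxOutputTokens?: number; [k: string]: unknown };

function stripMaxTokensForReasoningModels(params: ChatParams): ChatParams {
  // Heuristic stand-in for "is this an OpenAI reasoning model".
  const isReasoningModel = params.model.includes("gpt-5");
  if (!isReasoningModel) return params;
  // Drop maxOutputTokens so @ai-sdk/openai-compatible has nothing to map
  // to the rejected `max_tokens` field.
  const { maxOutputTokens, ...rest } = params;
  void maxOutputTokens;
  return rest;
}
```

With a hook like this in place, the request body that reaches the OAI edge simply omits the token-limit field instead of sending the rejected one.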

Why not 1.14.x: ProviderInitError on 1.14.22 / 1.14.25 (#898, #900) is a separate regression in opencode's npm install path. The 1.14 major bump added loadOptions(dir) via @npmcli/config inside Npm.add. When that throws on a CI runner, it surfaces as InstallFailedError, gets re-wrapped at packages/opencode/src/provider/provider.ts:1522 as InitError({providerID}, {cause: e}), and the session error path drops the original cause when emitting NamedError.Unknown({message: err.message}). Result: bots see UnknownError: ProviderInitError with no actionable detail. Same issue hit workers-sdk, workerd, kumo, etc., which all pin 1.4.6 / 1.2.27 for that reason.
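The cause-swallowing above is easy to reproduce in miniature. A hedged sketch (class and message names are illustrative, not the actual opencode source) of how the wrap-then-re-emit pattern loses the install error:

```typescript
// Stand-in for the provider InitError wrapper described above.
class InitError extends Error {
  cause?: unknown;
  constructor(message: string, cause?: unknown) {
    super(message);
    this.name = "ProviderInitError";
    this.cause = cause; // the actionable install failure lives here
  }
}

// The original, actionable failure from the npm install path.
const installFailure = new Error("loadOptions failed: ENOENT reading .npmrc");
const wrapped = new InitError("ProviderInitError", installFailure);

// A session error path that emits only err.message drops the cause chain,
// so logs show "UnknownError: ProviderInitError" with no detail.
const emitted = { name: "UnknownError", message: wrapped.message };
```

The fix upstream would be to walk `err.cause` (or serialize the whole chain) when emitting, rather than just `err.message`.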

Why 1.4.11: last release on the 1.4 line. Has the chat.params max_tokens patch (1.4.7+). Predates the 1.14.x loadOptions install regression. ai-gateway-provider@3.1.2 and the @ai-sdk/* pins are identical to 1.4.6, so the install path keeps the same arborist.reify shape that already works for workers-sdk + anthropic on 1.4.6. vinext is the only Cloudflare repo running OAI through cf-ai-gateway, which is why we need the gpt-5.x plugin patch and they don't.

Approach

  • bonk.yml: opencode_version: 1.4.6 -> 1.4.11
  • bigbonk.yml: opencode_version: 1.4.6 -> 1.4.11
  • No model, variant, or env changes

Validation

Reproduced both failure modes locally with opencode 1.14.25 + cloudflare-ai-gateway/openai/gpt-5.4 + fake creds. Confirmed:

  • 1.14.x: Npm.add short-circuits on a stale empty cache dir (one path) and loadOptions fails on a fresh install (the other path); both surface as a silent ProviderInitError.
  • 1.4.6 with gpt-5.4: install succeeds, request reaches OAI, OAI rejects max_tokens.
  • 1.4.7+ adds the chat.params hook that drops maxOutputTokens for OAI reasoning models.

Will comment /bigbonk review on this PR before merging to confirm the bots come back up. If 1.4.11 still throws ProviderInitError for any other reason, fall back to anthropic + 1.4.6 (the workers-sdk known-good combination).

Risks / follow-ups

  • 1.4 line is no longer maintained upstream. We should roll forward once loadOptions is fixed (no upstream issue tracking it yet, since it surfaces as the masked ProviderInitError).
  • The cause-swallowing in opencode's session error path is a real diagnostic bug worth filing upstream.

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
