
Bug: prompt loop ID comparison still broken in v1.3.0 when running opencode web remotely (with workaround) #18807

@chnlkw

Description


The prompt loop exit condition in SessionPrompt.loop() still uses message ID lexicographic comparison in v1.3.0:

if (lastAssistant?.finish
    && !["tool-calls", "unknown"].includes(lastAssistant.finish)
    && lastUser.id < lastAssistant.id) {
    break;
}

This breaks when opencode web runs on a remote server (e.g. Kubernetes) and the browser is on a different machine. The user message ID is generated in the browser via Date.now(), while the assistant message ID is generated on the server. Any clock skew (even a few ms where the browser is ahead) causes lastUser.id > lastAssistant.id, so the loop never exits after finish=stop.
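The failure is easy to demonstrate with a simplified, hypothetical ID scheme (a bare hex-encoded Date.now(); the real opencode encoding also packs a counter into the low bits, see the workaround below): even a few milliseconds of forward skew on the browser clock makes the user ID sort after a later assistant ID.

```javascript
// Hypothetical simplified ID scheme for illustration only: "msg_" plus a
// zero-padded hex millisecond timestamp, so IDs sort lexicographically by time.
function fakeId(nowMs) {
  return "msg_" + nowMs.toString(16).padStart(12, "0");
}

const serverNow = 1700000000000;            // server clock, ms
const browserNow = serverNow + 5;           // browser clock only 5 ms ahead

const userId = fakeId(browserNow);          // user message ID, minted in the browser
const assistantId = fakeId(serverNow + 2);  // assistant ID, minted on the server 2 ms later

// The loop's exit condition compares the IDs lexicographically:
console.log(userId < assistantId); // false: the exit condition never becomes true
```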

Symptoms:

  • claude-opus-4-6 (no prefill support): "This model does not support assistant message prefill" error on every turn
  • Models with prefill support (e.g. claude-opus-4-5): step 1 silently overwrites step 0 → blank/stuck session in Web UI

Confirmed by extracting from the v1.3.0 binary:

$ strings /usr/local/bin/opencode | grep 'lastUser.id < lastAssistant.id'
if (lastAssistant?.finish && !["tool-calls", "unknown"].includes(lastAssistant.finish) && lastUser.id < lastAssistant.id) {

This is the same root cause as #13768 and #17982. Community PRs #17010 and #17149 fix both the backend and frontend sides but remain open and unreviewed; could these be considered for merge?

Workaround

For anyone running opencode web remotely, a minimal Node.js proxy that delays only prompt requests by the measured clock skew (capped at 5 s); all other traffic is a zero-delay passthrough:

"use strict";
const http = require("http");
const TARGET = { host: "127.0.0.1", port: parseInt(process.env.TARGET_PORT || "3000") };
const LISTEN_PORT = parseInt(process.env.LISTEN_PORT || "8080");
const PROMPT_RE = /\/session\/[^/]+\/prompt(?:_async)?$/; // prompt endpoints only
const MASK48 = (1n << 48n) - 1n; // message IDs carry a 48-bit encoded timestamp

// Estimate how far the browser clock is ahead, from the timestamp embedded
// in the message ID; returns a delay in ms (0 if no delay is needed).
function calcDelay(messageID) {
  if (!messageID?.startsWith("msg_") || messageID.length < 16) return 0;
  const hex = messageID.slice(4, 16); // 12 hex chars = 48-bit field
  if (!/^[0-9a-f]{12}$/i.test(hex)) return 0;
  try {
    const idEnc = BigInt("0x" + hex);
    // Server-side "now", encoded the same way: ms * 0x1000 + counter, masked to 48 bits.
    const nowEnc = (BigInt(Date.now()) * 0x1000n + 1n) & MASK48;
    // Delay by the skew, capped at 5 s.
    return idEnc > nowEnc ? Math.min(Number((idEnc - nowEnc) / 0x1000n), 5000) : 0;
  } catch { return 0; }
}

// Forward a request to the target; send the pre-buffered body if given,
// otherwise stream the incoming request through.
function proxy(req, res, body) {
  const p = http.request({ ...TARGET, method: req.method, path: req.url,
    headers: { ...req.headers, host: `${TARGET.host}:${TARGET.port}` }
  }, (r) => { res.writeHead(r.statusCode, r.headers); r.pipe(res); });
  p.on("error", () => { if (!res.headersSent) { res.writeHead(502); res.end("Bad Gateway"); } });
  body ? p.end(body) : req.pipe(p);
}

http.createServer((req, res) => {
  if (req.method === "POST" && PROMPT_RE.test(req.url)) {
    // Buffer prompt bodies so the client-generated messageID can be inspected.
    const chunks = [];
    req.on("data", (c) => chunks.push(c));
    req.on("end", () => {
      const body = Buffer.concat(chunks);
      let delay = 0;
      try { const j = JSON.parse(body); if (j.messageID) delay = calcDelay(j.messageID); } catch {}
      if (delay > 0) setTimeout(() => proxy(req, res, body), delay);
      else proxy(req, res, body);
    });
  } else proxy(req, res);
}).on("upgrade", (req, sock, head) => {
  // WebSocket/upgrade traffic: forward the handshake and splice the sockets.
  const p = http.request({ ...TARGET, method: req.method, path: req.url, headers: req.headers });
  p.on("upgrade", (r, up, uh) => {
    // Replay the target's 101 response verbatim, then pipe both directions.
    let h = "HTTP/1.1 101 Switching Protocols\r\n";
    for (let i = 0; i < r.rawHeaders.length; i += 2) h += r.rawHeaders[i] + ": " + r.rawHeaders[i + 1] + "\r\n";
    sock.write(h + "\r\n"); if (uh.length) sock.write(uh); if (head.length) up.write(head);
    sock.pipe(up); up.pipe(sock);
    sock.on("error", () => up.destroy()); up.on("error", () => sock.destroy());
  });
  p.on("error", () => sock.destroy());
  p.end();
}).listen(LISTEN_PORT, "0.0.0.0", () =>
  console.log("[proxy] :" + LISTEN_PORT + " -> " + TARGET.host + ":" + TARGET.port)
);

Usage: TARGET_PORT=3000 LISTEN_PORT=8080 node proxy.js, then point the browser at port 8080. The proxy extracts the 48-bit encoded timestamp from messageID, compares it with the server's Date.now(), and delays only when the browser clock is ahead.
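The skew estimate can be sanity-checked in isolation. The encoder below is an assumption, mirrored from what calcDelay decodes (low 48 bits of ms * 0x1000 + counter); calcDelay itself is copied from the proxy above.

```javascript
const MASK48 = (1n << 48n) - 1n;

// Assumed encoding, mirroring what calcDelay decodes: low 48 bits of
// (millisecond timestamp * 0x1000 + counter), rendered as 12 hex chars.
function encodeId(ms, counter = 1n) {
  const enc = (BigInt(ms) * 0x1000n + counter) & MASK48;
  return "msg_" + enc.toString(16).padStart(12, "0");
}

// calcDelay, as in the proxy above.
function calcDelay(messageID) {
  if (!messageID?.startsWith("msg_") || messageID.length < 16) return 0;
  const hex = messageID.slice(4, 16);
  if (!/^[0-9a-f]{12}$/i.test(hex)) return 0;
  try {
    const idEnc = BigInt("0x" + hex);
    const nowEnc = (BigInt(Date.now()) * 0x1000n + 1n) & MASK48;
    return idEnc > nowEnc ? Math.min(Number((idEnc - nowEnc) / 0x1000n), 5000) : 0;
  } catch { return 0; }
}

// A browser clock 3 s ahead yields a ~3000 ms delay; a clock behind yields 0.
console.log(calcDelay(encodeId(Date.now() + 3000))); // ~3000
console.log(calcDelay(encodeId(Date.now() - 1000))); // 0
```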

Plugins

none

OpenCode version

1.3.0

Steps to reproduce

  1. Deploy opencode web on a remote server (e.g. Kubernetes container)
  2. Open the Web UI from a browser on a different machine
  3. Select any Anthropic model (e.g. claude-opus-4-6)
  4. Send any message
  5. Observe: step 0 completes with finish=stop, but the loop continues to step 1 which fails with prefill error (or produces a blank response for prefill-capable models)

The issue does NOT reproduce when browser and server share the same machine (local opencode web), because they share the same hardware clock.

Screenshot and/or share link

N/A: server-side bug, visible in --log-level INFO logs as step=1 firing after finish=stop on step 0

Operating System

Linux (Kubernetes, Debian bookworm container)

Terminal

N/A (Web UI)

Metadata

Labels

core: Anything pertaining to core functionality of the application (opencode server stuff)