
fix(ai-chat): don't crash when useAgent()'s HTTP URL isn't ready yet#1358

Merged
threepointone merged 3 commits into main from fix/ai-chat-pending-http-url
Apr 23, 2026
Conversation

threepointone (Contributor) commented Apr 22, 2026

Summary

Closes #1356.

useAgentChat() called new URL(agent.getHttpUrl()) unconditionally during render. useAgent() builds getHttpUrl() from PartySocket internals that can legitimately still be "" before the WebSocket connects, which happens most often when the agent is behind a proxy or reached via basePath/custom routing. That threw TypeError: Failed to construct 'URL': Invalid URL and crashed the component on first render, well before the handshake completed.

This PR treats "HTTP URL not ready yet" as a first-class state in useAgentChat():

  • Guards new URL(...) so an empty result just yields null.
  • Defers the built-in /get-messages fetch until the URL is available, and the default fetcher returns [] if called with an empty URL.
  • Keeps custom getInitialMessages callbacks runnable before the URL exists. GetInitialMessagesOptions.url is now string | undefined; callers that previously typed url: string should widen to url?: string.
  • Stabilizes the useChat id across the URL-arrival transition so the AI SDK doesn't recreate the underlying Chat (and abandon in-flight resume) just because the URL materialized on render 2.
  • Seeds initial messages exactly once when the load promise resolves after mount (only if the chat is empty), and marks the current cache key as seeded from clearHistory(), CF_AGENT_CHAT_CLEAR, and explicit setMessages([]) so user-driven clears aren't re-hydrated.
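The first two bullets can be sketched as a pair of pure helpers. These names are illustrative, not the actual identifiers in @cloudflare/ai-chat; the point is that an empty or malformed URL becomes a deferral signal rather than a thrown TypeError:

```typescript
// Guard around the previously unconditional `new URL(...)` call.
// getHttpUrl() can legitimately be "" before the WebSocket connects.
function safeParseHttpUrl(raw: string | undefined): URL | null {
  if (!raw) return null; // empty: URL simply isn't ready yet
  try {
    return new URL(raw);
  } catch {
    return null; // malformed: also treated as "not ready"
  }
}

// The built-in /get-messages fetch only proceeds once a URL exists;
// until then the default loader resolves to an empty history.
function shouldFetchInitialMessages(raw: string | undefined): boolean {
  return safeParseHttpUrl(raw) !== null;
}
```

On the next render after the handshake, the guard returns a real URL and the deferred fetch proceeds normally.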

No API break. Apps where getHttpUrl() was synchronously available on first render are unchanged.

Test plan

  • Added "should wait for a valid HTTP URL before fetching initial messages" — starts with an empty URL, verifies no fetch() happens, then makes the URL available and confirms /get-messages is called exactly once with the normalized HTTP URL.
  • Added "should allow custom initial message loaders before the HTTP URL is ready" — verifies a user-provided getInitialMessages runs with url: undefined and seeds the chat.
  • npm run test:react in packages/ai-chat — 47/47 passing.
  • npm run check at the repo root — formatting, lints, exports, and all 75 typecheck projects pass.
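The behavior the first test asserts can be simulated outside the real vitest harness with a minimal sketch; maybeFetchInitialMessages and the endpoint shape are hypothetical stand-ins for the hook's internals:

```typescript
// Records every request the sketch would make, so "no fetch before the URL
// exists" and "exactly once afterwards" are both observable.
const requested: string[] = [];

function maybeFetchInitialMessages(httpUrl: string | undefined): void {
  if (!httpUrl) return; // URL not ready: defer, don't crash
  const endpoint = new URL(httpUrl);
  // Normalize to the /get-messages path regardless of trailing slash.
  endpoint.pathname = endpoint.pathname.replace(/\/?$/, "/get-messages");
  if (requested.includes(endpoint.toString())) return; // dedup: once only
  requested.push(endpoint.toString());
}

maybeFetchInitialMessages(undefined);                         // render 1: no URL yet
maybeFetchInitialMessages("https://example.com/agents/chat"); // render 2: URL arrived
maybeFetchInitialMessages("https://example.com/agents/chat"); // render 3: no re-fetch
```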

Changeset

A patch-level changeset for @cloudflare/ai-chat is included at .changeset/ai-chat-pending-http-url.md.

Made with Cursor



`useAgentChat()` called `new URL(agent.getHttpUrl())` unconditionally,
which threw on first render whenever `useAgent()` hadn't populated its
WebSocket URL yet — most commonly behind a proxy or with custom-routed
workers. See #1356.

Guards the URL parse, defers the built-in `/get-messages` fetch until
the socket URL is known, keeps custom `getInitialMessages` loaders
working with `url?: string`, stabilizes the underlying `useChat` `id`
across the URL-arrival transition, and seeds messages exactly once
when the URL becomes available — without clobbering user-driven clears.

Made-with: Cursor

changeset-bot Bot commented Apr 22, 2026

🦋 Changeset detected

Latest commit: 37e0093

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  @cloudflare/ai-chat (patch)



pkg-pr-new Bot commented Apr 22, 2026


agents

npm i https://pkg.pr.new/agents@1358

@cloudflare/ai-chat

npm i https://pkg.pr.new/@cloudflare/ai-chat@1358

@cloudflare/codemode

npm i https://pkg.pr.new/@cloudflare/codemode@1358

hono-agents

npm i https://pkg.pr.new/hono-agents@1358

@cloudflare/shell

npm i https://pkg.pr.new/@cloudflare/shell@1358

@cloudflare/think

npm i https://pkg.pr.new/@cloudflare/think@1358

@cloudflare/voice

npm i https://pkg.pr.new/@cloudflare/voice@1358

@cloudflare/worker-bundler

npm i https://pkg.pr.new/@cloudflare/worker-bundler@1358

commit: 37e0093

devin-ai-integration[bot]

This comment was marked as resolved.

…transition

The request-dedup cache was keyed by origin+pathname+identity, so when the
WebSocket URL transitioned from empty to resolved on the second render,
`doGetInitialMessages` missed the cache, called the custom loader a second
time, and `use(newPromise)` re-triggered Suspense — the user saw a loading
fallback flash even though messages were already displayed.

Cache by agent identity only. The URL-aware key survives as
`resolvedInitialMessagesCacheKey` for the `stableChatIdRef` logic, which
still needs to distinguish "URL arrived" from "identity changed."

Regression test locked in: `should invoke custom getInitialMessages only
once across the HTTP URL transition`.

Made-with: Cursor
threepointone (Contributor, Author) commented:

Good catch — confirmed the reproduction: with a custom getInitialMessages, the first render cached the promise under the identity-only key (URL was empty), and the second render (URL resolved) built a URL-prefixed cache key, missed the cache, re-invoked the loader, and use(newPromise) re-triggered Suspense after the chat was already populated.

Fixed in 5f6e794 by caching initial messages by agent identity only. The URL-aware key still exists as resolvedInitialMessagesCacheKey for the stableChatIdRef upgrade logic (which separately needs to tell "URL arrived" apart from "identity changed"), but requestCache and the late-seed guard both key on identity.

Added a regression test — should invoke custom getInitialMessages only once across the HTTP URL transition — that fails on the previous commit and passes on the fix. Changeset description is also updated to call out the identity-only caching.

devin-ai-integration[bot]

This comment was marked as resolved.

…ady has the messages

When the HTTP URL was available on the very first render, `useChat`
seeded its Chat with `initialMessages` directly, so `chatMessages.length > 0`
by the time the late-seed effect first ran. The effect short-circuited
without calling `markInitialMessagesSeeded()`, leaving the ref at `null`
forever — which meant any subsequent path that emptied the chat without
going through the wrapper (most notably a server-originated
`CF_AGENT_CHAT_MESSAGES` broadcast with `[]`, e.g. another tab calling
`setMessages([])`) would trip all three guards and re-hydrate the stale
initial messages on top of the clear.

Mark the key seeded on every observation where the chat is in a settled
state for this identity, not just when we actively inject messages.

Regression test: `should not re-hydrate initial messages when a server
broadcast empties the chat`.

Made-with: Cursor
threepointone (Contributor, Author) commented:

Valid too — confirmed the re-hydration race with a failing test before the fix.

Reproduction: getHttpUrl() returns a valid URL on first render → useChat({ messages: initialMessages }) seeds the Chat directly → chatMessages.length > 0 by the time the late-seed effect first runs → the effect early-returned without calling markInitialMessagesSeeded(), so the seeded-key ref stayed null forever. Any later path that emptied chatMessages without going through the wrapper (notably the CF_AGENT_CHAT_MESSAGES server broadcast handler, which does setMessages(preserveProtectedStreamingAssistant(data.messages))) tripped all three guards and re-hydrated the stale initial messages.

Fixed in 37e0093 by marking the cache key seeded on every settled observation, not just when we actively inject messages. The chat-is-already-populated branch now marks and returns.

Regression test: should not re-hydrate initial messages when a server broadcast empties the chat. It fails on 5f6e794 and passes on 37e0093.

threepointone (Contributor, Author) commented:

Good observation, and the conclusion is correct. Two structural guarantees back it up beyond "works in practice":

  1. React Suspense blocks effect commit. If use(initialMessagesPromise) suspends on first render, the component never commits, so neither the late-seed effect nor the onAgentMessage listener-attach effect runs. The listener can't see a broadcast before initial messages settle.

  2. Top-down effect declaration order. Once the component does commit, React runs effects in top-down declaration order. The late-seed effect is declared at react.tsx:984; the onAgentMessage listener effect is declared at react.tsx:1410. So late-seed always marks the cache key seeded before the message listener is attached. Any subsequent CF_AGENT_CHAT_MESSAGES broadcast goes through the already-set guard.

So the ordering dependency isn't a timing race — it's enforced by React.

One edge case that isn't protected (not flagged in the comment, but worth being honest about): two useAgentChat hooks mounted concurrently against the same agent identity, e.g. a chat widget rendered alongside a fullscreen view. useChat({ id }) shares the Chat instance across hooks with identical ids, so if hook A clears and hook B mounts afterwards, hook B sees chatMessages.length === 0 with its own fresh seededInitialMessagesKeyRef === null and will run the seed branch, undoing the clear. The seeded ref is per-hook, not per-identity.

Resolving that would require lifting the "seeded" record to module-level state keyed by identity. Given how niche the co-mounted-duplicate-hook pattern is, and that it interacts with the already-module-level requestCache in subtle ways (re-mounting should arguably re-fetch), I'd prefer to land the three fixes in this PR and address the multi-instance case separately if it surfaces as a real report.
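If the multi-instance case does surface, the lifted record discussed above might look like this minimal sketch. It is explicitly not part of this PR, and seededByIdentity and shouldSeed are hypothetical names:

```typescript
// Module-level record keyed by agent identity, so co-mounted hooks that
// share one Chat instance (same useChat id) also share clear/seed state.
const seededByIdentity = new Set<string>();

function shouldSeed(identityKey: string, chatIsEmpty: boolean): boolean {
  // A second hook for the same identity sees the mark and never re-seeds,
  // so it honors a clear performed by the first hook.
  if (seededByIdentity.has(identityKey)) return false;
  seededByIdentity.add(identityKey);
  return chatIsEmpty;
}
```

As the comment notes, this would still need to reconcile with the module-level requestCache on re-mount, which is why it is deferred.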

threepointone merged commit ea229b1 into main Apr 23, 2026 (3 checks passed)
threepointone deleted the fix/ai-chat-pending-http-url branch April 23, 2026 02:55
github-actions (bot) mentioned this pull request Apr 23, 2026

Linked issue: useAgent getHttpUrl() returns empty string before WebSocket connects, causing ai-chat to crash