fix(ai-chat): strip messageId from continuation start chunks (#1229) #1235
threepointone merged 2 commits into main
Conversation
🦋 Changeset detected. Latest commit: e0ab802. The changes in this PR will be included in the next version bump; this PR includes changesets to release 1 package.
Move the stripping of `messageId` for continuation `start` chunks into the server-side broadcast logic so clients reuse the existing assistant message (fixes the duplicate assistant message, #1229). Remove the client-side stripping in the WebSocket transport and add a test verifying that the server strips `messageId`. Also add an SSE stub to the test agent to simulate streams containing `messageId`, update the tests accordingly, and update the changelog entry.
Force-pushed from e0ab802 to 675d02b
Rewrote the approach in this PR; here's what changed and why:

- Moved the fix from client-side to server-side. The original PR stripped `messageId` in the client-side `WebSocketChatTransport`; this version strips it in the server broadcast instead.
- Removed the client-side transport fix and its test.
- Removed the React integration test.
- Rewrote the server-side test to actually exercise the SSE code path.
Summary
When a tool continuation stream starts after a client tool result, the AI SDK emits a `start` chunk with a fresh `messageId`. On the client, `processUIMessageStream` uses this to override the assistant message ID, which causes the AI SDK's `write()` to push a new message instead of replacing the existing one, resulting in a brief duplicate assistant message.

The server already had a `!continuation` guard on its own message-building path (skipping `messageId` for its internal `message.id`), but it was still forwarding the raw `messageId` to clients in the broadcast. This PR strips `messageId` from `start` chunks server-side in `_streamSSEReply`, before both storage and broadcast, so all clients (not just `WebSocketChatTransport`) are protected.

Changes
- `packages/ai-chat/src/index.ts`: In the chunk-rewriting section of `_streamSSEReply`, strip `messageId` from `start` chunks when `continuation` is true. This sits alongside the existing `finishReason → messageMetadata` transform. Updated the comment to document both transforms.
- `packages/ai-chat/src/tests/worker.ts`: Added an `sseWithMessageId` body flag to `TestChatAgent` so it can return an SSE response with a `start` chunk containing `messageId` (simulating real `streamText` output).
- `packages/ai-chat/src/tests/client-tools-continuation.test.ts`: End-to-end test: triggers a tool-result auto-continuation with an SSE response, uses a second connection to passively observe broadcast chunks, and asserts that continuation `start` chunks have `messageId` stripped.
- `.changeset/cool-oranges-drum.md`: Patch changeset for `@cloudflare/ai-chat`.

Closes #1229
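The core transform described above can be sketched as a small pure function. This is an illustrative sketch only: `rewriteChunk`, `StreamChunk`, and the `continuation` parameter are assumed names, not the actual `@cloudflare/ai-chat` internals.

```typescript
// Minimal shape of a UI message stream chunk; the real chunks carry more fields.
type StreamChunk = { type: string; messageId?: string; [key: string]: unknown };

// Hypothetical sketch of the server-side rewrite: on a continuation, drop
// messageId from `start` chunks so clients keep writing into the existing
// assistant message instead of creating a duplicate.
function rewriteChunk(chunk: StreamChunk, continuation: boolean): StreamChunk {
  if (continuation && chunk.type === "start" && "messageId" in chunk) {
    const { messageId: _dropped, ...rest } = chunk;
    return rest;
  }
  return chunk;
}

// A continuation `start` chunk loses its messageId…
const stripped = rewriteChunk({ type: "start", messageId: "msg_abc" }, true);
// …while a first-turn `start` chunk keeps it.
const kept = rewriteChunk({ type: "start", messageId: "msg_abc" }, false);
```

Because the rewrite runs in `_streamSSEReply` before storage and broadcast, every transport sees the stripped chunk, which is why the client-side workaround became unnecessary.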
Testing
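The test agent's SSE stub described in the Changes section can be sketched as follows. The helper names (`sseEvent`, `buildContinuationSSE`) are illustrative assumptions, not the real test code; the point is to produce a stream whose `start` chunk carries a `messageId`, the way real `streamText` output can.

```typescript
// Format one server-sent event line pair ("data: <json>\n\n").
function sseEvent(data: Record<string, unknown>): string {
  return `data: ${JSON.stringify(data)}\n\n`;
}

// Build a fake continuation stream: the `start` chunk deliberately carries a
// messageId, which is exactly what the server is expected to strip before
// broadcasting to clients.
function buildContinuationSSE(messageId: string): string {
  return (
    sseEvent({ type: "start", messageId }) +
    sseEvent({ type: "text-delta", id: "t1", delta: "hello" }) +
    sseEvent({ type: "finish" }) +
    "data: [DONE]\n\n" // end-of-stream sentinel
  );
}

const body = buildContinuationSSE("msg_abc");
```

A second, passive connection can then collect the broadcast chunks and assert that no continuation `start` chunk it observes still contains `messageId`.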