
Conversation


@TooTallNate TooTallNate commented Nov 20, 2025

We were always sending the start and finish chunks more often than necessary. This fixes it.


vercel bot commented Nov 20, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| example-nextjs-workflow-turbopack | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| example-nextjs-workflow-webpack | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| example-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-express-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-hono-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-nitro-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-nuxt-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-sveltekit-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workbench-vite-workflow | Ready | Preview | Comment | Nov 20, 2025 10:18pm |
| workflow-docs | Ready | Preview | Comment | Nov 20, 2025 10:18pm |


changeset-bot bot commented Nov 20, 2025

🦋 Changeset detected

Latest commit: 7fda66f

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

| Name | Type |
| --- | --- |
| @workflow/ai | Patch |


@vercel vercel bot left a comment


Additional Suggestion:

The removal of the finish message chunk breaks downstream consumers that depend on this message to determine when a stream has completed. This will cause unnecessary reconnection attempts and incorrect behavior in chat applications.

📝 Patch Details
diff --git a/packages/ai/src/agent/do-stream-step.ts b/packages/ai/src/agent/do-stream-step.ts
index ba04f07..2aa33a5 100644
--- a/packages/ai/src/agent/do-stream-step.ts
+++ b/packages/ai/src/agent/do-stream-step.ts
@@ -73,6 +73,9 @@ export async function doStreamStep(
           controller.enqueue({
             type: 'finish-step',
           });
+          controller.enqueue({
+            type: 'finish',
+          });
         },
         transform: async (part, controller) => {
           const partType = part.type;

Analysis

Missing 'finish' chunk in doStreamStep breaks WorkflowChatTransport stream completion detection

What fails: doStreamStep() in packages/ai/src/agent/do-stream-step.ts removed the {type: 'finish'} chunk from its flush() method, causing downstream consumers like WorkflowChatTransport to incorrectly trigger reconnection logic when the stream completes normally.

How to reproduce:

  1. Create a DurableAgent and call agent.stream() which uses doStreamStep()
  2. Wire the output stream to a backend HTTP response handler that serializes it
  3. Use WorkflowChatTransport on the client to consume the response stream
  4. Expected: Stream completes normally with onChatEnd callback
  5. Actual: Without 'finish' chunk, the gotFinish flag is never set, causing reconnectToStream() to be called and onChatEnd callback never fires

Code evidence:

  • doStreamStep() transforms LLM stream parts to UIMessageChunks via a TransformStream
  • The flush() method at packages/ai/src/agent/do-stream-step.ts:72-76 was modified to only emit 'finish-step'
  • WorkflowChatTransport.sendMessagesIterator() at packages/ai/src/workflow-chat-transport.ts:232-241 explicitly checks for chunk.value.type === 'finish' to set gotFinish = true
  • Without this check passing, line 248 triggers reconnection logic instead of completing
  • Per AI SDK stream protocol documentation, the stream protocol includes a final 'finish' chunk
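The flush-side behavior can be sketched as a minimal TransformStream. This is a simplified illustration, not the actual doStreamStep() implementation: the chunk shapes here are stand-ins for the AI SDK's UIMessageChunk types, and makeStepStream, collect, and demo are hypothetical names.

```typescript
// Minimal sketch of the flush fix, assuming simplified chunk shapes.
type Chunk = { type: string; [key: string]: unknown };

function makeStepStream(): TransformStream<Chunk, Chunk> {
  return new TransformStream<Chunk, Chunk>({
    transform(part, controller) {
      // In this sketch, model output passes through unchanged.
      controller.enqueue(part);
    },
    flush(controller) {
      // Close out the step, then terminate the whole stream.
      controller.enqueue({ type: 'finish-step' });
      // The restored chunk: without it, clients waiting for 'finish'
      // assume the stream was cut off and attempt to reconnect.
      controller.enqueue({ type: 'finish' });
    },
  });
}

async function collect(stream: ReadableStream<Chunk>): Promise<Chunk[]> {
  const out: Chunk[] = [];
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out.push(value);
  }
  return out;
}

async function demo(): Promise<string[]> {
  const source = new ReadableStream<Chunk>({
    start(controller) {
      controller.enqueue({ type: 'text-delta', delta: 'hello' });
      controller.close();
    },
  });
  const chunks = await collect(source.pipeThrough(makeStepStream()));
  return chunks.map((c) => c.type);
}
```

Piping one text chunk through this stream yields the chunk types `['text-delta', 'finish-step', 'finish']`, so the terminal `'finish'` is always the last chunk a consumer sees.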

Verification: a test scenario shows that a stream without the 'finish' chunk triggers two fetch calls (the initial POST plus a reconnection GET), while a stream with the 'finish' chunk triggers only one (the POST).

Fix applied: Restored the controller.enqueue({type: 'finish'}) call to the flush() method in the TransformStream, ensuring the 'finish' chunk is always emitted when the stream ends.
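On the consuming side, the completion check can be sketched like this. It is a simplified stand-in for WorkflowChatTransport.sendMessagesIterator(), not its actual code: consumeStream and chunkStream are hypothetical names, and the reconnect callback represents the real reconnectToStream() path.

```typescript
type Chunk = { type: string };

// Helper to build a finite stream of chunks for illustration.
function chunkStream(chunks: Chunk[]): ReadableStream<Chunk> {
  return new ReadableStream<Chunk>({
    start(controller) {
      for (const c of chunks) controller.enqueue(c);
      controller.close();
    },
  });
}

// Returns true when the stream ended cleanly (a 'finish' chunk was seen),
// false when it ended without one, in which case a reconnect is attempted.
async function consumeStream(
  stream: ReadableStream<Chunk>,
  onChatEnd: () => void,
  reconnect: () => void,
): Promise<boolean> {
  let gotFinish = false;
  const reader = stream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (value.type === 'finish') gotFinish = true;
  }
  if (gotFinish) {
    onChatEnd(); // stream completed normally
  } else {
    reconnect(); // assume the connection dropped mid-stream
  }
  return gotFinish;
}
```

With the 'finish' chunk present, `gotFinish` is set and `onChatEnd` fires; without it, the consumer cannot distinguish a clean end from a dropped connection and falls into the reconnect path, which matches the failure described above.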


@pranaygp pranaygp marked this pull request as ready for review November 20, 2025 22:16
@pranaygp pranaygp merged commit 43a3f79 into main Nov 20, 2025
5 of 15 checks passed
