Closed
Description
I tried many settings and concluded that my issue is similar to #5493. I originally raised assistant-ui/assistant-ui#1901, but the problem turned out to come from the Vercel AI SDK.
Strangely, if that particular tool is executed on the front end, streaming blocks, and the trailing text chunk is only delivered after the tool result resolves.
So, I have this backend API where `experimental_continueSteps: true`:
```ts
return streamText({
  model,
  toolCallStreaming: true, // <-- enabled
  tools: allTools,
  messages: coreMessages,
  abortSignal: req.signal,
  maxSteps: 10,
  experimental_continueSteps: true, // <-- this is causing the issue
}).toDataStreamResponse()
```
The responses consistently look like this. Notice the last chunk at the end of the stream.
```
f:{"messageId":"msg-ufeQqIOMAnmDfh6MKjUXrJ2G"}
0:"I'll help you "
0:"fix this nasty bug. Let's start"
0:"by checking the current project "
b:{"toolCallId":"toolu_01Wz9D4DPgVfhZ2JwQFWbJja","toolName":"read_files"}
c:{"toolCallId":"toolu_01Wz9D4DPgVfhZ2JwQFWbJja","argsTextDelta":""}
[...]
9:{"toolCallId":"toolu_01Wz9D4DPgVfhZ2JwQFWbJja","toolName":"read_files","args":{"path":"src/app/"}}
0:"structure." # <--- This nasty thing.
```
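For context, each line of the data stream protocol has the shape `<prefix>:<json>`, where the prefix identifies the part type (`0:` text delta, `b:`/`c:` tool-call streaming parts, `9:` the finished tool call). A minimal standalone parser, not an SDK API, just a sketch over a shortened version of the stream above (the `[...]` portion omitted), makes the problem easy to see: the final part is a text delta that arrives after the tool-call part.

```typescript
// Hypothetical helper mapping data-stream prefixes to part types.
// This is a sketch for illustration, not part of the AI SDK.
const PART_TYPES: Record<string, string> = {
  f: "start_step",
  "0": "text_delta",
  b: "tool_call_streaming_start",
  c: "tool_call_delta",
  "9": "tool_call",
};

function parsePart(line: string): { type: string; value: unknown } {
  const sep = line.indexOf(":");
  const prefix = line.slice(0, sep);
  return {
    type: PART_TYPES[prefix] ?? `unknown(${prefix})`,
    value: JSON.parse(line.slice(sep + 1)),
  };
}

// Shortened stream from the issue (intermediate deltas omitted):
const stream = [
  'f:{"messageId":"msg-ufeQqIOMAnmDfh6MKjUXrJ2G"}',
  '0:"I\'ll help you "',
  '9:{"toolCallId":"toolu_01Wz9D4DPgVfhZ2JwQFWbJja","toolName":"read_files","args":{"path":"src/app/"}}',
  '0:"structure."',
];

const parts = stream.map(parsePart);
// The last part is a text delta emitted *after* the tool call:
console.log(parts[parts.length - 1]); // { type: 'text_delta', value: 'structure.' }
```

A client that renders tool calls as soon as their part closes has no clean place to put that trailing `"structure."` delta, which is why the sentence appears split around the tool invocation.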
If `experimental_continueSteps` is disabled, it works as expected:
```
f:{"messageId":"msg-2gYayiQBcVJ63seM78koxL2b"}
0:"I'll help"
0:" you solve this bug. Let's start"
0:" by reading the existing files to unders"
0:"tand the context."
b:{"toolCallId":"toolu_019dAv2H51d35MynE7DViaG4","toolName":"read_files"}
c:{"toolCallId":"toolu_019dAv2H51d35MynE7DViaG4","argsTextDelta":""}
[...]
```