Version
24.0.2
Platform
Darwin Toms-MacBook-Air.local 24.5.0 Darwin Kernel Version 24.5.0: Tue Apr 22 19:54:26 PDT 2025; root:xnu-11417.121.6~2/RELEASE_ARM64_T8112 arm64 arm Darwin
Subsystem
No response
What steps will reproduce the bug?
- Create a new project with `npx create-mastra@latest` and select the `tools` and default `examples`.
- Run `npm run dev`.
- Go to `localhost:4111` and click on Agents -> `Weather Agent`.
- In the chat, click on `Model Settings` and make sure `stream` is set to on.
- Run the `Weather Agent` and ask for the weather in New York.
- See that the response arrives in a single block instead of streaming (a script for hitting the stream endpoint directly is sketched below this list).
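To take the playground UI out of the equation, the same endpoint can be called directly from a small script. This is only a sketch: the agent id (`weatherAgent`) and the request body shape are assumed from the default example and from the client handler shown further down.

```ts
// check-stream.ts — hypothetical helper, not part of the project
const res = await fetch('http://localhost:4111/api/agents/weatherAgent/stream', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'What is the weather in New York?' }],
  }),
});
if (!res.ok || !res.body) throw new Error(`Request failed with status ${res.status}`);

const reader = res.body.getReader();
const decoder = new TextDecoder();
const start = Date.now();
// When streaming works, each chunk logs with its own arrival time;
// if the body is buffered, everything prints at once at the end.
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  console.log(`+${Date.now() - start}ms`, decoder.decode(value, { stream: true }));
}
```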
How often does it reproduce? Is there a required condition?
100% reproducible.
Switching to Node 23 resolves the issue. One nuance: the Node versions in use are installed through devbox (Nix).
What is the expected behavior? Why is that the expected behavior?
The expected behavior is that when a POST request is made to an endpoint that returns partial chunks, the response should be streamed to the client as each chunk arrives, rather than waiting for all chunks to finish before anything is sent.
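To help isolate whether this is `fetch` in Node 24 buffering the response body rather than anything Mastra-specific, here is a minimal, self-contained sketch (port and file name are illustrative): a local server writes a chunk every 250 ms, and the client logs when each chunk is observed. If streaming works, the timestamps are spaced roughly 250 ms apart; if the body is buffered, they all print together.

```ts
// stream-isolation.ts — minimal sketch, independent of Mastra
import { createServer } from 'node:http';

const server = createServer(async (_req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  for (let i = 0; i < 5; i++) {
    res.write(`chunk ${i}\n`); // sent as a chunked-transfer-encoding chunk
    await new Promise(resolve => setTimeout(resolve, 250));
  }
  res.end();
});

server.listen(4112, async () => {
  const response = await fetch('http://localhost:4112/');
  const start = Date.now();
  // Web ReadableStreams are async iterable in Node, so each chunk can be timed as it arrives.
  for await (const chunk of response.body!) {
    console.log(`+${Date.now() - start}ms`, new TextDecoder().decode(chunk).trim());
  }
  server.close();
});
```

The client-side handler that consumes the agent's stream in the app is the following: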
```ts
try {
  const response = await fetch('/api/agents/' + agentId + '/stream', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      messages: [userMessage],
      runId: agentId,
      ...(memory ? { threadId, resourceid: agentId } : {}),
    }),
  });
  if (!response.body) {
    throw new Error('No response body');
  }
  if (response.status !== 200) {
    const error = await response.json();
    throw new Error(error.message);
  }
  mutate(`/api/memory/threads?resourceid=${agentId}&agentId=${agentId}`);
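  // Read the body incrementally with a streaming reader; each read() should resolve as soon as a chunk arrives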
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let assistantMessage = '';
  let errorMessage = '';
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      const chunk = decoder.decode(value);
      buffer += chunk;
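      // Pull streamed parts out of the buffer: 0:"..." parts carry assistant text, 3:"..." parts carry error text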
      const matches = buffer.matchAll(/0:"((?:\\.|(?!").)*?)"/g);
      const errorMatches = buffer.matchAll(/3:"((?:\\.|(?!").)*?)"/g);
      if (errorMatches) {
        for (const match of errorMatches) {
          const content = match[1];
          errorMessage += content;
          setMessages(prev => [
            ...prev.slice(0, -1),
            { ...prev[prev.length - 1], content: errorMessage, isError: true },
          ]);
        }
      }
      for (const match of matches) {
        const content = match[1].replace(/\\"/g, '"');
        assistantMessage += content;
        setMessages(prev => [...prev.slice(0, -1), { ...prev[prev.length - 1], content: assistantMessage }]);
      }
      buffer = '';
    }
  } catch (error: any) {
    throw new Error(error.message);
  } finally {
    reader.releaseLock();
  }
} catch (error: any) {
  setMessages(prev => [
    ...prev.slice(0, -1),
    {
      ...prev[prev.length - 1],
      content: error?.message || `An error occurred while processing your request.`,
      isError: true,
    },
  ]);
} finally {
  setIsLoading(false);
}
};
```

What do you see instead?
Screen.Recording.2025-05-25.at.12.26.01.mov
Additional information
For comparison, this is the behavior on Node 23:
Screen.Recording.2025-05-25.at.14.53.40.mov