-
I see you listed `/pages/api/test.ts`. If so, you need to use Route Handlers. API Routes (from the Pages Router) do not support streaming responses on Vercel.
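For reference, a minimal sketch of what an App Router Route Handler that streams might look like (the `app/api/test/route.ts` path and the handler body are illustrative, not taken from the thread):

```typescript
// Hypothetical Route Handler (App Router), e.g. app/api/test/route.ts.
// Route Handlers return a web-standard Response, so the body can be a
// ReadableStream that Vercel streams to the client chunk by chunk.
export async function POST(req: Request): Promise<Response> {
  const encoder = new TextEncoder()

  const stream = new ReadableStream({
    async start(controller) {
      // Enqueue data as it becomes available; here a single SSE-style frame.
      controller.enqueue(encoder.encode('data: tick\n\n'))
      controller.close()
    },
  })

  return new Response(stream, {
    headers: { 'Content-Type': 'text/event-stream' },
  })
}
```

Pages Router API Routes instead hand you a Node-style `res` object, which is why the streaming behavior differs between the two routers on Vercel.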
-
@drorIvry I can confirm that streaming a periodic "keep alive" message every 15 seconds ended up resolving the (seemingly 35-second) timeout:

```ts
// Assumed imports (not shown in the original snippet): createParser and the
// event types come from the `eventsource-parser` package, StreamingTextResponse
// from the `ai` package; `auth` is this app's own session helper.
import { createParser, type ParsedEvent, type ReconnectInterval } from 'eventsource-parser'
import { StreamingTextResponse } from 'ai'

export async function POST(req: Request): Promise<Response> {
  const json = await req.json()
  ...
  const userId = (await auth())?.user.id
  if (!userId) {
    return new Response('Unauthorized', { status: 401 })
  }

  try {
    const response = await fetch(`.../stream`, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({...})
    })

    if (!response.body || !response.ok) {
      console.error(response)
      return new Response('API error', { status: 500 })
    }

    const encoder = new TextEncoder()
    const decoder = new TextDecoder()
    let completion = ''

    const stream = new ReadableStream({
      async start(controller) {
        let streamEnded = false

        // Enqueue an invisible character every 15 s so Vercel doesn't close
        // the connection while the upstream API is silent.
        async function keepAlive() {
          while (!streamEnded) {
            const space = '\u200B' // Zero Width Space
            const queue = encoder.encode(space)
            controller.enqueue(queue)
            await new Promise(resolve => setTimeout(resolve, 15_000))
          }
        }

        async function onParse(event: ParsedEvent | ReconnectInterval): Promise<void> {
          if (event.type === 'event') {
            const data = event.data
            if (event.event === 'stream_start') {
              console.log(`Stream has started`)
              const output_value = '...'
              const queue = encoder.encode(output_value)
              completion += output_value
              controller.enqueue(queue)
            } else if (event.event === 'stream_end' || data === '[DONE]') {
              console.log(`Stream has ended`)
              streamEnded = true
              const output_value = '...'
              const queue = encoder.encode(output_value)
              completion += output_value
              controller.enqueue(queue)
              await onCompletion(...)
              controller.close()
              return
            } else {
              ...
            }
          }
        }

        keepAlive()

        // The SSE response may be fragmented into multiple chunks; the parser
        // reassembles them and invokes onParse once per complete SSE event.
        const parser = createParser(onParse)
        // https://web.dev/streams/#asynchronous-iteration
        for await (const chunk of response.body as any) {
          parser.feed(decoder.decode(chunk))
        }
      },
    })

    return new StreamingTextResponse(stream)
  } catch (error) {
    console.error(error)
    return new Response('API error', { status: 500 })
  }
}
```

It seems that the Vercel docs aren't clear on this matter:
Specifically, apart from sending a response within 25 seconds (which I currently do), you actually have to keep sending messages periodically to keep the stream alive. It's unclear at what frequency, but every 15 seconds resolves the issue.
I've tested, and you can't enqueue empty messages either: the UTF-8 encoded value has to contain at least one byte, which is why I used a Zero Width Space, so nothing visible appears on the client side.
-
Summary
Hi, I'm trying to run an edge function that has a long (45+ second) wait time during which no data is being streamed.
For some reason the stream is cut after 35 seconds and the response is never streamed.
Here's a simplified example
Given this route
/pages/api/test.ts
the request will hang for 35 seconds and simply finish with status 200. Any idea what I'm missing here?
Is there a better way to do what I'm trying to do?
Example
No response
Steps to Reproduce