Expressjs: Allow streamToResponse to send back custom data in response body #1285
Comments
I created my own version of `streamToResponse`. Add this function somewhere in your code:

```typescript
import type { ServerResponse } from 'node:http';
import type { StreamData } from 'ai';

export function streamToResponse(
  res: ReadableStream,
  response: ServerResponse,
  init?: { headers?: Record<string, string>; status?: number },
  data?: StreamData,
) {
  response.writeHead(init?.status || 200, {
    'Content-Type': 'text/plain; charset=utf-8',
    ...init?.headers,
  });

  // If StreamData is provided, merge its frames into the LLM stream.
  let processedStream = res;
  if (data) {
    processedStream = res.pipeThrough(data.stream);
  }

  const reader = processedStream.getReader();
  function read() {
    void reader
      .read()
      .then(({ done, value }: { done: boolean; value?: any }) => {
        if (done) {
          response.end();
          return;
        }
        response.write(value);
        read();
      });
  }
  read();
}
```

Then use this `streamToResponse` instead of the one provided with the AI SDK. I could open a PR for this if it's something the maintainers would agree on?
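The read loop above can be sanity-checked without Express or the `ai` package. The sketch below (all names illustrative) pipes a fake LLM stream through an identity `TransformStream` standing in for `data.stream` — a real `StreamData` would also append its custom data frames before closing — and collects the chunks the way `streamToResponse` writes them to the response:

```typescript
// Fake LLM token stream of two text chunks.
const encoder = new TextEncoder();
const decoder = new TextDecoder();

const llmStream = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(encoder.encode('Hello, '));
    controller.enqueue(encoder.encode('world'));
    controller.close();
  },
});

// Identity transform standing in for StreamData.stream.
const passthrough = new TransformStream<Uint8Array, Uint8Array>();

// Collect chunks the same way streamToResponse writes them.
async function collect(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  let out = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += decoder.decode(value, { stream: true });
  }
}

const body = await collect(llmStream.pipeThrough(passthrough));
console.log(body); // "Hello, world"
```

`ReadableStream`, `TransformStream`, and the text codecs are globals in Node 18+, so this runs as-is in an ESM module.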
Hello, I almost opened a PR similar to @valstu's and @lgrammel's suggestions. My use case is adding some custom data as a "prelude" before streaming the LLM generation; for example, to add an identifier of the agent that is responding. This can be done either by adding a stream or a `prelude` argument that is written to the response before the stream. Below is a simple example of both together, just in case it's of any use to see how we're now using it.

```typescript
import type { ServerResponse } from 'node:http';

/**
 * Adds the ability to not set headers (`{ headers: null }`) and/or write a prelude before streaming.
 */
export function streamToResponse(
  stream: ReadableStream,
  response: ServerResponse,
  {
    headers,
    prelude,
    status,
  }: {
    headers?: Record<string, string> | null;
    prelude?: string;
    status?: number;
  } = {},
) {
  // New check: `{ headers: null }` skips writing headers entirely.
  if (headers !== null) {
    response.writeHead(status || 200, {
      'Content-Type': 'text/plain; charset=utf-8',
      ...headers,
    });
  }

  // Allow writing a "prelude" before the stream; another stream would
  // also be fine but wasn't necessary for us.
  if (prelude) {
    response.write(prelude);
  }

  const reader = stream.getReader();
  function read() {
    reader.read().then(({ done, value }: { done: boolean; value?: any }) => {
      if (done) {
        response.end();
        return;
      }
      response.write(value);
      read();
    });
  }
  read();
}
```
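What the client receives from the prelude variant can be checked with a mock response object, no Express involved. This sketch (names illustrative) inlines the relevant part of the function above — write the prelude first, then the stream chunks — and records everything that would reach the client:

```typescript
// Mock response that only records writes.
const writes: string[] = [];
const mockResponse = {
  write: (chunk: string | Uint8Array) =>
    writes.push(
      typeof chunk === 'string' ? chunk : new TextDecoder().decode(chunk),
    ),
  end: () => {},
};

// Fake LLM stream with a single chunk.
const stream = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('LLM answer...'));
    controller.close();
  },
});

// Prelude first, then the streamed chunks, as in the function above.
mockResponse.write('{"agent":"support-bot"}\n');
const reader = stream.getReader();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  mockResponse.write(value);
}
mockResponse.end();

console.log(writes.join('')); // '{"agent":"support-bot"}\nLLM answer...'
```

Note that the prelude is simply concatenated ahead of the stream in the body, which is why a client-side parser expecting only stream frames could trip over it.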
@MartinCura adding a prelude can cause problems with stream parsing on the client (if you use `useChat` or `useCompletion`). Have you considered using
|
Feature Description
I am using React ^18.2.0 and Express ^4.19.2, with OpenAI as the AI provider. I am proposing a feature that allows the streamToResponse function to send back custom data in the response body. Currently, streamToResponse only supports streaming the LLM response directly from the AI provider to the client, with the ability to modify only the response headers and status.
However, there are use cases where it would be beneficial to include additional custom data in the response body.
This option is available in the similar function `experimental_StreamingReactResponse`, where you can send custom data in the body. However, as far as I know, this doesn't work on Node.js servers and only works on Vercel.
Use Case
It would be incredibly useful to piggyback on the final response and add custom data to the response body sent back to the client. For example, the function could be modified to accept an options param with the following signature, which in turn would be used like so:
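A sketch of what such an options signature and its usage might look like. The shape of the options object — an `init` for headers/status plus a `data` payload appended to the body after the stream ends — is illustrative, not an existing AI SDK API, and `streamToResponseWithData` is a hypothetical name:

```typescript
import type { ServerResponse } from 'node:http';

// Hypothetical options shape for the proposed feature.
interface StreamToResponseOpts {
  init?: { headers?: Record<string, string>; status?: number };
  data?: unknown; // custom JSON sent back alongside the stream
}

export function streamToResponseWithData(
  stream: ReadableStream,
  response: ServerResponse,
  opts: StreamToResponseOpts = {},
) {
  response.writeHead(opts.init?.status ?? 200, {
    'Content-Type': 'text/plain; charset=utf-8',
    ...opts.init?.headers,
  });
  const reader = stream.getReader();
  function read() {
    void reader.read().then(({ done, value }) => {
      if (done) {
        // Append the custom data as a trailing frame before ending.
        if (opts.data !== undefined) response.write(JSON.stringify(opts.data));
        response.end();
        return;
      }
      response.write(value);
      read();
    });
  }
  read();
}

// Hypothetical usage in an Express route:
// streamToResponseWithData(aiStream, res, {
//   init: { status: 200 },
//   data: { conversationId: 'abc123' },
// });
```

One caveat with this approach: the trailing JSON shares the body with the stream text, so the client needs to know where the stream ends and the custom data begins (e.g. via a delimiter or a framed protocol like `StreamData`'s).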
I'm currently struggling to find a way around this, and I really don't want to migrate to Vercel just for this one feature.
Additional context
No response