Support stream response in browser #479
Comments
I was more hoping for a Node.js readable stream, but I guess a WHATWG stream would be better than nothing :) I ended up using https://github.com/kumavis/xhr-stream, but yeah, it's a hacky implementation.
Could you please elaborate on the problem you are solving?
I have a long-lived HTTP response that emits newline-delimited JSON objects, and I want to start displaying the results in the browser without waiting for the response to finish.
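For that use case, the raw chunks still have to be re-framed into complete JSON lines, because chunk boundaries rarely align with newlines. A minimal sketch of an NDJSON splitter (the function and its name are mine, not from axios):

```javascript
// Hypothetical sketch: accumulate text chunks and emit each complete
// newline-delimited JSON object as soon as its terminating newline arrives.
function makeNdjsonParser(onObject) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // last element is an incomplete line (or '')
    for (const line of lines) {
      if (line.trim() !== '') onObject(JSON.parse(line));
    }
  };
}
```

Feeding it `'{"a":1}\n{"b"'` and then `':2}\n'` emits `{a: 1}` followed by `{b: 2}`, regardless of where the network split the data.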
Why was this closed? XHR actually does support chunked reading, through readyState 3 rather than 4.
You closed #505 as a duplicate of this issue, and then closed this issue. Neither has been resolved. This issue should be reopened.
Any news about this?
What about react-native?
Any news on this?
Any progress on this?
Would anybody provide a workaround for it? I think this issue is still open because no native stream is supported in the browser. If we do not want to implement socket.io, maybe SSE is the only alternative left.
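If SSE is the fallback, the wire format is simple enough to handle by hand: events are blocks separated by blank lines, carrying `data:` fields. A sketch of a frame parser (my own illustration; in a browser you would normally just use EventSource instead):

```javascript
// Hypothetical sketch: split an SSE text buffer into events.
// Each event is terminated by a blank line; only `data:` fields are extracted.
function parseSseEvents(text) {
  return text
    .split('\n\n')                 // events are separated by a blank line
    .filter(block => block.trim() !== '')
    .map(block =>
      block
        .split('\n')
        .filter(line => line.startsWith('data:'))
        .map(line => line.slice(5).trimStart())
        .join('\n')                // multi-line data fields are joined
    );
}
```

For example, `parseSseEvents('data: one\n\ndata: two\n\n')` yields `['one', 'two']`.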
@gwh-cpnet

```ts
export const ie11XHRStreamHandler = async (url: string, callback: any) => {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.responseType = 'stream' as any; // IE11-specific; prevents msCaching
  let seenChars = 0;
  const p = new Promise<void>((resolve, reject) => {
    xhr.onreadystatechange = () => {
      if (xhr.readyState === xhr.LOADING) {
        // Hand only the newly arrived characters to the callback
        callback(xhr.responseText.substr(seenChars));
        seenChars = xhr.responseText.length;
      } else if (xhr.readyState === xhr.DONE) {
        if (xhr.status === 200) {
          resolve();
        } else {
          reject(`XHR failed with status ${xhr.status}: ${xhr.statusText}`);
        }
      }
    };
  });
  xhr.send();
  return p;
};
```
Ended up using https://github.com/eBay/jsonpipe; maybe someone finds this useful.
Try this: parse the text to JSON after quoting long numbers with a regular expression:

```js
axios.post('url', params, {
  cancelToken: this.source.token,
  responseType: 'blob'
})
.then(response => {
  // With responseType 'blob', the body is a Blob at response.data
  response.data.text().then(text => {
    // Quote long numeric IDs so JSON.parse does not lose precision
    const reg = /:\s*(\d{15,25})(\s*\}|,?)/g
    const textdata = text.replace(reg, ':"$1"$2')
    const res = JSON.parse(textdata)
  })
})
```
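For context on that regex: JSON.parse reads long numeric IDs as IEEE doubles, so anything past Number.MAX_SAFE_INTEGER silently loses precision; quoting the digits first keeps them as strings. A quick check of the idea (with a `g` flag added so every long number is handled, which a single-match regex would not do):

```javascript
// The regex from the comment above, made global so all long IDs are quoted.
const reg = /:\s*(\d{15,25})(\s*\}|,?)/g;
const text = '{"id": 123456789012345678, "n": 1}';
const quoted = text.replace(reg, ':"$1"$2');
// quoted === '{"id":"123456789012345678", "n": 1}'
// JSON.parse(quoted).id now keeps every digit, as a string.
```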
So what is the right type of response data when responseType is set to 'stream'?
Any update on this?
Using The advantage of
@fengerzh The fetch function can achieve this. Here is an example: https://gist.github.com/blackbing/22ed6db703727fb050a071ce2911684f#file-fetchstream-js
There is an issue with fetch, however: it doesn't support streaming uploads.
I got the same issue when using Axios with Electron, and I found a solution. With Axios 1.4.0, I configure the adapter.
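Presumably the fix is selecting the Node http adapter, which does exist inside Electron's Node-enabled context. A hedged sketch of the config shape (adapter selection by name is an axios 1.x feature; the URL is a placeholder and the exact values are my assumption, not confirmed in this thread):

```javascript
// Hypothetical config sketch for axios inside Electron.
const streamConfig = {
  url: 'https://example.com/endpoint', // placeholder URL
  method: 'get',
  adapter: 'http',        // pick the Node adapter instead of the browser XHR adapter
  responseType: 'stream', // only the http adapter supports this
};
// axios(streamConfig).then(({ data }) => data.on('data', chunk => { /* ... */ }));
```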
The http adapter is not available in browsers.
[Axios doesn't appear to support streaming in the browser](axios/axios#479), and since we want AI.JSX to be able to run in the browser, that's a no-go for us. [Loom showing streaming in the browser](https://www.loom.com/share/5bf174e2217b433fbf7901ece310774f) We still don't stream the UI demos (e.g. the recipe builder) because filling in UI pieces bit by bit could be worse than buffering.
This issue should be reopened, since axios still doesn't support stream responses. Correct me if I am wrong; I hope to hear your expertise.
BUMP .-.
I couldn't make the other approaches work; here's what I used instead:

```js
axios({
  // ...options,
  onDownloadProgress: (evt) => {
    // Parse the response from evt.event.target.responseText || evt.event.target.response
    // The target holds everything received since the beginning, not just the latest chunk
    // Note: it's evt.target instead of evt.event.target for older axios versions
  }
})
```
Thanks for this @AkdM. But do you know of a way to get only the new data instead of the accumulated response since the beginning?
This is the same approach I took as well for collecting chunked responses from a stream, though the evt param in the onDownloadProgress callback was creating some issues for me in TS. The type that was added is declared as ProgressEvent, so I had to cast evt.target (or evt.event.target) to an XMLHttpRequest before I could access the .response property; otherwise I get a "Property 'response' does not exist on type 'EventTarget'." error.

```ts
onProgress(e: ProgressEvent): void {
  const req = e.target as XMLHttpRequest;
  this.textChunk = req.response;
}

async function streamedResponseHandler(
  cb: (e: ProgressEvent) => void
): Promise<void> {
  axios.get(
    "/endpoint",
    {
      onDownloadProgress: cb,
    }
  );
}
```

This callback approach was good for my use case in Vue, so I could retain reactivity. Then just calling it:

```ts
streamedResponseHandler(onProgress);
```
Interesting, thanks @jonah-butler. I am going to migrate to TS very soon, so that will help me.
Thanks for your example too, @AkdM. That was very helpful for me.
Hi, did you find any built-in elegant solution?
@ZulluBalti I know this isn't a built-in, but in my example based on @AkdM's, slicing the response at the previously seen length gives only the new data:

```ts
onProgress(e: ProgressEvent): void {
  const req = e.target as XMLHttpRequest;
  console.log(req.response.slice(this.textChunk.length)); // <---- only new data from each event
  this.textChunk = req.response;
}
```
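The slicing trick above generalizes to a tiny helper that tracks how much of the accumulated response has already been seen (a sketch; the names are mine):

```javascript
// Hypothetical sketch: given the full accumulated response on each progress
// event, return only the portion that arrived since the previous call.
function makeDeltaReader() {
  let seen = 0;
  return function delta(full) {
    const fresh = full.slice(seen);
    seen = full.length;
    return fresh;
  };
}
```

For example, `delta('ab')` returns `'ab'`, and a later `delta('abcd')` returns only `'cd'`.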
Hi, I'm facing one problem with this approach: Axios buffers the chunks and returns multiple chunks at once. The backend emits the chunks one at a time, and I was expecting to receive them one by one, but I'm getting several merged together.
I guess this reply is over 9 years old. ^ XHR + streaming text:

```js
var xhr = new XMLHttpRequest();
xhr.open('GET', '/endpoint', true);
xhr.timeout = 10000;
xhr.prevLen = 0; // track how much of responseText has been consumed so far
xhr.ontimeout = function() {
  console.error('Request timed out!');
};
xhr.onprogress = function() {
  var responseText = xhr.responseText;
  var chunk = responseText.slice(xhr.prevLen);
  xhr.prevLen = responseText.length;
  console.log(chunk);
};
xhr.onload = function() {
  console.log('Done!');
};
xhr.send();
```

Bonus, using fetch:

```js
const response = await fetch("/endpoint", {
  signal: AbortSignal.timeout(10000),
});
const reader = response.body.getReader();
const decoder = new TextDecoder(); // one decoder, reused across chunks
while (true) {
  const { done, value } = await reader.read();
  if (done) {
    console.log("Stream complete");
    break;
  }
  console.log(decoder.decode(value, { stream: true }));
}
```

Axios:
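One subtlety when decoding chunks as they arrive: a multi-byte UTF-8 character can be split across two chunks, and calling `new TextDecoder().decode(value)` per chunk will mangle it. Passing `{ stream: true }` to a single reused decoder carries the partial bytes over. A small demonstration (runs in Node 18+ as well as browsers):

```javascript
const bytes = new TextEncoder().encode('héllo'); // 'é' is two bytes in UTF-8
const first = bytes.slice(0, 2);  // ends mid-character
const rest = bytes.slice(2);

// Naive per-chunk decoding produces U+FFFD replacement characters:
const naive = new TextDecoder().decode(first) + new TextDecoder().decode(rest);

// A single streaming decoder holds the partial byte until the rest arrives:
const decoder = new TextDecoder();
const streamed = decoder.decode(first, { stream: true }) + decoder.decode(rest);
// streamed === 'héllo', while naive contains replacement characters
```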
Next.js 14 endpoint:

```ts
// https://developer.mozilla.org/docs/Web/API/ReadableStream#convert_async_iterator_to_stream
function iteratorToStream(iterator: any) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next()
      if (done) {
        controller.close()
      } else {
        controller.enqueue(value)
      }
    },
  })
}

function sleep(time: number) {
  return new Promise((resolve) => {
    setTimeout(resolve, time)
  })
}

const encoder = new TextEncoder()

async function* makeIterator() {
  yield encoder.encode('<p>One</p>')
  await sleep(700)
  yield encoder.encode('<p>Two</p>')
  await sleep(700)
  yield encoder.encode('<p>Three</p>')
}

// Edge runtime handlers take a Request and return a Response
export default async function handler(req: Request) {
  const iterator = makeIterator()
  const stream = iteratorToStream(iterator)
  return new Response(stream)
}

export const runtime = 'edge'
```
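The iterator-to-stream helper isn't Next.js-specific; ReadableStream is global in Node 18+, so the same pattern can be exercised directly (a sketch using plain string chunks instead of encoded HTML):

```javascript
// Same shape as the Next.js helper, with strings for clarity.
function iteratorToStream(iterator) {
  return new ReadableStream({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
  });
}

async function* makeIterator() {
  yield 'One, ';
  yield 'Two, ';
  yield 'Three';
}

// Drain the stream back into a single string.
async function readAll(stream) {
  const reader = stream.getReader();
  let out = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += value;
  }
}
// readAll(iteratorToStream(makeIterator())) resolves to 'One, Two, Three'
```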
Why was this closed?
Would the maintainer kindly hand maintenance over to someone willing to actually maintain the package? Fetch is garbage. The API is inferior, and it doesn't support streaming uploads at all.
fetch solved my problem. Because it needs to be compatible with axios interceptors, I had to implement an axios-like function with fetch; maybe I should have used fetch from the beginning.
This should be reopened LOL. Shocking.
@brunolm I don't think the fetch version works. It waits for the entire response before starting to print.
Did you set this?
Maybe it's your Next version, or whatever backend you're using, that needs some config.
Very shocking.
Would be useful if responseType could be set to stream in the browser. Right now, it only works in Node.