Streaming APIs #31
Here is a WIP spec for FetchObserver. It lists some states that would help with providing streaming support. Atm, here is what I am thinking:
Things to verify:
My prediction is that for cached requests, the memory footprint is ×1. For "active" requests (requests that are cached and that you are using), the footprint is ×n, where n+1 is the number of places you are using the data. There may be a way to make the data that has been read once available everywhere, too, without wrapping methods. That would make the footprint ×1 and give all of these features.
It's been a while, but I figured I'd post an update with a test that I ran: Chrome hides the downloaded resource from the JS heap size until it is read, so it is going to be pretty challenging to determine the memory consequences of cloning a response. From what I can gather, the spec doesn't say anything about where the data has to be stored, so some browsers could do this efficiently and others could do it inefficiently.
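For reference, the cloning behavior under discussion can be sketched like this (assuming Node 18+ or a browser, where `Response` is a global). Cloning gives two independently readable bodies, which is exactly why the platform may have to buffer data until the slower consumer catches up:

```javascript
// Sketch: clone a Response so two consumers can read the same body.
// The string body here is a stand-in for a real fetched payload.
const original = new Response("some payload");
const copy = original.clone();

// Each copy can be read to completion independently.
const a = await original.text();
const b = await copy.text();

// After reading, the original body is marked as consumed.
console.log(a === b, original.bodyUsed);
```

Until `copy` is actually read, the platform has to hold its share of the data somewhere, which is the part the spec leaves up to the implementation.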
Download size isn't the only concern here. Perhaps there's a way to save the results of these operations into the cache... is that getting too complicated?
Gonna close and stick with the current behavior. The only solution I can think of atm is to wrap the Response object, but that adds a considerable amount of complexity to this lib that I'm not convinced is worth it atm.
I think that the only way to support streaming APIs is to clone the responses rather than trying to copy the stream over. Atm the `responseType` methods read the stream to completion before returning it, which isn't ideal. This is an unfortunate situation. ☹️
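To illustrate the difference (a sketch, not this library's API; the chunked stream below is a hypothetical stand-in for a network body): methods like `.text()` and `.json()` buffer the whole body before resolving, whereas reading `response.body` directly yields data incrementally:

```javascript
// Build a hypothetical chunked body to stand in for a network response.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("chunk1-"));
    controller.enqueue(new TextEncoder().encode("chunk2"));
    controller.close();
  },
});
const response = new Response(stream);

// Streaming consumer: pulls chunks as they become available instead of
// waiting for the full body, unlike response.text()/response.json().
const reader = response.body.getReader();
const decoder = new TextDecoder();
const parts = [];
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  parts.push(decoder.decode(value, { stream: true }));
}
const assembled = parts.join("");
```

The `responseType` methods in effect do the loop above internally and only hand back `assembled`, which is why they can't expose streaming progress.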
Update: One idea would be to tee the stream and send that out to each consumer, but it doesn't resolve the `responseType` problem.
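The tee idea might look like the following sketch (not this library's API). `ReadableStream.prototype.tee()` splits a body into two independent branches, but to get `.text()`/`.json()` back for each branch you end up wrapping it in a fresh `Response` anyway, which is the wrapping complexity mentioned above:

```javascript
const response = new Response("payload for two consumers");

// tee() splits the body into two branches that can be read independently.
const [branchA, branchB] = response.body.tee();

// Each branch must be re-wrapped in a Response to regain the
// responseType methods — this is where the complexity creeps back in.
const textA = await new Response(branchA).text();
const textB = await new Response(branchB).text();
```

Note that a slow consumer on one branch forces the platform to buffer the other branch's unread data, so teeing doesn't improve the memory story either.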