
added streaming support #5

Closed · marconi wants to merge 4 commits into base: master from marconi:streaming-support

Conversation

2 participants
@marconi

marconi commented Mar 2, 2019

No description provided.

marconi added some commits Mar 2, 2019

@marconi marconi force-pushed the marconi:streaming-support branch from d5803f2 to 56fa967 Mar 2, 2019

@mauricedb
Owner

mauricedb left a comment

Hi @marconi,

Cool contribution. I am familiar with streaming but have not actually used the API, so I was wondering about a few things here. Please excuse any dumb comments 😸

Do you know of a public API endpoint I can use to test this?

Maurice

@@ -7,25 +7,29 @@ import {
SetStateAction
} from 'react';

interface RequestInitStreaming extends RequestInit {


@mauricedb

mauricedb Mar 2, 2019

Owner

The streaming option is not used until the response handling. Can't the hook detect a streaming response based on the HTTP response headers? I was under the impression that Transfer-Encoding: chunked together with a content-type that supports streaming would indicate this.

Note: The code now incorrectly assumes that the response is JSON. I was already planning to add a header check for JSON and return rsp.text() if it isn't JSON data.
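The header check being proposed here could be sketched roughly like this (the helper names are hypothetical, not part of the hook, and as noted later in this thread, Transfer-Encoding is usually not readable on cross-origin responses):

```typescript
// Hypothetical sketch of header-based streaming detection.
// Caveat discussed below: Transfer-Encoding is generally not visible to a
// browser fetch() on a CORS response unless the server exposes it.
function looksLikeStreamingResponse(headers: Headers): boolean {
  const chunked = headers.get('transfer-encoding') === 'chunked';
  const contentType = headers.get('content-type') ?? '';
  // Structured formats like JSON/XML need the complete body before parsing,
  // so only treat "open-ended" content types as streamable.
  const streamableType = /^(text\/plain|application\/octet-stream)/.test(contentType);
  return chunked && streamableType;
}

// The JSON check mentioned above: fall back to text for non-JSON data.
function parseBody(rsp: Response): Promise<unknown> {
  const contentType = rsp.headers.get('content-type') ?? '';
  return contentType.includes('application/json') ? rsp.json() : rsp.text();
}
```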


@marconi

marconi Mar 2, 2019

Author

Right, I think this is actually a better way so we don't need the stream option. I'll play around with it to see what other indicators can be used to infer whether a response is streaming or not. And yes, for JSON we can just check the Content-Type.


@mauricedb

mauricedb Mar 2, 2019

Owner

I was searching for some but can't really find them. I would say that structured formats with a clear start and end, like JSON or XML, should not be streamed even with Transfer-Encoding: chunked, as a partial file is invalid. Formats like text, YAML, or binary could be streamed. But I guess even then the client has to know when the last chunk has arrived and the response is complete.


@marconi

marconi Mar 2, 2019

Author

Agreed, streaming is dictated by the server. But even then, there's actually no reliable way to detect it at the moment, as the Transfer-Encoding header isn't even available on a CORS request. See https://developers.google.com/web/updates/2015/03/introduction-to-fetch#response_types.

If you do have access to the server, you can expose the Transfer-Encoding header via https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#Access-Control-Expose-Headers, but that's still unreliable since you're usually talking to a different service and don't have access to it to expose those headers.

We can't even call .text() or .json() just because the Content-Type says so, because the response could be text or JSON but still streaming, in which case .text() or .json() will never resolve. A more reliable way is to just return the body promise and let the caller decide what they want. Then if a streaming flag is passed, we do our chunk reading and return the collected chunks as they get aggregated. Thoughts?
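The "read chunks and hand back the aggregate as it grows" idea could be sketched like this (function names are illustrative, not the PR's actual code):

```typescript
// Illustrative sketch: read every chunk from a response body and invoke
// onChunk with the bytes collected so far after each chunk arrives.
async function aggregateChunks(
  body: ReadableStream<Uint8Array>,
  onChunk: (soFar: Uint8Array) => void
): Promise<Uint8Array> {
  const reader = body.getReader();
  let collected = new Uint8Array(0);
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Append the new chunk to what we have so far.
    const next = new Uint8Array(collected.length + value.length);
    next.set(collected);
    next.set(value, collected.length);
    collected = next;
    onChunk(collected);
  }
  return collected;
}
```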


@mauricedb

mauricedb Mar 2, 2019

Owner

It seems strange if a server allows CORS calls and sets the Transfer-Encoding: chunked header but then doesn't set Access-Control-Expose-Headers for the client to use. But then, strange things happen 😞

I would suggest streaming makes sense if the client can read the Transfer-Encoding: chunked header and the Content-Type makes sense for streaming. If not, use .json() or .text() depending on the Content-Type.


@marconi

marconi Mar 2, 2019

Author

So just to be clear, we can't use Transfer-Encoding and Content-Type together to decide whether to stream or not. First, Transfer-Encoding isn't exposed to us, and second, Content-Type has nothing to do with streaming, since a response could be text/plain but still streaming, so we can't rely on that either.

But we could do:

  1. pass option { streaming: true } - return aggregated chunks as ArrayBuffer
  2. no option - return rsp.body promise and let caller decide to call .json() or .text() or however they want to handle the data
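The two options above might look something like this (a sketch under assumptions, not the PR's implementation; `resolveResponse` is an illustrative name):

```typescript
// Sketch of the proposed two-mode behavior:
// option 1 ({ streaming: true }): aggregate the chunks into an ArrayBuffer;
// option 2 (no option): return the Response so the caller can pick
// .json(), .text(), or the reader (.body) themselves.
async function resolveResponse(
  rsp: Response,
  opts: { streaming?: boolean } = {}
): Promise<ArrayBuffer | Response> {
  if (!opts.streaming || !rsp.body) {
    return rsp; // option 2: caller decides how to consume the body
  }
  // option 1: read every chunk and concatenate into a single buffer
  const reader = rsp.body.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.length;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out.buffer;
}
```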
(Resolved review thread on src/index.ts, now outdated.)
@marconi

Author

marconi commented Mar 2, 2019

> Do you know of a public API endpoint I can use to test this?

Here's a Go server that just infinitely streams chunks, https://gist.github.com/marconi/fbd85020678a0cd9cf973d7bf79e30c6.

marconi added some commits Mar 4, 2019

@marconi

Author

marconi commented Mar 4, 2019

Updated so it now returns the following:

  1. with {streaming: true} -> returns ArrayBuffer of concatenated chunks
  2. without {streaming: true} -> returns Response so the caller can decide to call .json() or .text(), or even use the reader (.body) if they need to.
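On the consumer side, the ArrayBuffer from the {streaming: true} path could be decoded like this (a usage sketch; the hook's actual return shape may differ):

```typescript
// Usage sketch: turn the aggregated ArrayBuffer into UTF-8 text.
// For JSON you still need the complete body before parsing, since a
// partial JSON document will not parse.
function bufferToText(buffer: ArrayBuffer): string {
  return new TextDecoder('utf-8').decode(buffer);
}
```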
@mauricedb

Owner

mauricedb commented Mar 5, 2019

Sorry for the delay, a really busy week for me.

Not really happy with this approach. By exposing the response object, the developer using this needs to call response.json() in most cases, and as that returns a promise it is not a simple call you can make when rendering a React component. It also means not being able to return a typed response anymore, another thing I don't like.

I have not found a public-facing server that uses chunked encoding and serves data formats other than JSON. With JSON data you need to wait for the complete result in order to parse it, so the use cases seem very limited.

With a simple test server, chunking with a text/plain MIME type works well enough, and if I add the Access-Control-Expose-Headers: * header the fetch client can read the Transfer-Encoding: chunked value just fine, decide on the appropriate action, and pass on the individual chunks.
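A test server along these lines can be sketched in Node (the handler and port choice here are assumptions, not the actual test setup; note that CORS header visibility only matters for browser clients):

```typescript
import { createServer } from 'node:http';

// Sketch of a chunked text/plain test server. Omitting Content-Length makes
// Node send Transfer-Encoding: chunked, and Access-Control-Expose-Headers: *
// lets a browser fetch() read that header on a cross-origin call.
const server = createServer((_req, res) => {
  res.setHeader('Content-Type', 'text/plain');
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Access-Control-Expose-Headers', '*');
  let n = 0;
  const timer = setInterval(() => {
    res.write(`chunk ${n++}\n`); // each write goes out as its own chunk
    if (n === 3) {
      clearInterval(timer);
      res.end();
    }
  }, 50);
});

server.listen(0); // pick any free port
server.unref(); // don't keep the process alive just for this sketch
```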

So to summarize, I am not going to merge this PR.

While I like the idea of supporting chunked transfers, there is a very limited use case and I am not willing to sacrifice the simple API for the main use case. I will accept a PR that adds chunking if the main API stays simple and it only exposes the Int8Array with multiple updates when the server indicates that it is chunking using Transfer-Encoding: chunked and the Content-Type makes sense to chunk.

@marconi

Author

marconi commented Mar 5, 2019

Totally understand, and no worries. At the moment it's just not always possible to have access to the server so you can set Access-Control-Expose-Headers: *, which means you can't access Transfer-Encoding: chunked. So for now I published https://www.npmjs.com/package/use-abortable-stream-fetch strictly for streaming purposes.

@marconi marconi closed this Mar 5, 2019

@mauricedb

Owner

mauricedb commented Mar 5, 2019

👍
