Replies: 1 comment
Note that we used to have pipeToNodeReadable, and it's actually still in the build and even has Suspense support. However, we'll add a warning that it's deprecated. So it's not that we can't do it - it's very intentional that we think it would be a bad idea for almost everyone.

We need to be able to flush all the way to the underlying target as soon as we have enough data to show progressive content. Once we flush something, it should go out directly to the user. For example, if we've completed the shell but are missing data for one section, we can send the shell. Transforms often buffer content up to some level. This means that if you have a transform after React, it might hold onto that shell for a bit. Even if it sends a few of the bytes, if it doesn't send the script tag that displays the content, nothing shows up. Then once we get more data to complete the render, we emit a few more bytes on the stream, but by then the whole content is done. The effect is that the user sees nothing and then, at the end, sees everything. A bad intermediate transform can completely defeat the whole point of streaming.

It gets worse, because the format we use to send things progressively is larger and more CPU intensive (e.g. injecting script tags instead of just inline HTML). That's only a benefit if it's actually streaming. So we rely on backpressure to tell us when it's better to wait and buffer up more content so we can generate inline HTML. Readable streams have backpressure too, but it's a one-way communication. There's no way for one stream to tell the next stream that now is a good time to flush - even if its window is not full.

The most common example of this is GZIP in Node. GZIP compresses in windows of some maximum byte range for best effectiveness, so it waits to fill a buffer before compressing that buffer. If we fill it only halfway, it blocks those bytes from getting to the user. Luckily this is a well-known issue, so there exist APIs to solve it.
To avoid this, we call flush() on the destination when it exists (Node's zlib streams expose a flush() method for exactly this reason). And it's not just GZIP either, because many transforms and naive code don't consider this use case, so they likely won't work out of the box anyway. This flush feature only exists on writable streams; there's nothing in the readable protocol that forwards the signal. Similarly, if you use async generators to process this data, you'd naively also lose this signal. Basically, those APIs just don't work for this.

Web Streams don't actually have any way to do this, so they seem fundamentally flawed for this use case atm - at least if you need to apply compression. That's why we accept the status quo and provide a Readable for the Web version of this API (e.g. for use in Cloudflare Workers and Deno). But once there is an idiomatic solution for this, we'd switch to that API - even if that means using a Writable instead.

None of this applies to traditional SSG though, since you probably don't want to stream as you load data but rather wait for completion before starting the stream. I'm hesitant to provide an API for this specifically, because it's rare that you'd only ever do SSG and never SSR - many systems are moving to a hybrid model anyway. So it would be misleading to rely on it and build infrastructure on top of it that doesn't port to SSR, or worse, for someone to just use it without reading details like this post. It's a small nit, but even with SSG it would be more optimal when applying the progressiveChunkSize option (for encoding large HTML pages to be streamed using script tags inserting the content): you should ideally encode the GZIP stream taking that chunk size into account.
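As a minimal sketch of that "flush all the way to the target" pattern (the helper name here is mine, not a React API), the idea amounts to feature-detecting flush() on whatever Writable you were handed:

```javascript
// Illustrative sketch, not React's actual implementation: after writing a
// meaningful unit (like the completed shell), nudge any buffering
// transform in the chain to emit what it's holding.
function flushIfPossible(destination) {
  // Node's zlib streams expose flush(); plain sockets and PassThrough
  // streams don't, so feature-detect before calling.
  if (typeof destination.flush === 'function') {
    destination.flush();
  }
}
```

This only works because the renderer holds a reference to the Writable it pipes into. A Readable producer never sees the downstream transforms, which is exactly the one-way-communication problem described above.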
-
I was curious why React uses a writable stream instead of a readable stream. Readable streams play more nicely with the new pipeline functions in Node.js, and they also support transform streams and async generators, which are convenient in an SSG world.
I hacked together an example of what a readable stream could look like. It works very similarly to fs.createReadStream and might be more familiar to a lot of people.
https://codesandbox.io/s/cranky-williamson-0z9nk?file=%2Fserver%2Frender.js