I have a web app which consumes gzipped data. On large datasets, the decompressed size can be a gigabyte or more. I would like to minimize memory usage by never materializing the fully decompressed buffer into memory; instead, I would like to decompress in chunks and then process the decompressed chunks.
It seems that I cannot use DecompressionStream to achieve this at the moment. As specced, when the compressed data comes from the network, the browser has to decompress it all and enqueue the decompressed chunks.
I would prefer to have the browser only hold on to the compressed memory until I ask for the next decompressed chunk.
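The consumption pattern I'm after can be sketched like this (`consumeGzipped` and `processChunk` are hypothetical names, not part of the app; the callback stands in for whatever per-chunk processing the app does):

```javascript
// Sketch: pipe a gzipped stream through DecompressionStream and hand
// each decompressed chunk to a caller-supplied callback, without ever
// concatenating the decompressed output into one buffer.
async function consumeGzipped(compressedStream, processChunk) {
  const reader = compressedStream
    .pipeThrough(new DecompressionStream("gzip"))
    .getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    processChunk(value); // value is a Uint8Array holding one decompressed chunk
  }
}
```

With `fetch`, `compressedStream` would be `response.body`; the hope is that between `read()` calls the browser only has to hold compressed bytes.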
> As specced, when the compressed data comes from the network, the browser has to decompress it all and enqueue the decompressed chunks.
Can you show me how you are using the stream? TransformStreams are not supposed to consume greedily; they have highWaterMark=1, meaning a transform is only allowed to hold one unconsumed chunk and then stops processing until that chunk is read.
With the following demo, it only reads the first chunk and waits.
```js
// A string "HELLO"
let buffer = [
  new Uint8Array([31, 139, 8, 0, 0]),
  new Uint8Array([0, 0, 0, 0, 10]),
  new Uint8Array([243, 112, 245, 241, 241]),
  new Uint8Array([7, 0, 54, 100, 68, 193, 5, 0, 0, 0]),
];
let index = 0;
let r = new ReadableStream({
  pull(controller) {
    if (index >= buffer.length) {
      controller.close();
      return;
    }
    console.log("pulled", index);
    controller.enqueue(buffer[index]);
    ++index;
  },
  type: "bytes",
});
let d = new DecompressionStream("gzip");
let pipe = r.pipeThrough(d);
```
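To see the laziness end to end, here is a self-contained variant of the same demo (same gzip bytes; `type: "bytes"` dropped and the pull log kept in an array for simplicity) that drains the pipe and decodes the result:

```javascript
// Same gzip bytes as above (a gzipped "HELLO"), fed through
// DecompressionStream; pull() is only invoked as the reader drains.
const buffer = [
  new Uint8Array([31, 139, 8, 0, 0]),
  new Uint8Array([0, 0, 0, 0, 10]),
  new Uint8Array([243, 112, 245, 241, 241]),
  new Uint8Array([7, 0, 54, 100, 68, 193, 5, 0, 0, 0]),
];
let index = 0;
const pulls = []; // records the order in which chunks were pulled
const r = new ReadableStream({
  pull(controller) {
    if (index >= buffer.length) {
      controller.close();
      return;
    }
    pulls.push(index);
    controller.enqueue(buffer[index]);
    ++index;
  },
});

// Read every decompressed chunk and decode the concatenation
// (concatenating here only to verify the round trip).
async function drain() {
  const reader = r.pipeThrough(new DecompressionStream("gzip")).getReader();
  const out = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out.push(...value);
  }
  return new TextDecoder().decode(new Uint8Array(out));
}
```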
What is the issue with the Compression Standard?