
DecompressionStream should support pull-based decompression #66

Closed
mstange opened this issue Jun 18, 2024 · 2 comments

Comments


mstange commented Jun 18, 2024

What is the issue with the Compression Standard?

I have a web app which consumes gzipped data. On large datasets, the decompressed size can be a gigabyte or more. I would like to minimize memory usage by never materializing the fully decompressed buffer into memory; instead, I would like to decompress in chunks and then process the decompressed chunks.

It seems that I cannot use DecompressionStream to achieve this at the moment. As specced, when the compressed data comes from the network, the browser has to decompress it all and enqueue the decompressed chunks.
I would prefer to have the browser only hold on to the compressed memory until I ask for the next decompressed chunk.
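The consumption pattern described above might be sketched like this (editor's sketch, not from the thread; it assumes a runtime with WHATWG streams such as a modern browser or Node 18+, and uses CompressionStream only to fabricate gzip input — in the real app the compressed bytes would come from the network):

```javascript
// Chunk-at-a-time processing: read each decompressed chunk and handle it
// immediately, never materializing the full decompressed buffer.
async function processDecompressed() {
  // Fabricate gzip input; a real app would use e.g. a fetch response body.
  const input = new TextEncoder().encode("hello world ".repeat(1000));
  const decompressed = new Blob([input])
    .stream()
    .pipeThrough(new CompressionStream("gzip"))
    .pipeThrough(new DecompressionStream("gzip"));

  const reader = decompressed.getReader();
  let totalBytes = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Process each decompressed chunk here instead of buffering it all.
    totalBytes += value.byteLength;
  }
  return totalBytes;
}

processDecompressed().then((n) => console.log(n)); // 12000
```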

saschanaz (Member) commented:

> As specced, when the compressed data comes from the network, the browser has to decompress it all and enqueue the decompressed chunks.

Can you show me how you are using the stream? TransformStreams are not supposed to consume greedily: they have highWaterMark=1 by default, meaning the transform may hold only a single unconsumed chunk and then stops processing until that chunk is read.

With the following demo, it only reads the first chunk and then waits.

// gzip-compressed bytes of the string "HELLO"
let buffer = [
  new Uint8Array([31,139,8,0,0]),
  new Uint8Array([0,0,0,0,10]),
  new Uint8Array([243,112,245,241,241]),
  new Uint8Array([7,0,54,100,68,193,5,0,0,0]),
];
let index = 0;
let r = new ReadableStream({
  pull(controller) {
    if (index >= buffer.length) {
      controller.close();
      return;
    }
    console.log("pulled", index);
    controller.enqueue(buffer[index]);
    ++index;
  },
  type: "bytes"
});
let d = new DecompressionStream("gzip");
let pipe = r.pipeThrough(d);

Please correct me if I misunderstood your issue.


mstange commented Jun 18, 2024

Ah, that means I misunderstood how it works and need to do some more debugging. Thanks for taking a look!

domenic closed this as completed on Jun 20, 2024