
Website bug: seems to not be compressing anything #47

Closed
feross opened this issue Mar 10, 2021 · 9 comments

feross commented Mar 10, 2021

How to reproduce

The problem

The reported compressed size is unrealistically small, which makes me think the file was not processed correctly; it looks like an empty string or something similar was compressed instead.

[Screenshot: Screen Shot 2021-03-10 at 12 19 04 PM]

Browser: Latest Chrome
OS: Latest macOS

101arrowz (Owner) commented:

That's odd. I've tried this with multiple massive files and haven't had this problem.

Could you try the "streaming GZIP" preset? It's likely that the browser failed to load the 2.7GB of data into memory. Streaming avoids this problem.
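
For context, here is a minimal sketch of the difference between the two approaches, using fflate's streaming API directly rather than the demo's helpers (the function names and the onChunk callback are made up for illustration; this is not the preset's actual code):

const { gzipSync, AsyncGzip } = fflate;

// Buffered: the whole file has to fit in memory as one Uint8Array first,
// which can fail outright for a 2.7GB file.
async function gzipBuffered(file) {
  const data = new Uint8Array(await file.arrayBuffer());
  return gzipSync(data, { level: 6 });
}

// Streaming: the file is read and compressed chunk by chunk, so memory
// usage stays close to the size of a single chunk.
async function gzipStreaming(file, onChunk) {
  const gz = new AsyncGzip({ level: 6 });
  gz.ondata = (err, chunk, final) => {
    if (err) console.error(err);
    else onChunk(chunk, final);
  };
  const reader = file.stream().getReader();
  let prev = null;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    if (prev) gz.push(prev);
    prev = value;
  }
  // The last chunk is pushed with the final flag to finish the GZIP stream.
  gz.push(prev || new Uint8Array(0), true);
}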


feross commented Mar 10, 2021

The streaming example just hangs:

[Screenshot: Screen Shot 2021-03-10 at 12 27 00 PM]

101arrowz (Owner) commented:

Compressing 2.7GB of data can take a minute or two, longer on slow computers. I'll run the demo on a large file stream myself and let you know my findings.


feross commented Mar 10, 2021

Actually, I might have misunderstood the UI -- I'll let it run and report back if it finishes.


feross commented Mar 10, 2021

It looks like the streaming example throws an exception:

Uncaught (in promise) TypeError: Failed to fetch
Promise.then (async)
eval @ VM459:12
Br @ sandbox.ts:144
onClick @ index.tsx:510
j @ preact.module.js:1

101arrowz (Owner) commented:

I found the issue: I'm concatenating the output chunks into a single ArrayBuffer, which throws because the result is over 2GB, the maximum browsers allow. I'll fix it by only calculating the total length rather than concatenating.
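
Roughly, the change amounts to something like this sketch (not the site's actual code), where only a running byte count is kept instead of merging the chunks:

const { AsyncGzip } = fflate;

const gz = new AsyncGzip({ level: 6 });
let totalSize = 0;
gz.ondata = (err, chunk, final) => {
  if (err) return console.error(err);
  // Old approach: collect every chunk and merge into one giant Uint8Array,
  // which throws once the total passes the browser's ArrayBuffer limit.
  // New approach: only accumulate the length of each chunk.
  totalSize += chunk.length;
  if (final) console.log('Compressed size:', totalSize, 'bytes');
};
// ...the file's chunks are pushed into gz as before.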


feross commented Mar 10, 2021

Makes sense, thanks for the quick debugging!


101arrowz commented Mar 10, 2021

Try pasting this into the code box:

const { AsyncGzip } = fflate;
// Theoretically, you could do this on every file, but I haven't done that here
// for the sake of simplicity.
const file = files[0];
const gzipStream = new AsyncGzip({ level: 6 });
// We can stream the file through GZIP to reduce memory usage
const gz = file.stream().pipeThrough(toNativeStream(gzipStream));
let sz = 0;
gz.pipeTo(new WritableStream({
  write(dat) { sz += dat.length; console.log(sz); },
  close() { callback('Length: ' + sz); }
}));

It took a while but worked for me.
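
And if you do want to run it over every selected file (per the comment in the snippet above), a variation along these lines should work, assuming the same files, toNativeStream, and callback the sandbox already provides (if callback may only be called once, console.log works too):

const { AsyncGzip } = fflate;
(async () => {
  // Compress the selected files one at a time to keep memory usage low.
  for (const file of files) {
    const gz = file.stream().pipeThrough(toNativeStream(new AsyncGzip({ level: 6 })));
    let sz = 0;
    await gz.pipeTo(new WritableStream({
      write(dat) { sz += dat.length; },
      close() { callback(file.name + ': ' + sz + ' bytes'); }
    }));
  }
})();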


feross commented Mar 10, 2021

Perfect, that worked!

feross closed this as completed Mar 10, 2021