Browser crash on large file decompression #12

Open · tannerlinsley opened this issue Jul 29, 2014 · 4 comments
@tannerlinsley commented Jul 29, 2014

In Chrome, I was able to compress a 34 MB JSON file (with some stringification) down to 5 MB...

That's friggin' awesome!!!

But upon trying to decompress the same file, it crashes around 60% every time. I'm pretty sure the work is getting put on a web worker, and at the moment that worker pushes my Chrome tab to about 120% CPU and 900 MB of RAM. This is all on level 1 compression, too.

This may be expected behavior and I'm pushing the envelope here on size ;)
Or not?
I just wanted to get your opinion on large dataset compression.

@nmrugg (Collaborator) commented Jul 29, 2014

900 MB is definitely pushing it. I have a feeling you're hitting this bug in V8: https://code.google.com/p/v8/issues/detail?id=847

It was recently fixed in Canary, but even so, 900 MB is pretty big. Perhaps you'd be best off splitting the string into chunks and combining them after decompression; see the sketch below.

In the future, I plan on implementing typed arrays. Perhaps that would help keep memory usage down. I'm not sure.
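
For illustration, a minimal sketch of that chunking idea, assuming LZMA-JS's worker API (`new LZMA(...)` with callback-based `compress`/`decompress`); the helper name and chunk size are made up:

```js
// Hypothetical helper: compress a large string in fixed-size slices so no
// single compress() call has to hold the whole payload at once.
var my_lzma = new LZMA("lzma_worker.js"); // path to the worker script

function compressInChunks(str, chunkSize, done) {
    var chunks = [];
    for (var i = 0; i < str.length; i += chunkSize) {
        chunks.push(str.slice(i, i + chunkSize));
    }
    var results = [], pending = chunks.length;
    chunks.forEach(function (chunk, idx) {
        my_lzma.compress(chunk, 1, function (bytes, err) { // level 1
            if (err) { return done(err); }
            results[idx] = bytes;                 // keep original order
            if (--pending === 0) { done(null, results); }
        });
    });
}
```

Each chunk then decompresses independently, so peak memory is bounded by the largest chunk rather than the whole file.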

@tannerlinsley (Author)

I'll run some more tests and see if I can find a solid correlation between memory usage and crashing. If so, I'll try splitting it up and let you know how it goes. Oh, and typed arrays would be awesome, I'm sure. I'm currently storing the compressed data using PouchDB, and the system works great.

@raphink commented Jun 25, 2015

I'm hitting this bug with a file much smaller than 900 MB.

My uncompressed data is 66 MB, and I hit the bug whether I compress it with lzma -9 (1.5 MB) or with lzma -1 (9.3 MB). Chrome crashes while decompressing the data.

@nmrugg (Collaborator) commented Jun 25, 2015

I'm surprised that a file that small is crashing. I've been able to decompress files of about that size, but it is hard to determine how V8 is using memory.

I suggest you try splitting up the data as a workaround; see the sketch below. Hopefully we'll be able to implement a streaming API soon to make big files work.
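
A matching sketch for the reassembly side, under the same API assumptions as above; `decompressChunks` is a hypothetical counterpart to the compression helper earlier in the thread:

```js
// Hypothetical counterpart to compressInChunks: decompress each piece
// independently and reassemble the original string in order.
function decompressChunks(compressedChunks, done) {
    var parts = [], pending = compressedChunks.length;
    compressedChunks.forEach(function (bytes, idx) {
        my_lzma.decompress(bytes, function (str, err) {
            if (err) { return done(err); }
            parts[idx] = str;                 // preserve chunk order
            if (--pending === 0) { done(null, parts.join("")); }
        });
    });
}
```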
