
Memory leak for async gzip #76

Closed
BlueHotDog opened this issue Jun 18, 2021 · 5 comments

Comments

@BlueHotDog

Uploading multiple files from a worker creates a memory leak and eventually throws the following error:
"Cannot perform Construct on a detached ArrayBuffer"
Looking at the memory snapshots between two runs, it seems the created array buffers are not being cleaned up after running the compress method.

@101arrowz
Owner

If there's a memory leak, I would think it's because you are holding on to references to the stream outputs for too long. As far as I can tell, the async streams drop all references to a chunk after processing it, but I'll investigate as well.

Also, that error looks more like an attempt to keep using a Uint8Array after pushing it to AsyncGzip. The docs mention that all pushed chunks are consumed in async streams (they become inaccessible after pushing), so reading data you've already pushed can produce this error. If it's actually coming from fflate, please send the full stack trace.
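For example, a minimal sketch of the consumed-chunk behaviour (variable names here are just placeholders, not your code):

```ts
import { AsyncGzip } from 'fflate';

// Chunks pushed to an async stream are handed off to the worker and become
// inaccessible afterwards.
const gz = new AsyncGzip();
gz.ondata = (err, chunk, final) => {
  if (err) throw err;
  // handle compressed `chunk` here; `final` is true for the last one
};

const data = new Uint8Array(1024); // some input bytes

// Copy first if the original bytes are still needed later; reading `data`
// after the push can fail with
// "Cannot perform Construct on a detached ArrayBuffer".
const keep = data.slice();
gz.push(data, true); // `data` is consumed here
```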

@BlueHotDog
Author

Unfortunately the error is coming from an extension and is minified, so it's hard to produce a readable trace.
I'm also still struggling to reproduce it locally,
but let me try to give some more context that might help:

  • We're compressing a lot of files (hundreds) from the extension.
  • We're doing it from the extension's background page.
  • We're using ReScript, so it might be a bit hard to trace down the JS version of that, but from going through the code, the only place where the buffer is being reused is when an error happens in asyncGzip.

Basically the function we have is, roughly as sketched below:
try gzipping; if it succeeds, return the gzipped buffer.
If any error happens (try/catch), return the original buffer.
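A rough sketch of that pattern in plain TypeScript (not our actual ReScript code; the function name and structure are placeholders):

```ts
import { AsyncGzip } from 'fflate';

// Try to gzip the bytes; fall back to the original data on any error.
// Because async streams consume pushed chunks, the fallback has to be a
// copy taken *before* push(), or the error path would hand back a
// detached buffer.
function compressOrPassthrough(data: Uint8Array): Promise<Uint8Array> {
  const fallback = data.slice(); // untouched copy for the error path
  return new Promise((resolve) => {
    const gz = new AsyncGzip();
    const parts: Uint8Array[] = [];
    gz.ondata = (err, chunk, final) => {
      if (err) {
        gz.terminate();    // stop the worker on failure
        resolve(fallback); // return the original bytes
        return;
      }
      parts.push(chunk);
      if (final) {
        // Concatenate output chunks into a single buffer.
        const total = parts.reduce((n, p) => n + p.length, 0);
        const out = new Uint8Array(total);
        let off = 0;
        for (const p of parts) { out.set(p, off); off += p.length; }
        resolve(out);
      }
    };
    gz.push(data, true); // `data` is transferred to the worker here
  });
}
```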

@BlueHotDog
Author

I suspect a memory leak because when I do two runs and compare the memory snapshots, it seems like a lot of retained buffers are still there.

@101arrowz
Owner

Still looking into this issue; maybe it only applies to a specific set of environments. What browser/engine are you using?

@BlueHotDog
Author

Hey, we're still investigating on our end; it might be an issue on our side.
Closing for now, will re-open with more information if applicable.
Thank you so much for the quick response, and sorry for the bother!
