Limit the number of decoded images in memory #6

Open
cozarkd opened this issue Feb 21, 2024 · 3 comments

Comments

cozarkd commented Feb 21, 2024

Is your feature request related to a problem? Please describe.
I'm trying to optimize 500+ images, but it eats all the RAM and then crashes (on WSL2 on Windows). On my Mac it just gets stuck, with the output blinking. I've found an issue in the original project with a PR.

No idea if this is hard to implement, but it would be nice to have. I can optimize in batches of around 30-50 images, but that's not ideal since I need to process those 500 images multiple times.
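That batching workaround can be sketched with `xargs -n`, which splits a file list into fixed-size chunks so only one batch is processed at a time. This is just an illustration: `echo` stands in for the real conversion command, the batch size of 2 is arbitrary, and the `squoosh-cli` flags shown are assumptions, not verified usage.

```shell
# Feed the image list to xargs in fixed-size batches (here 2 per batch),
# so only one batch's worth of images is being decoded at a time.
# NOTE: "echo" is a stand-in; drop it to run the real command.
printf '%s\n' img1.png img2.png img3.png img4.png img5.png \
  | xargs -n 2 echo squoosh-cli --webp auto -d out/
```

Each output line shows one batch invocation, e.g. `squoosh-cli --webp auto -d out/ img1.png img2.png`.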

Thanks for your fork.

@aggregate1166877 (Contributor)

Hi @cozarkd, thanks for the request.

My deadlines are pretty rough at the moment, so it might be a few weeks before I can look into this. This issue is nonetheless important, so it will definitely get looked at when I have a moment or need a break from other work.

So I can hit the ground running and reproduce the issue quickly, could you tell me the average size and format of the images you're dealing with?


cozarkd commented Feb 21, 2024 via email

aggregate1166877 (Contributor) commented Apr 30, 2024

Hi @cozarkd - I believe this issue is now fixed. Can you please test with CLI version 0.9.1 when you have a moment?

The application will now only process as many images concurrently as you have CPU cores. If you have a very large number of cores and still run out of memory, I've added a -c option to reduce the number of images processed concurrently (e.g. to limit it to 8, use squoosh-cli -c8 ...).
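The same cap can be approximated outside the tool with `xargs -P`, which runs at most N commands in parallel, one image per command. This is a sketch of the concurrency-limiting idea, not how squoosh-cli implements it internally; `echo` stands in for a real per-image conversion.

```shell
# Run at most 8 stand-in "conversions" at a time, one image per command.
# -n 1: one file per invocation; -P 8: at most 8 invocations in flight.
printf '%s\n' img1.png img2.png img3.png img4.png \
  | xargs -n 1 -P 8 echo converted
```

Because the commands run in parallel, the output order of the `converted imgN.png` lines is not guaranteed.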

I also noticed that the terminal output gets badly corrupted with a large number of files. The CLI will still use the fancy output when processing fewer than 16 images, but anything more than that will now use a basic / boring output that works correctly with any number of images.

Testing

I did the following tests using 16 threads, all converting png to webp:

  • 1000 images totalling 700MB (maxed out at 7.5GB RAM)
  • 300 images totalling 3GB (forgot to measure RAM use)
  • 1000 images totalling 10GB (maxed out at 23GB RAM on 0.9.0; 8.1GB on 0.9.1)

All completed without error.

Edit

There was a memory leak in 0.9.0. Version 0.9.1 no longer has this issue; the application has not exceeded 8.1GB RAM use in any of my tests (though it will probably exceed that amount with images larger than 10MB each at 16+ threads).

aggregate1166877 added a commit that referenced this issue Apr 30, 2024