Big file uploads and "Out of memory!" #1129
I can confirm that I also get an "Out of memory!" error with the test above, on a box with 4GB RAM total, just under 3GB currently free.
Having looked into this a bit more, it looks like the problem is in […]. I'm not sure yet how much work would be involved in avoiding this, or even how feasible it is without breaking a lot of things, but I have a feeling it would be a very big undertaking.
I'm thinking of attempting to remove the unparsed body from […]. This is a fairly big change, but if the above idea works, the risk of breakage for end users should be fairly limited, and it would make our file upload handling much saner. I'd consider storing the entire request body (including file uploads) in RAM unnecessarily a bug worth fixing, so I'm looking into a fix for this.
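The actual patch isn't shown in this thread, but the general technique being described, reading the PSGI input stream in fixed-size chunks and spooling it to a temporary file rather than slurping the whole body into one scalar, might look something like the sketch below. This is illustrative only; spool_body_to_disk and the 64KB chunk size are not Dancer's actual code.

```perl
# Sketch of chunk-wise body handling, not Dancer's actual patch.
use strict;
use warnings;
use File::Temp qw(tempfile);

sub spool_body_to_disk {
    my ($env) = @_;                       # a standard PSGI environment
    my $input     = $env->{'psgi.input'};
    my $remaining = $env->{CONTENT_LENGTH} || 0;

    my ($fh, $path) = tempfile(UNLINK => 1);
    binmode $fh;

    # Read at most 64KB at a time, so peak memory stays flat no matter
    # how large the uploaded body is.
    while ($remaining > 0) {
        my $want = $remaining < 65_536 ? $remaining : 65_536;
        my $read = $input->read(my $buf, $want);
        last unless $read;                # EOF or read error
        print {$fh} $buf;
        $remaining -= $read;
    }
    seek $fh, 0, 0;
    return ($fh, $path);   # a multipart parser can now work from disk
}
```

The point is that peak memory is bounded by the chunk size rather than the upload size, which is why this approach avoids the OOM seen above.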
Really great to see progress on this!
Right, I've a few test failures within the test suite to fix, but my changes mostly seem to be working, and they do significantly reduce the memory usage of a Dancer process handling a large file upload. The output below is from two runs of a simple test script, based on the code in this issue, which creates and uploads a 256MB file: […]. The resident sizes at the end of execution show the difference: CPAN version 2,260,084 bytes, my version 510,056 bytes.
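The thread doesn't show how those RSS figures were captured. For reference, a minimal sketch of one way to report a process's resident set size at exit on Linux (a hypothetical helper, not from the original thread):

```perl
#!/usr/bin/env perl
# Hypothetical helper, not from the original thread: report this process's
# resident set size by reading /proc/self/status (Linux-specific).
use strict;
use warnings;

sub current_rss_kb {
    open my $fh, '<', '/proc/self/status' or return;
    while (my $line = <$fh>) {
        return $1 if $line =~ /^VmRSS:\s+(\d+)\s+kB/;
    }
    return;
}

END { printf "RSS at exit: %s kB\n", current_rss_kb() // 'unknown'; }
```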
And more tellingly, if I increase it to a 512MB file on this 4GB RAM machine, testing with the CPAN version explodes with an OOM error, whereas my updated version completes the test successfully, albeit using 1,010,056 RSS.
This issue should have been closed when the PR was merged - closing it now :) |
I've noticed that Dancer severely eats up resources when it processes large file uploads, and even sometimes exits with "Out of memory!". What is going on in Dancer's file uploads? Isn't it processing uploads chunk-wise, as per common-sense/best practice?
Can somebody try/reproduce this, please? (...although it's probably me doing something wrong):
Clone the Dancer-devel branch, edit file t/02_request/14_uploads.t and append this as test no. 23:
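The test snippet itself is missing from this copy of the thread. A minimal sketch of a test along the lines described (generate a large file, upload it through Dancer's request handling) might look like the following, assuming Dancer 1's Dancer::Test helpers; the route, the field name 'bigfile', and the 256MB size are illustrative, not the original code:

```perl
# Hypothetical reconstruction -- the original snippet was not preserved.
use strict;
use warnings;
use Test::More;
use File::Temp qw(tempfile);

use Dancer ':tests';
use Dancer::Test;

post '/upload' => sub {
    my $upload = request->upload('bigfile');
    return defined $upload ? 'ok' : 'missing';
};

# Write a 256MB file of zero bytes, 1MB at a time, so the test script
# itself never holds the whole payload in memory.
my ($fh, $filename) = tempfile(UNLINK => 1);
binmode $fh;
my $chunk = "\0" x (1024 * 1024);
print {$fh} $chunk for 1 .. 256;
close $fh;

my $response = dancer_response(
    POST => '/upload',
    { files => [ { name => 'bigfile', filename => $filename } ] },
);
is $response->status, 200, 'large upload handled without exhausting memory';

done_testing();
```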
I'm on a machine with 2.7 GiB memory.