
running out of memory on larger uploads (nginx+gunicorn) #93

Closed
HG00 opened this Issue Jun 19, 2013 · 2 comments


HG00 commented Jun 19, 2013

Hi

(sorry for the non-specific details)

I'm seeing MemoryErrors when uploading large files. Nginx buffers the upload to a temp file just fine, but as soon as it's done uploading, I can see Python's memory usage increase sharply, usually ending with a MemoryError. For some files, I can see the memory spike while the file is in the "clipboard"; as soon as it's moved to a folder, the memory is freed.

Is there a way to force the use of temp files instead of memory for file uploads, or even bypass the clipboard?
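For context, Django itself has upload-handler settings that control whether an upload is buffered in memory or spooled to a temp file during the request. A minimal sketch, assuming a standard Django settings module; note these only affect request parsing, not anything filer does with the file afterwards:

```python
# settings.py -- standard Django upload settings, shown here as a sketch,
# not something django-filer documents for this specific issue.

# Always spool uploads to a temporary file instead of keeping them in memory.
FILE_UPLOAD_HANDLERS = [
    "django.core.files.uploadhandler.TemporaryFileUploadHandler",
]

# Or keep the default handlers and just adjust the in-memory threshold (bytes)
# above which Django switches to a temp file; the default is 2.5 MB.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440
```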

Thanks,

Owner

stefanfoulis commented Jun 19, 2013

Does the memory consumption stay, even over multiple requests? Or does it only happen while a request is running?

The spike probably happens the first time the file is saved, maybe because filer tries to read some metadata (file size, sha1 hash, image-related data). The sha1 generation (https://github.com/stefanfoulis/django-filer/blob/develop/filer/models/filemodels.py#L155) may be reading the whole file into memory.
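If the hashing is the culprit, computing the digest in chunks avoids loading the whole file. A rough sketch of the general technique, not filer's actual code, assuming a Django File-like object that exposes .chunks():

```python
import hashlib


def sha1_in_chunks(django_file, chunk_size=64 * 1024):
    """Compute a SHA-1 hex digest without reading the whole file into memory.

    `django_file` is assumed to be a Django File object (it provides .chunks()
    and proxies .seek() to the underlying file).
    """
    sha = hashlib.sha1()
    for chunk in django_file.chunks(chunk_size):
        sha.update(chunk)
    django_file.seek(0)  # leave the file pointer where later readers expect it
    return sha.hexdigest()
```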

Other than that, filer should only read the file into memory if a storage backend is used that requires it (e.g. Amazon S3). The default FileSystemStorage should not read the whole file into memory.

Maybe you can isolate a specific part of the code that is causing the memory spike?
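One quick way to narrow it down is to check the process's peak memory before and after the suspected call. A rough stdlib-only sketch, assuming Linux (where ru_maxrss is reported in kilobytes):

```python
import resource


def peak_rss_mb():
    """Peak resident set size of the current process, in MB (Linux: KB units)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0


before = peak_rss_mb()
# ... call the suspected code path here, e.g. saving the uploaded file ...
after = peak_rss_mb()
print("peak RSS grew by %.1f MB" % (after - before))
```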

Owner

stefanfoulis commented Mar 7, 2016

Closing for now. If this is still an issue, please re-open it on the django-filer issue tracker.
