Large Dictionaries Not Released From Memory #1202
Comments
There's still a reference to the memory. Try running the profiler in the …
I'm watching the process in htop, and it's definitely not being garbage collected after it runs.
This is not a Flask issue; if it's an issue at all, it's an issue with your interpreter. As @davidism already described, the way you are using memory_profiler is wrong.

Watching what happens in htop doesn't really say anything about the garbage collector. It will tell you when memory has been freed, and you can thereby assume that the garbage collector collected some objects, but just because memory isn't freed doesn't necessarily mean that the garbage collector didn't collect an object. Both CPython itself and the malloc implementation on your platform cache allocations, so the process will continue consuming memory even though it's unused. CPython additionally caches small ints.

Given that, here is what should happen: the interpreter makes a lot of allocations to create the dict; after the response is generated, it collects the object; some of that memory is cached by CPython, some is freed, some of the freed memory may be cached by malloc, and some is actually released to the operating system. This is exactly what I see happening with htop. Memory rises to about 4600M of virtual memory, then drops to about 3200M after the response has been sent. If I make a second request, memory rises again to 4600M and drops back to about 3200M. If there were a leak, memory consumption would increase with each request.
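The distinction above (objects collected vs. memory returned to the OS) can be observed directly with tracemalloc, which tracks Python-level allocations independently of what htop reports. A minimal sketch in Python 3 syntax (the dict size is arbitrary, not taken from the gist):

```python
import gc
import tracemalloc

tracemalloc.start()

def build():
    # allocate a large dict, roughly like the one in the report
    return {i: str(i) for i in range(1_000_000)}

d = build()
current, peak = tracemalloc.get_traced_memory()
print(f"with dict: {current / 1e6:.0f} MB")

del d
gc.collect()
current_after, _ = tracemalloc.get_traced_memory()
print(f"after del: {current_after / 1e6:.0f} MB")
# Python-level usage drops sharply here even though the process RSS
# shown in htop may stay high: CPython and malloc cache freed memory.
```

If tracemalloc shows the memory gone while htop still shows a large RSS, the collector did its job and the remaining footprint is allocator caching.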
Thanks for the help, @DasIch and @davidism. I realize now that calling it a leak was incorrect; thanks also for the advice on the correct way to use memory_profiler. Here's my updated example with the solution: https://gist.github.com/pawl/95769724848269cff890 Running it as a separate process seems like the best way to make sure the OS releases the memory.
Not related to your issue, but you should really use 4 spaces for indentation instead of hard tabs.
Thanks for the tip, @ThiefMaster. By the way, I learned some more about what caused this issue. It only happens on Linux: http://www.paulsprogrammingnotes.com/2014/10/large-dictionaries-not-released-from.html
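The Linux-specific behaviour comes from glibc's allocator keeping freed heap pages in its arenas rather than returning them to the kernel. glibc (not other libcs) exposes malloc_trim() to hand free pages back; it can be called from Python via ctypes. This is a platform-specific workaround sketch, not something Flask or CPython does for you, and it is a no-op elsewhere:

```python
import ctypes
import ctypes.util

# On glibc-based Linux, freed small allocations stay in malloc's
# arenas, so the RSS shown in htop stays high after a collection.
# malloc_trim(0) asks glibc to release free heap pages to the OS.
released = None
libc_path = ctypes.util.find_library("c")
if libc_path:
    libc = ctypes.CDLL(libc_path)
    if hasattr(libc, "malloc_trim"):
        released = libc.malloc_trim(0)  # 1 if memory was released, else 0
print("malloc_trim result:", released)
```

On non-glibc platforms `find_library` may miss or the symbol may be absent, in which case `released` stays `None` and nothing happens.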
Example: https://gist.github.com/pawl/8067c988b1cbfd48b855
I'm using Flask==0.10.1 and Python 2.7.4.
Is there a way to release the large dictionary from memory after the return?
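The core of this question can be illustrated without Flask or the gist: once the function that built the dict returns and nothing else references it, CPython's refcounting destroys it immediately, so there is nothing left to release at the Python level. A sketch in Python 3 syntax (the dict subclass exists only because plain dicts don't support weak references; the names are illustrative):

```python
import weakref

class Payload(dict):
    pass  # plain dict instances can't be weakly referenced

def handle_request():
    # stand-in for a view that builds a large dict for its response
    big = Payload((i, str(i)) for i in range(100_000))
    return len(big), weakref.ref(big)

length, ref = handle_request()
# the local name `big` died when the function returned, so the dict
# was destroyed at once; the weak reference is now dead
print(ref() is None)  # → True
```

Whether the process then hands that memory back to the OS is a separate question about allocator caching, not about the dict still being alive.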