
BlockingConnection simple_publish eating memory. #749

Closed
corpulent opened this issue May 16, 2016 · 4 comments

Comments

@corpulent

I am using pika to publish messages from a Django app. Every time I publish a message, my memory usage increases a bit, and over time the server crashes because it runs out of memory. I can't figure out what is causing this memory leak. I am using the publishing example from https://github.com/pika/pika/blob/master/examples/publish.py. I call this script from my Django app via subprocess.Popen, and I also tried wrapping it in a Pyro4 object. Nothing helps; I always get the memory leak. Any thoughts?

@vitaly-krugl
Member

vitaly-krugl commented May 16, 2016

Which OS and version of pika are you using?

If you're wrapping your entire pika session in a Popen and then reaping the subprocess (confirming that it completed), I don't see how pika could be contributing to memory leaks, since the operating system reclaims a process's memory when it terminates. Are you sure the subprocesses are terminating in a timely manner?

It would help to isolate the code into a simpler app: eliminate Django, implement a simple Python app that launches the publisher via subprocess.Popen just as you do from Django, confirm that each subprocess completes, and track memory usage.

Post the simple code example that reproduces the problem without Django.
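A minimal harness along the lines suggested above might look like this. It is a sketch, not the issue author's actual code: `publish.py` stands in for the pika example script from the original post, the `resource` module is Unix-only, and any command can be substituted to exercise the harness itself.

```python
# Harness to rule out Django: spawn the publisher in a child process,
# reap it, and watch the parent's peak memory across iterations.
import resource  # Unix-only; ru_maxrss reporting varies by platform
import subprocess
import sys


def run_publisher(cmd, iterations=10):
    """Spawn `cmd` repeatedly, reaping each child, and return the
    parent's peak RSS after each iteration (KB on Linux)."""
    peaks = []
    for _ in range(iterations):
        proc = subprocess.Popen(cmd)
        rc = proc.wait(timeout=30)  # reap the child; no zombies left behind
        if rc != 0:
            raise RuntimeError("publisher exited with status %d" % rc)
        peaks.append(resource.getrusage(resource.RUSAGE_SELF).ru_maxrss)
    return peaks


if __name__ == "__main__":
    # Replace the trivial child with [sys.executable, "publish.py"]
    # (the pika example) to reproduce the reported scenario.
    samples = run_publisher([sys.executable, "-c", "pass"], iterations=5)
    # A leak in the parent would show these samples climbing steadily.
    print(samples)
```

If each child is reaped and its exit status is zero, any remaining growth must be in the parent process, which narrows the search considerably.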

@eandersson
Contributor

Is this with the latest RabbitMQ version (3.6.2)?

@corpulent
Author

I resolved this by stopping the logging.
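The author doesn't say exactly what was changed, but unbounded growth from logging typically comes from accumulating log records (for example, a buffering handler or DEBUG-level logging in a tight publish loop). A minimal sketch of silencing pika's loggers with the standard logging module, assuming that was the culprit:

```python
import logging

# Raise pika's log level so DEBUG/INFO records are never created,
# and stop records from propagating to ancestor handlers.
pika_logger = logging.getLogger("pika")
pika_logger.setLevel(logging.CRITICAL)
pika_logger.propagate = False
```

Capping the log level is usually preferable to removing handlers, since it suppresses record creation at the source.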

@vitaly-krugl
Member

Thanks for the update, closing...
