Memory leak seen with 0.15.1 _SocketDuckForFd and GreenPipe #122

Closed
ecolnick opened this issue Aug 20, 2014 · 8 comments

Comments

@ecolnick

Using eventlet version 0.15.1 in OpenStack, I have found that the memory consumption of the various services increases steadily over time without any significant activity actually taking place. Tracing the memory usage with help from mem_top produces the following:

refs:
172560 <type 'list'> [_SocketDuckForFd:4, _SocketDuckForFd:5, _SocketDuckForFd:7, <closed GreenPipe 'fd:4', mode 'wb' a
4092 <type 'set'> set([2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27,
2066 <type 'dict'> {'oslo.messaging.rpc.sys': None, 'sqlalchemy.engine.reflection': <module 'sqlalchemy.engine.reflecti
1547 <type 'list'> ['#!/usr/bin/env python\n', '# Copyright 2011 VMware, Inc.\n', '# All Rights Reserved.\n', '#\n', '#
666 <type 'list'> ['# Copyright 2012 Locaweb.\n', '# All Rights Reserved.\n', '#\n', '# Licensed under the Apache L
612 <type 'dict'> {'SocketType': <class 'socket._socketobject'>, 'getaddrinfo': <built-in function getaddrinfo>, 'AI_N
612 <type 'dict'> {'SocketType': <class 'socket._socketobject'>, 'getaddrinfo': <built-in function getaddrinfo>, 'AI_N
592 <type 'dict'> {'SocketType': <class 'socket._socketobject'>, 'getaddrinfo': <built-in function getaddrinfo>, 'AI_N
510 <type 'dict'> {'SocketType': <type '_socket.socket'>, 'getaddrinfo': <built-in function getaddrinfo>, 'IPPROTO_RAW
510 <type 'dict'> {'SocketType': <type '_socket.socket'>, 'getaddrinfo': <built-in function getaddrinfo>, 'AI_NUMERICS

Notice that the top entry is a list of _SocketDuckForFd and closed GreenPipe items. This list continues to grow in size consistently.

This behavior was not observed when running the same OpenStack installation with eventlet version 0.14.0.
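
For reference, a minimal sketch of how a report like the one above can be collected, assuming the mem_top package and its mem_top() helper; the logging setup and interval are illustrative and not part of this report:

# Minimal sketch: periodically dump a mem_top report like the one above.
# Assumes the mem_top package is installed; interval and logging config
# are illustrative only.
import logging
import time

from mem_top import mem_top

logging.basicConfig(level=logging.DEBUG)

while True:
    # mem_top() returns a text summary of the most-referenced objects,
    # including a "refs:" listing like the one shown in this report.
    logging.debug(mem_top())
    time.sleep(60)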

@temoto
Member

temoto commented Aug 21, 2014

Thank you very much for the bug report. Could you try v0.15.0?

@ecolnick
Author

I have tried with v0.15.0 and am not seeing the same issue.

@temoto
Member

temoto commented Aug 21, 2014

Thank you very much. That narrows the issue down to 3 commits.

@mkerrin @jan-g could you please have a look at this? It seems that the "second simultaneous read" fix introduced a memory leak.

mkerrin pushed a commit to mkerrin/eventlet that referenced this issue Aug 22, 2014
This should fix, or at least partially fix, the memory leak reported in issue eventlet#122

After a long-running job that highlighted the previously known Second Simultaneous Read issue, I used http://mg.pov.lt/objgraph/ to print out all the types and counts still referenced by the GC.

Before this patch I was seeing GreenSocket and _socketobject types being referenced; the longer I ran the test job, the more references to these objects I saw. After this patch I see no such references, no matter how long I run the test.
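
For context, a minimal sketch of the kind of check described in this commit message, using the objgraph library linked above; the type names to watch are taken from this thread, everything else is an assumption:

# Sketch: inspect which types are still referenced by the GC after a run.
import gc

import objgraph

# Collect first so only genuinely reachable objects are counted.
gc.collect()

# Print the most common object types still tracked by the GC.
objgraph.show_most_common_types(limit=20)

# Count the specific types reported as leaking in this issue.
print('GreenSocket: %d' % objgraph.count('GreenSocket'))
print('_socketobject: %d' % objgraph.count('_socketobject'))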
@mkerrin
Contributor

mkerrin commented Aug 22, 2014

Hi, I looked into this, found one issue, and put up a fix here: #125

I'd still like @jan-g to take a look and make sure I haven't introduced some other regression.

temoto pushed a commit that referenced this issue Aug 27, 2014
…close it

#122

@temoto
Member

temoto commented Aug 28, 2014

Merged mkerrin's fix into master. @ecolnick @jan-g please test it.

@temoto
Member

temoto commented Aug 29, 2014

Any information about how to reproduce the memory leak is much appreciated.
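
One possible way to probe for the leak, offered only as a hedged sketch: wrap raw pipe fds in eventlet.greenio.GreenPipe, close them, and watch whether the related object counts keep growing. The GreenPipe usage is inferred from the "<closed GreenPipe 'fd:4', mode 'wb'" entries in the original report; whether this exact loop triggers the leak is not confirmed in this thread.

# Hedged reproduction sketch: repeatedly create and close GreenPipes and
# watch for growth in GreenPipe / _SocketDuckForFd instance counts.
import gc
import os

from eventlet import greenio

def count_instances(name):
    # Count live objects whose type name matches, similar to the mem_top listing.
    return sum(1 for obj in gc.get_objects() if type(obj).__name__ == name)

for i in range(10000):
    r, w = os.pipe()
    pipe = greenio.GreenPipe(w, 'wb')   # wrap the write end, as in the report
    pipe.close()
    os.close(r)
    if i % 1000 == 0:
        gc.collect()
        print('%d GreenPipe=%d _SocketDuckForFd=%d' % (
            i, count_instances('GreenPipe'), count_instances('_SocketDuckForFd')))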

@WaltHP

WaltHP commented Sep 3, 2014

Was this fixed in 0.15.2?

@temoto
Member

temoto commented Sep 3, 2014

Yes, this memory leak is fixed in v0.15.2.

temoto closed this as completed Sep 25, 2014