Memory leak in request #271
Comments
Would you upgrade to Python 3.4.2?
@GMLudo @asvetlov doing it now
Upgraded to Python 3.4.2; the leak (or circular reference) is still there. Same tracemalloc stats.
Sorry, I don't have many ideas without looking at your code. Do you call …?
@asvetlov no, I don't. I call … Will try to set up a reproducing script this weekend.
@mpaolini yes, in general
BTW, unrelated, …
The leak disappears if I frequently call `gc.collect()`.
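A minimal sketch of that workaround as a background task, in the Python 3.4-style asyncio used throughout this thread (the 60-second interval is illustrative, not from the discussion):

```python
import asyncio
import gc

@asyncio.coroutine
def periodic_gc(interval=60):
    # Workaround sketch: periodically force a collection so that
    # reference cycles are reclaimed even when the automatic cyclic
    # GC thresholds are not being hit often enough.
    while True:
        yield from asyncio.sleep(interval)
        gc.collect()

# Schedule alongside the polling workload, e.g.:
# asyncio.async(periodic_gc())
```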
@mpaolini could you try latest master?
Looks a bit better now. More data will be available in a few hours. Was it 8304f71?
Yes, that was it.
The leak is still there. Will upgrade to 3.4.3 later.
Upgraded to Python 3.4.3 and aiohttp master. The leak is still there. More tracemalloc cruft on its way.
Allocation stats recorded by tracemalloc in a 10-minute time range:
Looks like the …
oh!
Interesting, but the cache dict has only 20 entries.
True... have a look at the …
I don't think it is …
Actually it should be something like:

```python
for key, val in connector._conns.items():
    print(key, len(val))
```
Still no luck:
Hmm. Could you also check how many open connections you have ("netstat")?
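For a scriptable version of that check, a sketch using psutil (my assumption; the thread itself only mentions netstat):

```python
import psutil

# Count this process's established TCP connections: roughly what
# "netstat" shows when filtered by this PID. psutil is not used in
# the thread; it just makes the check programmatic.
proc = psutil.Process()
established = [c for c in proc.connections(kind='tcp')
               if c.status == psutil.CONN_ESTABLISHED]
print(len(established))
```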
That's expected: this component asynchronously polls an external service for thousands of users.
By the way, the number of network connections does not grow.
@mpaolini Could you try aiohttp from master?
Just upgraded to aiohttp master (on Python 3.4.3); the leak is still there. It leaks 132 bytes every ~5 seconds.
The leak is still happening after upgrading to aiohttp 0.17.3 on Python 3.4.3. It seems to happen only on long-polling requests. I am writing a script to reproduce it.
I believe it can be closed.
I am not sure this leak is solved; I haven't upgraded since 0.17.3.
@mpaolini would you try?
@asvetlov I am trying on staging now, will let you know how it works.
Thanks a lot!
@asvetlov still no luck... it leaks around 48 bytes per minute in my staging environment with …
@mpaolini 48 bytes per minute looks like a very slow rate, assuming you have moderate load in your environment.
@asvetlov see my comment #271 (comment)
I am fetching tracemalloc stats from the aiohttp 0.20.0 setup in staging. Is there an IRC or Slack channel where we can discuss in real time?
Updated tracemalloc diffs in a 7-minute span:

```
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/streams.py' lineno=247>,)> size=5440 (+3808) count=10 (+7)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/client.py' lineno=365>,)> size=30616 (+1696) count=86 (+2)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/client.py' lineno=174>,)> size=3984 (+1328) count=6 (+2)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/client_reqrep.py' lineno=599>,)> size=16856 (+1120) count=57 (+3)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/protocol.py' lineno=221>,)> size=15596 (+992) count=80 (+2)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/parsers.py' lineno=336>,)> size=14408 (+952) count=57 (+3)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/parsers.py' lineno=184>,)> size=29120 (+888) count=58 (+1)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/client.py' lineno=178>,)> size=17880 (+672) count=55 (+2)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/streams.py' lineno=578>,)> size=17384 (+592) count=58 (+2)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/streams.py' lineno=92>,)> size=12064 (+416) count=29 (+1)>,
<StatisticDiff traceback=<Traceback (<Frame filename='/usr/local/lib/python3.4/site-packages/aiohttp/client_reqrep.py' lineno=533>,)> size=8640 (+384) count=48 (+3)>,
```
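For reference, a minimal sketch of how snapshot diffs like the ones above can be produced with the standard-library tracemalloc module (frame depth and the run interval are arbitrary choices, not from the thread):

```python
import tracemalloc

tracemalloc.start(25)  # record up to 25 frames per allocation

snap_before = tracemalloc.take_snapshot()
# ... let the long-polling workload run for a few minutes ...
snap_after = tracemalloc.take_snapshot()

# Print the allocations that grew the most, grouped by traceback.
for stat in snap_after.compare_to(snap_before, 'traceback')[:10]:
    print(stat)
```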
@mpaolini does Google Hangouts chat work for you?
Finally fixed by #723 |
Yep fixed |
Hi all,
Since I upgraded to 0.14.4 (from 0.9.0) I am experiencing memory leaks in a Dropbox-API long-poller. It is a single process that spawns a few thousand greenlets. Each greenlet performs a `request()` that blocks for 30 seconds, then parses the response and dies. Then a new greenlet is spawned. I am running on Python 3.4.0, Ubuntu 14.04. I use the connection pool feature, passing the same connector singleton to each `.request()` call.
I played with tracemalloc, dumping a `<N>.dump` stat file every minute, and found out that the response parser instances keep increasing in number (look at the third line of each stat). tracemalloc reports this stack trace:
Looks like something is keeping those parsers alive...
Using `force_close=True` on the connector makes no difference.
Then I tried calling `gc.collect()` after every single request, and it is going much better ~~but the leak has not~~ the leak has disappeared completely. This means (maybe it is an unrelated issue) that the library creates more reference cycles than the cyclic GC can handle. It may well be my own bug, or maybe something to do with Python 3.4.0 itself. I'm still digging into it.
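A rough reproduction sketch of the setup described above, using asyncio coroutines in place of greenlets and the aiohttp 0.x / Python 3.4 API from the report; the URL and worker count are placeholders, not details from the issue:

```python
import asyncio
import aiohttp

@asyncio.coroutine
def poll(connector, url):
    # One long-poll cycle: request() blocks for up to ~30 s,
    # then the body is read and the worker exits.
    resp = yield from aiohttp.request('GET', url, connector=connector)
    yield from resp.read()  # reading the full body releases the connection

@asyncio.coroutine
def main(url, n=1000):
    connector = aiohttp.TCPConnector()  # shared pool, as in the report
    while True:
        # Respawn a fresh batch of workers as the previous batch finishes.
        yield from asyncio.wait([poll(connector, url) for _ in range(n)])

loop = asyncio.get_event_loop()
loop.run_until_complete(main('http://example.com/longpoll'))
```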