Memory leak with aiohttp.request #1756
Long story short
I'm working on a long-running program that makes a large number of HTTPS requests, and the program's memory usage increases steadily.
I would expect the memory to remain relatively constant.
For a sample of 500 requests, memory grows by approximately 8 MiB from the requests alone.
Steps to reproduce
```python
import logging
import asyncio
import aiohttp
import gc


async def get():
    response = await aiohttp.request('get', 'https://ddunlop.com')
    logging.info(response.status)
    response.close()


async def main(loop):
    await asyncio.sleep(10)
    for x in range(0, 500):
        await get()
    logging.info('done fetching')
    gc.collect()
    await asyncio.sleep(60)


if __name__ == '__main__':
    logging.getLogger().setLevel('DEBUG')
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main(loop))
```
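One likely contributor to this pattern (my assumption, not confirmed in this thread) is that each bare `aiohttp.request()` call spins up its own session and connector instead of reusing a connection pool. A sketch of the session-reuse approach, using aiohttp's public `ClientSession` API; the URL and `fetch` helper name are illustrative:

```python
import asyncio
import aiohttp  # assumes aiohttp is installed


async def fetch(session, url):
    # Reusing one ClientSession keeps a single connector/connection pool
    # alive across requests, instead of creating a fresh one per call.
    async with session.get(url) as response:
        return response.status


async def main():
    # The session is closed exactly once, when the block exits.
    async with aiohttp.ClientSession() as session:
        for _ in range(500):
            await fetch(session, 'https://ddunlop.com')

# To run: loop = asyncio.get_event_loop(); loop.run_until_complete(main())
```

Whether this fully accounts for the growth reported above is a separate question, but it keeps the number of live `SSLProtocol`/transport objects bounded by the pool size rather than the request count.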
aiohttp 2.0.2 inside a
I was looking into this.
After messing around with objgraph I found abnormal memory growth (33 MiB to 500 MiB over a span of 12 hours). I originally thought it was my code, but some inspection led me to this library instead.
Looking at the growth of object references over time I found the following:
dict                     27731  +22390
deque                     9730   +8927
method                    9802   +8926
ServerDisconnectedError   4848   +4465
weakref                   6668   +4464
SSLProtocol               4850   +4464
_SSLProtocolTransport     4850   +4464
ResponseHandler           4849   +4464
_SSLPipe                  4850   +4464
SSLObject                 4850   +4464
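For anyone reproducing this without objgraph, a similar growth summary can be approximated with the standard library alone. This is a minimal sketch, assuming only `gc` and `collections`; the `leak` list just simulates objects that are never released:

```python
import gc
from collections import Counter


def type_counts():
    # Tally live, GC-tracked objects by type name -- a stdlib
    # approximation of objgraph's growth summary.
    return Counter(type(o).__name__ for o in gc.get_objects())


before = type_counts()
leak = [[] for _ in range(1000)]  # simulate 1000 leaked objects
after = type_counts()

# Types whose live count grew between the two snapshots.
growth = {name: after[name] - before[name]
          for name in after if after[name] > before[name]}
print(growth.get('list'))  # the 1000 simulated leaks show up here
```

Taking two snapshots around a batch of requests and diffing them is enough to spot the `SSLProtocol`/`ResponseHandler` growth reported above.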
Which was odd, given that I don't store references to any of those types myself.
I began by investigating the ServerDisconnectedError reference chain, since I found it odd to store references to those, but that just led me back to
It ends at some list. I checked out the list and it's just a list of
I should note that I do not use
Apologies if this is not helpful.
@fafhrd91 Some more investigation that I did out of curiosity.
It seems to me that you intended
Making sure that the
Hope these findings help you.