Multiple simultaneous HTTPS requests through an HTTP proxy don't work #1340
Description
Long story short
Making multiple simultaneous requests to an HTTPS URL through an HTTP proxy ends up filling the TCPConnector's _acquired set, and eventually the connector stops working.
Expected behaviour
It should be possible to make as many simultaneous requests as the connector's limit parameter allows, and the connector should reuse connections and/or clean them up after use. If the limit is hit, the extra requests should wait in a queue.
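For illustration, here is roughly what I would expect to be able to write (just a sketch; fetch/burst and the limit of 10 are my own example names and values, not part of the report):

import asyncio
import aiohttp

# Sketch only: a connector capped at 10 simultaneous connections. The
# expectation is that the 11th request waits for a free connection instead
# of the pool silently filling up and getting stuck.
connector = aiohttp.TCPConnector(limit=10, verify_ssl=False)
client = aiohttp.ClientSession(connector=connector)

async def fetch(url, proxy=None):
    resp = await client.get(url, proxy=proxy)
    return await resp.read()  # reading the body should release the connection

async def burst(url, n=50, proxy=None):
    # 50 concurrent requests against a limit of 10: the surplus should queue.
    return await asyncio.gather(*[fetch(url, proxy=proxy) for _ in range(n)])

loop = asyncio.get_event_loop()
loop.run_until_complete(burst('https://localhost/', proxy='http://localhost:8080'))
client.close()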
Actual behaviour
TCPConnector keeps SelectorSocketTransport objects in its _acquired set. After a while it adds new transports without releasing the old ones, until the limit is filled and the connector stops working.
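A quick way to watch this happen (my own sketch; _acquired is a private attribute whose layout, one set of transports per (host, port, ssl) key, I am inferring from the repro below):

def dump_acquired(connector):
    # Print how many transports the connector still considers "acquired"
    # for each (host, port, ssl) key; these counts keep growing instead of
    # returning to 0 after the responses have been read.
    for key, transports in connector._acquired.items():
        print(key, len(transports))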
Steps to reproduce
If you set number_of_requests to more than 20, the script stalls on the first iteration. If you run with 5, for example, it seems to work until about 500 requests have been made, then the acquired count rises to 10, and if it keeps running it eventually hits the limit and stops.
import aiohttp
import asyncio
from aiohttp import ClientResponse
# target = 'https://google.com/'
target = 'https://localhost/'
set_key = ('localhost', 443, True)
# proxy = None
proxy = 'http://localhost:8080'
number_of_requests = 5
iterations = 1000
requests_made = 0
connector = aiohttp.TCPConnector(verify_ssl=False)
client = aiohttp.ClientSession(connector=connector)
async def get():
    global requests_made
    # The body is read in full, so the connection should be released back
    # to the connector once the request completes.
    resp = await client.get(target, proxy=proxy)  # type: ClientResponse
    await resp.read()
    requests_made += 1
async def main():
    for _ in range(iterations):
        # Fire number_of_requests concurrent GETs, then report how many
        # transports the connector still holds as acquired for the target.
        tasks = [asyncio.ensure_future(get()) for _ in range(number_of_requests)]
        await asyncio.gather(*tasks)
        print('acquired: {} requests: {}'.format(len(connector._acquired[set_key]), requests_made))
loop = asyncio.get_event_loop()
loop.run_until_complete(main())
client.close()
Your environment
Linux, Python 3.5.2, aiohttp 1.0.5