Closed as not planned
Description
Feature or enhancement
Proposal:
The main purpose of this proposal is to be able to run asyncio.gather() with a concurrency limit so that resources are not overloaded. For example, you may want to fetch 1000 pages of a website concurrently, but with some limit so that neither the client nor the server is overwhelmed. For such a task you can simply combine asyncio.gather() with asyncio.Semaphore() and get the needed result. With a limit of 10, for example, only 10 tasks run concurrently while the other 990 wait; whenever one task finishes, the next one is started.
# implementation example
import asyncio

async def asyncio_gather_with_concurrency_limit(*tasks, limit: int):
    # a semaphore shared by all wrapped awaitables caps how many run at once
    semaphore = asyncio.Semaphore(limit)

    async def semaphored_task(task):
        async with semaphore:
            return await task

    return await asyncio.gather(*(semaphored_task(task) for task in tasks))

# usage example
await asyncio_gather_with_concurrency_limit(*[asyncio.sleep(i) for i in range(100)], limit=10)
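For illustration, here is a minimal sketch of the page-fetching scenario described above, built on the proposed helper. The use of aiohttp and the placeholder URLs are assumptions for the sake of the example; any coroutine-producing function could be passed in the same way.

# illustrative sketch (not part of the proposal): fetch many pages with at most
# 10 requests in flight at a time; aiohttp and the URLs are placeholder choices
import asyncio
import aiohttp

async def fetch(session: aiohttp.ClientSession, url: str) -> str:
    async with session.get(url) as resp:
        return await resp.text()

async def main():
    urls = [f"https://example.com/page/{i}" for i in range(1000)]
    async with aiohttp.ClientSession() as session:
        # only 10 fetches run concurrently; the rest wait on the semaphore
        pages = await asyncio_gather_with_concurrency_limit(
            *(fetch(session, url) for url in urls), limit=10
        )
    print(len(pages))

asyncio.run(main())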
Has this already been discussed elsewhere?
No response given
Links to previous discussion of this feature:
No response