Add ability to run asyncio.gather() with concurrency limit #115201

@shidenko97

Description

Feature or enhancement

Proposal:

The main purpose of this proposal is the ability to run asyncio.gather() with a concurrency limit, so that resources are not overloaded. For example, suppose you want to fetch 1000 pages of a website concurrently, but with a limit, so that neither the client nor the server is overwhelmed. For such a task you can combine asyncio.gather() with asyncio.Semaphore() to get the needed result. With a limit of 10, for example, only 10 tasks run concurrently while the other 990 wait; each time a task finishes, a new one is started.

# implementation example
import asyncio

async def asyncio_gather_with_concurrency_limit(*tasks, limit: int):
    # A shared semaphore gates entry into each awaited task,
    # so at most `limit` of them run at any moment.
    semaphore = asyncio.Semaphore(limit)

    async def semaphored_task(task):
        async with semaphore:
            return await task

    return await asyncio.gather(*(semaphored_task(task) for task in tasks))

# usage example (inside a coroutine): 100 sleeps, at most 10 at a time
await asyncio_gather_with_concurrency_limit(*[asyncio.sleep(i) for i in range(100)], limit=10)
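To sanity-check the behavior described above, here is a runnable sketch; the `tracked` helper and its counters are illustrative additions, not part of the proposal. It records the peak number of simultaneously running task bodies and confirms the semaphore keeps it at or below the limit:

```python
import asyncio

async def asyncio_gather_with_concurrency_limit(*tasks, limit: int):
    # Same sketch as above: a shared semaphore gates entry into each awaited task.
    semaphore = asyncio.Semaphore(limit)

    async def semaphored_task(task):
        async with semaphore:
            return await task

    return await asyncio.gather(*(semaphored_task(task) for task in tasks))

running = 0   # tasks currently inside their body
peak = 0      # highest concurrency observed

async def tracked(i):
    # Illustrative helper: counts how many copies of itself run at once.
    global running, peak
    running += 1
    peak = max(peak, running)
    await asyncio.sleep(0.01)
    running -= 1
    return i

async def main():
    return await asyncio_gather_with_concurrency_limit(
        *(tracked(i) for i in range(50)), limit=10
    )

results = asyncio.run(main())
assert peak <= 10  # concurrency never exceeded the limit
```

Note that because coroutine objects do not start executing until awaited, the counter only increments once a task is admitted past the semaphore, so `peak` reflects true concurrency rather than the number of scheduled wrappers.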

Has this already been discussed elsewhere?

No response

Links to previous discussion of this feature:

No response
