
Race condition on __async_init__ when min_size > 0 #320

Closed
spumer opened this issue Jun 28, 2018 · 4 comments


spumer commented Jun 28, 2018

pool = None

def get_pool(loop=None):
    global pool
    if pool is None:
        pool = asyncpg.create_pool(**config.DATABASE, loop=loop)
    return pool

Task1:

pool = get_pool()
await pool  # init

Task2:

pool = get_pool()
await pool  # init

I get an AssertionError with asyncpg 0.15 and an InternalClientError with asyncpg 0.16:

  File "asyncpg/pool.py", line 356, in _async__init__
    await first_ch.connect()
  File "asyncpg/pool.py", line 118, in connect
    assert self._con is None

Both tasks try to connect the pool's PoolConnectionHolders to ensure that min_size connections will be ready.

  1. Task1 creates the pool
  2. Task1 awaits the pool, which starts ensuring min_size connections in _async__init__
  3. Task2 gets the same pool object
  4. Task2 sees that initialization is not done
  5. Task1 finishes its await
  6. Task2 awaits _async__init__ to ensure min_size again and fails
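The interleaving above can be simulated with plain asyncio, using a stub pool in place of asyncpg's real one (this only illustrates the race; it is not asyncpg's actual code):

```python
import asyncio

class FakePool:
    """Stub mimicking Pool._async__init__ / PoolConnectionHolder.connect."""
    def __init__(self):
        self._initialized = False
        self._holder_con = None

    async def _async_init(self):
        if self._initialized:
            return self
        await asyncio.sleep(0)           # yield: the other task also sees "not initialized"
        assert self._holder_con is None  # stands in for `assert self._con is None` in pool.py
        self._holder_con = object()
        self._initialized = True
        return self

    def __await__(self):
        return self._async_init().__await__()

async def main():
    pool = FakePool()

    async def task():
        await pool  # both tasks await the same pool object

    return await asyncio.gather(task(), task(), return_exceptions=True)

results = asyncio.run(main())
errors = [r for r in results if isinstance(r, AssertionError)]
print(len(errors))  # the second task trips the assertion
```

The first task passes the assertion and connects; the second, having already seen "not initialized", re-runs the connect step after the holder is filled and fails.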

Previously I tried to use the [cached-property](https://github.com/pydanny/cached-property) module, and this use case breaks there too, because for async properties it caches the Future and does not take any additional locks.
And if initialization fails, the exception is cached as well.
This is why I don't await the pool when creating it, only when I need a connection.

A simple workaround is to set min_size to 0: acquire, then connect. Connection acquisition is safe.


spumer commented Jun 28, 2018

Another workaround is to create a lock, but for a global function that is currently awkward because I need to pass the loop.
Of course, I can create a temporary lock and move initialization into my get_pool.

Like this:

import asyncio

pool = None
_lock = None

async def get_pool(loop=None):
    global pool, _lock

    if pool is None:
        if _lock is None:
            _lock = asyncio.Lock(loop=loop)

        lock = _lock
        async with lock:
            if pool is None:  # re-check under the lock; now a race is impossible
                pool = await asyncpg.create_pool(**config.DATABASE, loop=loop)
        _lock = None  # the lock is only needed during initialization

    return pool
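This double-checked pattern can be exercised without a database by swapping in a stub factory for asyncpg.create_pool (the stub, its names, and the call count are illustrative; on modern Python, asyncio.Lock no longer takes a loop argument):

```python
import asyncio

calls = 0

async def create_pool_stub():
    """Hypothetical stand-in for asyncpg.create_pool: counts invocations."""
    global calls
    calls += 1
    await asyncio.sleep(0)
    return object()

pool = None
_lock = None

async def get_pool():
    global pool, _lock
    if pool is None:
        if _lock is None:
            _lock = asyncio.Lock()
        async with _lock:
            if pool is None:  # re-check under the lock
                pool = await create_pool_stub()
    return pool

async def main():
    # Many tasks race to initialize; the lock serializes them.
    return await asyncio.gather(*(get_pool() for _ in range(10)))

pools = asyncio.run(main())
print(calls, len(set(pools)))  # the factory runs once; every task gets the same pool
```

The inner re-check is what makes the pattern safe: only the first task to take the lock runs the factory, and every later task finds the pool already set.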

I think setting min_size to 0 is simpler for my use case.


I hope you fix this race condition to allow simpler usage.


elprans commented Jul 3, 2018

Your two examples are not equivalent. Looking at the initial example, you are essentially attempting to await on the same pool object in concurrent tasks. The recommended usage is to await asyncpg.create_pool(), which you seem to do in your second example. Generally, pool initialization is best done in some init task which you do not run concurrently, so as to avoid these kinds of problems.
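A sketch of that recommended shape, again with a hypothetical stub in place of asyncpg.create_pool: initialize once in a single task, then hand the finished pool to workers, so no worker ever awaits the pool object itself:

```python
import asyncio

async def create_pool_stub():
    """Hypothetical stand-in for `await asyncpg.create_pool(...)`."""
    await asyncio.sleep(0)
    return {"ready": True}

async def main():
    # Initialize the pool once, before spawning any concurrent work.
    pool = await create_pool_stub()

    async def worker(n):
        # Workers receive the fully initialized pool and just use it.
        return pool["ready"]

    return await asyncio.gather(*(worker(i) for i in range(3)))

flags = asyncio.run(main())
print(flags)
```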


1st1 commented Jul 5, 2018

I concur with @elprans: this is an anti-pattern, and while the exception should be nicer, adding locking mechanisms is out of scope for asyncpg. The same problem will arise if you try to share the same Connection object between two racing tasks.

The workaround you propose in #320 (comment) can be implemented in your code and I think it's a good enough solution.

1st1 added a commit that referenced this issue Jul 5, 2018
Raise an InterfaceError if:

* Two or more tasks try to initialize the same Pool object concurrently;

* A pool method, such as fetchval(), is called while the Pool is being
  initialized.

See also issue #320.

elprans commented Jul 10, 2018

Fixed in #325.

elprans closed this as completed Jul 10, 2018