asyncio races on responses #195

Closed
eprst opened this issue Sep 12, 2022 · 1 comment
eprst commented Sep 12, 2022

It is possible to construct a rapid sequence of MPD commands that confuses the asyncio client about response order. The attached example most often makes it parse a status response while it expects an idle response, which leads to a crash. Occasionally I saw the reverse, where idle output arrives while the client expects status output, but that is much rarer.

version: python-mpd2 3.0.3

Here's sample output of the script below, with some extra debugging added to the library:

pi@pi3:~/pimpd/pimpd $ /usr/bin/python3 -X dev Test2.py
Connected!
cmd:  status
cmd:  setvol
cmd:  status
cmd:  status
cmd:  setvol
cmd:  status
cmd:  idle
_parse_list:  ['volume: 16', 'repeat: 1', 'random: 0', 'single: 0', 'consume: 0', 'partition: default', 'playlist: 2', 'playlistlength: 1', 'mixrampdb: 0.000000', 'state: play', 'song: 0', 'songid: 1', 'time: 6575:36000', 'elapsed: 6575.234', 'bitrate: 192', 'duration: 36000.104', 'audio: 44100:24:2', 'nextsong: 0', 'nextsongid: 1']
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/mpd/asyncio.py", line 279, in __run
    await result._feed_from(self)
  File "/usr/lib/python3/dist-packages/mpd/asyncio.py", line 45, in _feed_from
    self._feed_line(line)
  File "/usr/lib/python3/dist-packages/mpd/asyncio.py", line 64, in _feed_line
    self.set_result(self._callback(self.__spooled_lines))
  File "/usr/lib/python3/dist-packages/mpd/asyncio.py", line 273, in <lambda>
    result = CommandResult("idle", subsystems, lambda result: self.__distribute_idle_result(self._parse_list(result)))
  File "/usr/lib/python3/dist-packages/mpd/asyncio.py", line 313, in __distribute_idle_result
    idle_changes = list(result)
  File "/usr/lib/python3/dist-packages/mpd/base.py", line 265, in _parse_list
    raise ProtocolError("Expected key '{}', got '{}'".format(seen, key))
mpd.base.ProtocolError: Expected key 'volume', got 'repeat'

Script to reproduce:

import asyncio
import logging

from mpd.asyncio import MPDClient

async def connected(client):
    print("Connected!")
    # Run the idle loop concurrently with the command loop below.
    asyncio.create_task(idle_loop(client))
    while True:
        # Issue status/setvol commands with ~0.1 s gaps; this timing most
        # often makes the client parse a status response while it expects
        # an idle response (see the traceback above).
        await asyncio.sleep(0.1)
        st = await client.status()
        volume = max(0, int(st.get('volume', 0)))
        await client.setvol(volume+1)
        await asyncio.sleep(0.09)
        await client.status()
        await asyncio.sleep(0.01)
        await client.status()
        await asyncio.sleep(0.01)
        await client.setvol(volume)
        await asyncio.sleep(0.09)
        await client.status()


async def idle_loop(client):
    # Print every idle notification; these responses race with the
    # status/setvol responses requested above.
    async for s in client.idle():
        print("Idle change in", s)


async def main():
    client = MPDClient()
    await client.connect("127.0.0.1", 6600)
    await connected(client)


asyncio.run(main())
yakshaver2000 pushed a commit to yakshaver2000/python-mpd2 that referenced this issue Feb 9, 2023
When two commands were issued exactly 0.1 seconds apart, then
`mpd.asyncio.MPDClient.__run` would incorrectly send an "idle" command
to the server, and then attempt to parse the server's response to the
second command as an "idle" response, causing errors like

```
mpd.base.ProtocolError: Expected key 'volume', got 'repeat'
```

as described by Mic92#195, and hangs as described by
Mic92#173.

The root of the problem is that wrapping `asyncio.Queue.get` in
`asyncio.wait_for` can result in a `TimeoutError` exception even when
the queue is not empty, as demonstrated by the following example
(tested with python 3.9.2 and 3.11.2):

```python
import asyncio

TIMEOUT = 0.1

async def get_from_queue(queue):
    try:
        await asyncio.wait_for(
            queue.get(),
            timeout=TIMEOUT
        )
    except asyncio.exceptions.TimeoutError:
        # This is counterintuitive: The "get" operation has timed out,
        # but the queue is not empty!
        assert not queue.empty()
    else:
        # This block is never executed.
        assert False

async def main():
    queue = asyncio.Queue()
    task = asyncio.create_task(get_from_queue(queue))
    await asyncio.sleep(TIMEOUT)
    queue.put_nowait(1)
    await task

asyncio.run(main())
```
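A minimal sketch of one way to avoid dropping the item, not necessarily the approach taken in the referenced commit: `asyncio.wait`, unlike `asyncio.wait_for`, does not cancel the pending task when its timeout expires, so the `get()` task can be kept around and its result is never lost.

```python
import asyncio

TIMEOUT = 0.1

async def get_from_queue(queue):
    # asyncio.wait() leaves the task running when the timeout expires,
    # so a put() that races with the timeout is still delivered to the
    # get() task instead of being dropped.
    get_task = asyncio.create_task(queue.get())
    done, _pending = await asyncio.wait({get_task}, timeout=TIMEOUT)
    if get_task in done:
        return get_task.result()
    # Timed out: the task was not cancelled, so awaiting it again
    # (here, or on a later iteration) still yields the queued item.
    return await get_task

async def main():
    queue = asyncio.Queue()
    task = asyncio.create_task(get_from_queue(queue))
    await asyncio.sleep(TIMEOUT)
    queue.put_nowait(1)
    assert await task == 1  # the item is received, not dropped

asyncio.run(main())
```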
yakshaver2000 added a commit to yakshaver2000/python-mpd2 that referenced this issue Feb 9, 2023
Mic92 (Owner) commented Feb 18, 2023

Has been fixed.

Mic92 closed this as completed Feb 18, 2023