
Task was destroyed but it is pending! #7

Closed
rrooggiieerr opened this issue Jun 10, 2024 · 14 comments · Fixed by #8 or #16

Comments

@rrooggiieerr

I've developed a handful of HA integrations that communicate over the serial port. I'm currently migrating the libraries used by these integrations from synchronous pyserial to asynchronous communication.

First I migrated to pyserial-asyncio, and after finding out about pyserial-asyncio-fast I used that as a drop-in replacement, since it seems to be the recommended library to use.

I'm currently working on my XY Screens projector screen integration. My unit tests pass when using pyserial-asyncio, but when using pyserial-asyncio-fast I get the following error:

base_events.py:1821 Task was destroyed but it is pending!
task: <Task pending name='Task-52' coro=<SerialTransport._call_connection_lost() done, defined at /Users/rogier/Library/Python/3.12/lib/python/site-packages/serial_asyncio_fast/__init__.py:448> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/futures.py:387, Task.task_wakeup()]>>

The only difference in the code is that I changed `import serial_asyncio` to `import serial_asyncio_fast as serial_asyncio`.

@rrooggiieerr
Author

I have created a test script which reproduces the error.

Switch between serial_asyncio and serial_asyncio_fast by (un)commenting the appropriate line. With serial_asyncio you don't get the error; with serial_asyncio_fast you do.

Change _SERIAL_PORT to a valid path to a serial port.

import asyncio
# import serial_asyncio
import serial_asyncio_fast as serial_asyncio

_SERIAL_PORT = "/dev/tty.usbserial-110"

async def main():
    # Open the connection
    _, writer = await serial_asyncio.open_serial_connection(
        url=_SERIAL_PORT,
    )

    # Close the connection.
    writer.close()

if __name__ == "__main__":
    try:
        loop = asyncio.new_event_loop()
        loop.run_until_complete(main())
    finally:
        loop.close()

@bdraco
Member

bdraco commented Jun 10, 2024

self._loop.create_task(self._call_connection_lost(exc))

It's a bug that no strong reference is being held to the task here.

There are probably other occurrences as well.
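The fix bdraco is describing is the general asyncio pattern for fire-and-forget tasks: keep a strong reference so the event loop's weak reference is not the only one and the task cannot be garbage-collected before it runs. A minimal sketch of that pattern (the names `background_tasks` and `cleanup` are illustrative, not taken from the library):

```python
import asyncio

# Module-level set holding strong references to in-flight tasks.
background_tasks = set()

async def cleanup():
    # Stand-in for work like _call_connection_lost().
    await asyncio.sleep(0)

async def main():
    task = asyncio.get_running_loop().create_task(cleanup())
    background_tasks.add(task)                        # strong reference
    task.add_done_callback(background_tasks.discard)  # drop it when done
    await task
    await asyncio.sleep(0)  # give the done-callback a cycle to run
    return len(background_tasks)

print(asyncio.run(main()))  # 0 - the callback removed the finished task
```

Without the `background_tasks.add(task)` line, nothing but the loop's internal weak reference keeps the task alive, which is what allows it to be collected while still pending.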

@bdraco
Member

bdraco commented Jun 10, 2024

Probably the timings being different and the event loop no longer being blocked are the reason it can now be garbage collected sooner.

@bdraco
Member

bdraco commented Jun 10, 2024

The fix in aiolifx/aiolifx#70 could be adapted and reused if you are interested in doing a PR to fix this

@rrooggiieerr
Author

@bdraco, is my fix sufficient for you?

@bdraco
Member

bdraco commented Jun 14, 2024

> Should that fix also be applied to
> https://github.com/pyserial/pyserial-asyncio/blob/13694c5a09188faf15f3ae08eae3772f3a707a68/serial_asyncio/__init__.py#L398 ?

Yes, but serial_asyncio seems abandoned, which is why this lib was created.

@bdraco bdraco closed this as completed in #8 Jun 14, 2024
@cottsay

cottsay commented Jul 13, 2024

@bdraco, I don't think this was properly addressed. Stashing a reference to the task doesn't mean that it gets executed; it just prevents it from getting destroyed. In the example code, the task still hasn't been run when the program exits, and the serial connection still hasn't actually been closed.

The difference in behavior started with pyserial#82, which was merged but not released in pyserial-asyncio. Before that change, the Transport's call to Protocol.connection_lost() was followed by a call to Serial.close() without ever yielding, but pushing the Serial.close() call to the executor means that other stuff can happen between the call to Protocol.connection_lost() and when the underlying serial.Serial object is actually closed. In the example, the "other stuff" is closing the event loop without ever actually executing the task, so it's destroyed (hence the warning).

Previously, await writer.wait_closed() would never finish unless the underlying call to Serial.close() actually happened (or an error occurred), but the aforementioned change (which is included in this package) breaks that promise.

I'd really like to see this issue reopened.
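The scenario cottsay describes, a scheduled task being destroyed because the loop is closed before it ever executes, can be reproduced with plain asyncio and no serial hardware. A minimal sketch (the coroutine name is illustrative; it stands in for SerialTransport._call_connection_lost):

```python
import asyncio

async def deferred_close():
    # Stand-in for the transport's connection-lost cleanup; never runs.
    return "closed"

loop = asyncio.new_event_loop()
# Schedule the task, then close the loop before it ever executes,
# mirroring the reproduction script in this issue.
task = loop.create_task(deferred_close())
loop.close()
print(task.done())  # False - still pending when the loop is gone
```

When the pending task is garbage-collected, asyncio logs exactly the warning in this issue's title: "Task was destroyed but it is pending!".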

@puddly
Collaborator

puddly commented Jul 14, 2024

We ran into this a moment ago. I think pushing close to the executor can be removed: the blocking path was only hit with socket:// URIs. We now special case those and do not use pyserial for connecting to TCP servers.

@bdraco bdraco reopened this Jul 14, 2024
@puddly
Collaborator

puddly commented Jul 14, 2024

Test out #16. I believe it should solve this issue.

@puddly
Collaborator

puddly commented Jul 16, 2024

@rrooggiieerr I've spent some more time digging here and I think your test case should be amended:

      # Close the connection.
      writer.close()
+     await writer.wait_closed()

close only asks the transport to start closing. The fact that pyserial-asyncio did so synchronously is an implementation detail (and a bug for non-POSIX platforms). The transport emits a connection_lost event when the transport is finally closed, which is what wait_closed waits for. #16 should ensure that this event is emitted on time.
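The close/wait_closed contract puddly describes is the standard asyncio streams contract, so it can be seen with a plain TCP stream as well. A sketch using a throwaway local server (assuming localhost sockets are available) in place of a serial port:

```python
import asyncio

async def main():
    # Throwaway local TCP server so no serial hardware is needed.
    async def handle(reader, writer):
        writer.close()
        await writer.wait_closed()

    server = await asyncio.start_server(handle, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    _, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.close()              # only *requests* that the transport close
    await writer.wait_closed()  # resumes once connection_lost() has fired
    server.close()
    await server.wait_closed()
    return writer.is_closing()

print(asyncio.run(main()))  # True
```

The same two-step shape is what the amended test case above uses: `close()` starts the teardown, `wait_closed()` is the only reliable way to know it finished.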

@rrooggiieerr
Author

@puddly, I think you are right.

@rrooggiieerr
Author

@cottsay, thanks for diving into this. Your analysis looks valid.
