Fix from_thread.run(_sync)? not setting sniffio on asyncio #524
Conversation
Force-pushed 448018b to 7a445f6
Force-pushed 7a445f6 to c18e9ea
(rebased onto master to fix the pre-commit isort failure)
This also fixes a deadlock when using start_blocking_portal("asyncio") on a non-asyncio async backend known to sniffio. Fixes agronholm#523.
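As background for why sniffio detection comes up empty here, a minimal stdlib-only sketch (no anyio involved; the `probe` helper is hypothetical): sniffio's asyncio detection generally relies on `asyncio.current_task()`, and a callback scheduled with `call_soon_threadsafe` runs directly on the event loop outside any task, so there is nothing for task-based detection to find unless the portal sets sniffio's context variable explicitly.

```python
import asyncio


def probe(results: list) -> None:
    # Runs directly on the event loop, not inside a Task, so
    # asyncio.current_task() returns None here -- which is why
    # task-based backend detection fails for such callbacks.
    results.append(asyncio.current_task())


async def main() -> list:
    loop = asyncio.get_running_loop()
    results: list = []
    loop.call_soon_threadsafe(probe, results)
    await asyncio.sleep(0)  # yield once so the scheduled callback runs
    return results


print(asyncio.run(main()))  # -> [None]
```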
Force-pushed 49ccd94 to afd2053
Force-pushed 0da45b0 to bf35657
CI is passing, but setting
Force-pushed bf35657 to 41b500e
Let's get this done for v4.0!
tests/test_from_thread.py (Outdated)
```python
@pytest.fixture
def portal_backend_name(portal_backend: Any) -> str:
    return anyio_backend_name.__wrapped__(portal_backend)  # type: ignore[attr-defined]


@pytest.fixture
def portal_backend_options(portal_backend: Any) -> dict[str, Any]:
    return anyio_backend_options.__wrapped__(  # type: ignore[attr-defined]
        portal_backend
    )
```
Do the tests specifically need to replicate the exact options for each backend? If not, you could just parametrize the tests based on `anyio.get_all_backends()`.
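The suggestion might look roughly like this (a sketch, not the final diff; `ALL_BACKENDS` is a hardcoded stand-in for `anyio.get_all_backends()`, and the test body is elided):

```python
import pytest

# Stand-in for anyio.get_all_backends(); the real fixture would call it directly.
ALL_BACKENDS = ("asyncio", "trio")


@pytest.mark.parametrize("portal_backend_name", ALL_BACKENDS)
@pytest.mark.parametrize("anyio_backend_name", ALL_BACKENDS)
def test_from_async(anyio_backend_name: str, portal_backend_name: str) -> None:
    """Every (outer backend, portal backend) pair becomes its own test case."""
```

Stacking two `parametrize` marks yields the full cross product of backend pairs without hand-maintaining the combinations.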
sounds good! yeah, this parameterization is probably overly defensive. i suspect it's super unlikely[^1] that e.g. `from_thread.run(requires_sniffio)` could simultaneously work between two CPython asyncio threads (a test case covered by `get_all_backends()`) but not work between e.g. a CPython asyncio thread and a uvloop asyncio thread. will implement your suggestion.
[^1]: and maybe impossible, given how sniffio is agnostic to CPython asyncio vs. uvloop asyncio
the parametrization is now

```
TestBlockingPortal::test_from_async[asyncio-asyncio]
TestBlockingPortal::test_from_async[asyncio-trio]
TestBlockingPortal::test_from_async[asyncio+uvloop-asyncio]
TestBlockingPortal::test_from_async[asyncio+uvloop-trio]
TestBlockingPortal::test_from_async[trio-asyncio]
TestBlockingPortal::test_from_async[trio-trio]
```

which is asymmetric, but that should be fine.
tests/test_from_thread.py (Outdated)
```python
@pytest.mark.timeout(1)
async def test_from_async(
    self,
    anyio_backend_name: str,
    portal_backend_name: str,
    portal_backend_options: dict[str, Any],
) -> None:
    """Test that portals don't deadlock when started/used from async code."""

    if anyio_backend_name == "trio" and portal_backend_name == "trio":
        pytest.xfail("known bug (#525)")

    with start_blocking_portal(
        portal_backend_name, portal_backend_options
    ) as portal:
        portal.call(checkpoint)
```
This is an annoying problem. Perhaps this could be removed from this PR, to be figured out later? I don't want `pytest-timeout` to block this from being merged.
I think I'll just sidestep the issue of pytest-timeout for now and rely on the timeout we have on GitHub. I want this merged for v3.7.0.
sounds good. the only side effect of removing the timeout decorator here is that the pytest run will time out if this test fails, but as long as folks (continue to) interpret a pytest timeout as a test failure then it shouldn't be too bad in the interim.
That's just for GitHub Actions, right? Locally it would just hang forever, yes?
pytest has its own per-test timeout by default, IIRC
Really? I haven't seen any such thing. All my searches point to `pytest-timeout` as the provider of any sort of a timeout effect.
At any rate, if you could remove `pytest-timeout` for the time being, and make the other change I suggested, I could merge this and get it into v3.7.0. I think this is the last fix slated for that minor release.
i will double check; i faintly recall trio CI having weird failures on PyPy caused by pytest's timeout fault handler segfaulting, but my memory is flaky there.
right, i misremembered this a bit. pytest has no per-test timeout by default; however, one can set `faulthandler_timeout` to enable one. if a test takes longer than that, though, pytest doesn't fail it; it just dumps tracebacks. regardless, i'll implement your suggestion here; we can figure out what's going on with pytest-timeout autoloading later.
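For reference, that built-in behavior is enabled with an ini option like the following (assuming pytest's bundled faulthandler plugin); exceeding the timeout dumps the tracebacks of all threads but does not fail the test:

```ini
# pytest.ini
[pytest]
faulthandler_timeout = 20
```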
Thanks!
With `pytest.mark.xfail`, the test is still run and will emit xpass if/when it stops failing, i.e. it will detect when the `mark.xfail` should be removed. In contrast, `pytest.xfail` immediately force-skips the test, skipping the xpass check. The only remaining use of `pytest.xfail` (in `TestBlockingPortal.test_from_async`) needs to remain a `pytest.xfail` until someone gets around to agronholm#524 (comment) because it will deadlock otherwise.
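The distinction reads roughly like this in code (hypothetical test names; behavior per pytest's documented xfail semantics):

```python
import pytest


@pytest.mark.xfail(reason="known bug (#525)")
def test_marked():
    # Still executed: reports XFAIL while it fails, and XPASS once fixed,
    # so a stale marker gets noticed.
    assert False


def test_imperative():
    # Raises an XFailed outcome immediately; the rest of the body never
    # runs, so there is no XPASS signal once the bug is fixed.
    pytest.xfail("deadlocks otherwise")
    assert False  # unreachable
```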
closes #523.