Channel context manager eats exceptions when cancelled #746

Closed
oremanj opened this issue Oct 22, 2018 · 2 comments

@oremanj (Contributor) commented Oct 22, 2018

I ran into this today:

import trio

async def produce(send_channel):
    async with send_channel:
        print("in async with send_channel")
        raise RuntimeError("kaboom")

async def test():
    send_channel, receive_channel = trio.open_memory_channel(0)
    with trio.open_cancel_scope() as scope:
        scope.cancel()
        await produce(send_channel)

trio.run(test)
print("where's the kaboom?")

Prints:

in async with send_channel
where's the kaboom?

I'm guessing this is because channel.aclose() is a checkpoint? This exception-eating seems to be the same issue as #455, or at least an overlapping one, but it was especially surprising to me, and I think it might create user confusion as people start using async with foo_channel: in more and more places.
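For anyone puzzling over the mechanics, here is a trio-free sketch of the underlying Python behavior (EatsExceptions and the KeyboardInterrupt stand-in are illustrative, not trio APIs): when `__exit__`/`__aexit__` itself raises, the new exception replaces the in-flight exception from the body, which survives only as the implicit `__context__`. In the report above, aclose() checkpoints inside an already-cancelled scope, so it raises Cancelled in place of the RuntimeError, and the cancel scope then swallows the Cancelled.

```python
class EatsExceptions:
    """Illustrative context manager, not part of trio."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, exc_tb):
        # Stand-in for aclose() hitting a checkpoint inside a cancelled
        # scope and raising Cancelled (a BaseException, like this one).
        raise KeyboardInterrupt("stand-in for trio.Cancelled")

try:
    with EatsExceptions():
        raise RuntimeError("kaboom")
except KeyboardInterrupt as exc:
    # The original error survives only as the implicit __context__.
    assert isinstance(exc.__context__, RuntimeError)
```

A cancel scope that then absorbs the KeyboardInterrupt-analogue would leave no trace of the RuntimeError at all, which is exactly the reported symptom.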

@smurfix (Contributor) commented Oct 23, 2018

It is. There's not much that Trio can do internally to fix that IMHO, but what about a

async def __aexit__(self, *tb):  # and/or a sync __exit__
    with merge_exceptions(*tb):
        foo()

context manager that packages the exception(s) raised by foo, together with the in-flight exception if one exists, into a MultiError (or whatever Python's stdlib ends up naming the thing)? At least then we'd have a standard idiom we can use; the one additional line of boilerplate won't hurt (too much) and can be checked for by a linter.

@njsmith (Member) commented Oct 23, 2018

Yeah, this is exactly the same as #455, so I'll close it to consolidate the discussion.
