

"let _ = foo();" is differently effectful from just "foo();" #2735

bblum opened this Issue · 4 comments

2 participants


See #2734. In the sample code there, which segfaults, removing the characters "let _ =" makes the segfault go away. No idea why; perhaps it lets the destructor call be optimised away?

@catamorphism catamorphism was assigned

I'll see if this is still a bug now.

@catamorphism catamorphism added a commit that closed this issue
@catamorphism catamorphism Test for issue 2735
This probably doesn't test the actual bug, but the fix for
issue 2734 probably camouflages the actual bug (since the
effect of the #2734 test case is now "do nothing observable"
rather than "segfault").

Closes #2735
@bblum bblum reopened this

This is still up in the air. The following code prints "First" then "Second", but without the "let _ =", it prints "Second" then "First".

class defer {
    f: fn@();
    new(f: fn@()) { self.f = f; }
    drop { self.f(); }
}

fn main() {
    let _ = do defer {
        // body truncated in the original; per the comment above it
        // prints "First" when the destructor runs, and main then
        // prints "Second"
    };
}
I'm not sure if the answer is "Well, too bad, there are no guarantees on when destructors get run", but I want to add this defer thing to libcore, and can't in good conscience as long as its behaviour is unpredictable.
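The defer idea above translates directly to current Rust via the Drop trait. A minimal sketch, assuming modern Rust (the names Defer, defer, and run are illustrative, not from libcore; today crates such as scopeguard provide the same pattern):

```rust
use std::cell::RefCell;

// A guard that runs a closure when it goes out of scope,
// mirroring the old `class defer` with its `drop` block.
struct Defer<F: FnMut()> {
    f: F,
}

impl<F: FnMut()> Drop for Defer<F> {
    fn drop(&mut self) {
        (self.f)();
    }
}

fn defer<F: FnMut()>(f: F) -> Defer<F> {
    Defer { f }
}

// Demonstrates the ordering bblum describes: the deferred closure
// fires only when the guard leaves scope, so "Second" is logged first.
fn run() -> Vec<&'static str> {
    let log = RefCell::new(Vec::new());
    {
        // Binding to `_d` keeps the guard alive to the end of the block.
        let _d = defer(|| log.borrow_mut().push("First"));
        log.borrow_mut().push("Second");
    } // `_d` is dropped here, running the deferred closure.
    log.into_inner()
}

fn main() {
    assert_eq!(run(), vec!["Second", "First"]);
}
```

With a real binding the behaviour is predictable: cleanup is pinned to the end of the enclosing scope rather than to whenever the temporary happens to be destroyed.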


I have a fix for this, which I'm testing. However, for reference, to get the behaviour bblum actually wanted, you would write:

let _x = do ...

which means the destructor for the RHS won't run until the end of the enclosing scope, since _x is a perfectly cromulent identifier.
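This distinction survives in today's Rust: "let _ = e;" binds nothing, so the value is dropped at the end of that statement, while "let _x = e;" is a real binding that lives to the end of the block. A minimal sketch under that assumption (Guard and order are illustrative names):

```rust
use std::cell::RefCell;

// Records when its destructor runs, so the two `let` forms
// can be compared side by side.
struct Guard<'a> {
    log: &'a RefCell<Vec<&'static str>>,
}

impl<'a> Drop for Guard<'a> {
    fn drop(&mut self) {
        self.log.borrow_mut().push("drop");
    }
}

fn order() -> (Vec<&'static str>, Vec<&'static str>) {
    // `let _ = ...`: the wildcard pattern binds nothing, so the
    // Guard is dropped immediately, before the next statement.
    let wildcard = RefCell::new(Vec::new());
    {
        let _ = Guard { log: &wildcard };
        wildcard.borrow_mut().push("statement after");
    }

    // `let _x = ...`: `_x` is a genuine binding, so the Guard
    // lives until the end of the enclosing block.
    let named = RefCell::new(Vec::new());
    {
        let _x = Guard { log: &named };
        named.borrow_mut().push("statement after");
    }

    (wildcard.into_inner(), named.into_inner())
}

fn main() {
    let (wildcard, named) = order();
    assert_eq!(wildcard, vec!["drop", "statement after"]);
    assert_eq!(named, vec!["statement after", "drop"]);
}
```

The "drop" entry lands before "statement after" in the wildcard case and after it in the named case, which is exactly the ordering difference this thread is about.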


My opinion is that "let _ = e;" should behave just like "e;". Feel free to argue about it, but for now I'm closing this.

@catamorphism catamorphism was unassigned by bblum