```rust
#[cfg(test)]
mod tests {
    #[test]
    fn it_works() -> Result<(), String> {
        if 2 + 2 != 4 {
            Ok(())
        } else {
            Err(String::from("two plus two does not equal four"))
        }
    }
}
```
(Note the `!` in the condition, which makes the test fail.) When this fails, you get:
```console
> cargo test
   Compiling termination v0.1.0 (file:///C:/Users/steve/tmp/termination)
    Finished dev [unoptimized + debuginfo] target(s) in 1.44s
     Running target\debug\deps\termination-4b340b40460d3098.exe

running 1 test
test tests::it_works ... FAILED

failures:

---- tests::it_works stdout ----
Error: "two plus two does not equal four"
thread 'tests::it_works' panicked at 'assertion failed: `(left == right)`
  left: `1`,
 right: `0`', libtest\lib.rs:326:5
```
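The `left: 1, right: 0` panic comes from the harness converting the returned value into an exit-style code and asserting that the code is zero. A minimal sketch of that mechanism, with a stand-in conversion function (this is illustrative, not libtest's actual code):

```rust
// Stand-in for converting a test's `Result` return value into an
// exit-style code: `Ok` maps to 0, any `Err` maps to a nonzero code.
// Simplified sketch only; the real logic lives inside libtest.
fn termination_code(result: &Result<(), String>) -> i32 {
    match result {
        Ok(()) => 0,
        Err(_) => 1, // every error maps to the generic failure code 1
    }
}

fn main() {
    let failing: Result<(), String> =
        Err(String::from("two plus two does not equal four"));
    let code = termination_code(&failing);
    // The harness effectively runs `assert_eq!(code, 0)` on this value,
    // which is what produces the `left: 1, right: 0` panic shown above.
    assert_eq!(code, 1);
}
```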
```console
---- tests::it_works stdout ----
The test function returned a termination value that indicates a failure. The detailed report follows.
Error: "two plus two does not equal four"
thread 'tests::it_works' panicked at 'non-successful termination value', libtest\lib.rs:326:5
```
Or simply:

```console
---- tests::it_works stdout ----
Error: "two plus two does not equal four"
thread 'tests::it_works' panicked at 'test function returned termination value that indicates a failure', libtest\lib.rs:326:5
```
EDIT: Never mind the first suggestion; it seems like it couldn't work, because `Termination::report` couples printing the message with returning a code.
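That coupling can be pictured roughly like this. The trait and impl below are a hedged sketch (the signatures are an assumption modeled on the trait as it stood at the time, returning an `i32`; the real `Termination` trait has since changed), not the actual std/libtest definitions:

```rust
use std::fmt::Debug;

// Hedged sketch of the `Termination`-style coupling: printing the error
// and producing the failure code both happen inside `report`, so a caller
// cannot obtain the code without the message also being emitted.
trait SketchTermination {
    fn report(self) -> i32;
}

impl<E: Debug> SketchTermination for Result<(), E> {
    fn report(self) -> i32 {
        match self {
            Ok(()) => 0,
            Err(err) => {
                eprintln!("Error: {:?}", err); // printing is coupled with...
                1                              // ...returning the failure code
            }
        }
    }
}

fn main() {
    let code = Err::<(), String>(String::from("boom")).report();
    assert_eq!(code, 1);
}
```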
improve diagnostics for tests with custom return values
This is an attempt at getting the ball rolling to improve the diagnostics for test functions that return custom `impl Termination` values (see rust-lang#52436).
An alternative could be to use `eprintln!`, but including this in the panic message felt nicely consistent with how failing test assertions would be reported.
Let me know what you think!
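The panic-message approach described above can be sketched like this. The function name is made up for illustration and is not libtest API; the idea is just to fold the error's `Debug` output into the panic itself:

```rust
use std::fmt::Debug;

// Illustrative sketch: report a failing returned value by panicking with
// the error's Debug output included, so the failure reads like a failing
// assertion. `run_returning_test` is a hypothetical name, not libtest's.
fn run_returning_test<E: Debug>(result: Result<(), E>) {
    if let Err(err) = result {
        panic!(
            "test function returned termination value that indicates a failure: {:?}",
            err
        );
    }
}

fn main() {
    // A passing test's return value is accepted silently:
    run_returning_test(Ok::<(), String>(()));
    // A failing one would panic with the error text in the message, e.g.:
    // run_returning_test(Err::<(), String>(
    //     String::from("two plus two does not equal four"),
    // ));
}
```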
Consider the test above (note the `!`): when it fails, this is because it triggers this assert. Can we make this better somehow? `1 != 0` is not super helpful for learning why this test failed.