
Print a reminder when `--fail-fast` is on #1134

Closed · mightyiam opened this issue Nov 30, 2016 · 11 comments

mightyiam (Contributor) commented Nov 30, 2016

When a run stops due to a failure and `--fail-fast` was provided, I'd like there to be a bold notice somewhere, perhaps at the bottom of the output, reminding me that `--fail-fast` is on, so that I never mistakenly think I have only one failing test.

sindresorhus (Member) commented Nov 30, 2016

I agree we should make it clear, maybe instead by showing the number of pending tests that didn't run?

mightyiam (Contributor, Author) commented Nov 30, 2016

That would be a welcome addition.

ThomasBem (Contributor) commented Dec 9, 2016

I would like to give this a shot. I am not sure exactly how to go about it yet though, so any pointers are appreciated.

Sindre, regarding your idea of showing pending tests: any idea how that would work? From what I have seen so far, once a test fails with the `--fail-fast` flag enabled, AVA collects the test results generated up to that point and then starts teardown.

novemberborn (Member) commented Dec 19, 2016

Hi @ThomasBem! The test process that encounters the error simply exits if `--fail-fast` is enabled. This means the counts in the `RunStatus` will never add up to `testCount`.

I think it'd be reasonable to compute a `remainingCount` when results are processed. Then the mini reporter and verbose reporter could report this number in their `finish()` implementation.
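A minimal sketch of that idea. The field names (`testCount`, `passCount`, `failCount`, `skipCount`, `todoCount`) are assumptions for illustration, not necessarily AVA's actual `RunStatus` shape:

```javascript
// Hypothetical sketch, not AVA's actual implementation: the RunStatus
// field names below are assumptions based on the discussion above.
function remainingCount(runStatus) {
	const finished =
		runStatus.passCount +
		runStatus.failCount +
		runStatus.skipCount +
		runStatus.todoCount;
	// With --fail-fast, `finished` may fall short of the planned total.
	return Math.max(0, runStatus.testCount - finished);
}

// 10 tests planned; 3 passed and 1 failed before the run was aborted.
console.log(remainingCount({
	testCount: 10,
	passCount: 3,
	failCount: 1,
	skipCount: 0,
	todoCount: 0
})); // → 6
```

A reporter's `finish()` could then print the reminder whenever this number is greater than zero.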

I'm marking this as assigned, though I appreciate it took a while for somebody to get back to you, so if you're now busy then no worries 😄

ThomasBem (Contributor) commented Dec 19, 2016

Thanks for the feedback. I'm still interested in seeing if I can figure this out and make a contribution :)

novemberborn (Member) commented Dec 19, 2016

I noticed #1158 while researching this. Whatever we do there, it's likely that we won't have determined the total test count by the time a failure occurs. Thus the error message should say something like "There are at least 5 tests remaining". There might be more, but we just don't know.

mightyiam (Contributor, Author) commented Dec 19, 2016

"`--fail-fast` is on. Any number of tests may have been skipped" would be fine as a start, no?

ThomasBem (Contributor) commented Dec 20, 2016

So I have implemented some code that should make the test reporter print what @mightyiam suggested:

"`--fail-fast` is on. Any number of tests may have been skipped"

I have added a few tests and things seem to be working alright.

But is there a way for me to run the AVA code that I now have locally as the test runner for a separate test project, just to verify that everything works as intended all the way through?

mightyiam (Contributor, Author) commented Dec 21, 2016

@ThomasBem, to install the AVA from your working directory into another project, I would run `npm pack` in the working directory and then, in the other project, run `npm install <path to package you got from npm pack>`.

Although this wouldn't replace decent tests, in my mind.
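The steps above, as a sketch (the paths and the tarball name are placeholders; yours will differ):

```shell
# In your local AVA checkout: create a tarball of the package.
cd path/to/ava        # hypothetical path to your clone
npm pack              # produces a tarball, e.g. ava-x.y.z.tgz

# In the project you want to test against: install that tarball.
cd path/to/test-project
npm install ../ava/ava-x.y.z.tgz
```

Installing from a tarball exercises the same files `npm publish` would ship, which makes it a closer end-to-end check than a symlink.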

ThomasBem (Contributor) commented Dec 23, 2016

I have posted a PR that fixes this issue under #1160. Please let me know what you guys think :)

novemberborn (Member) commented Dec 24, 2016

> But is there a way for me to run the AVA code that I now have locally as the test runner for a separate test project, just to verify that everything works as intended all the way through?

`npm link` is your friend.
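For reference, the usual `npm link` workflow looks like this (paths are placeholders):

```shell
# In your local AVA checkout: register it as a global symlink.
cd path/to/ava        # hypothetical path to your clone
npm link

# In the consuming project: point node_modules/ava at that symlink.
cd path/to/test-project
npm link ava
```

Unlike the `npm pack` approach, edits to the checkout take effect immediately, with no reinstall needed.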
