Support parallel tests in verbose progress reporting #2223

Merged: nateberkopec merged 2 commits into puma:master on Apr 20, 2020

Conversation

@wjordan (Contributor) commented Apr 14, 2020

Adds a Minitest plugin to fix verbose progress reporting when running parallel tests.

The default verbose progress reporter interleaves results when tests are executed in parallel threads (e.g., in test classes containing parallelize_me!), making the printed output useless because the durations and result codes no longer correspond to the correct test names. This has made it impossible to debug slow or hanging tests in the CI output and has probably caused all sorts of confusion.

The plugin fixes this by printing a line when another test starts in parallel, and by re-printing the test name along with the result when the test finishes.

For example, a CI failure on this PR had a test hang, causing the CI action to time out after 10 minutes. The test log shows that the culprit is TestIntegrationSingle#test_siginfo_thread_print, since the line TestIntegrationSingle#test_siginfo_thread_print = … showing the test started never had a matching line showing the test finished.
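To make the mechanism concrete, here is a minimal sketch of a reporter along these lines (the class name, plugin name, and registration hook are illustrative, not the code merged in this PR). Every line it prints is self-contained, so output from parallel threads stays attributable even when lines interleave:

require 'minitest'

module Minitest
  # Sketch only: a verbose-style reporter that prints one complete line when a
  # test starts and another complete line (name repeated) when it finishes.
  class VerboseParallelReporter < Reporter
    def prerecord(klass, name)
      io.puts "#{klass}##{name} = ..." # test started; its result line comes later
    end

    def record(result)
      io.puts format("%s#%s = %.2f s = %s",
                     result.klass, result.name, result.time, result.result_code)
    end
  end

  # Hypothetical plugin init hook: Minitest calls plugin_<name>_init for each
  # minitest/<name>_plugin.rb file it discovers when loading plugins.
  def self.plugin_verbose_parallel_init(options)
    reporter << VerboseParallelReporter.new(options[:io], options) if options[:verbose]
  end
end

In a real setup the default ProgressReporter's verbose output would presumably be suppressed or replaced as well, so the two reporters don't double-print.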

Your checklist for this pull request

  • I have reviewed the guidelines for contributing to this repository.
  • I have added an entry to History.md if this PR fixes a bug or adds a feature. If it doesn't need an entry to History.md, I have added [changelog skip] to the pull request title.
  • I have added appropriate tests if this PR fixes a bug or adds a feature.
  • My pull request is 100 lines added/removed or less so that it can be easily reviewed.
  • If this PR doesn't need tests (docs change), I added [ci skip] to the title of the PR.
  • If this closes any issues, I have added "Closes #issue" to the PR description or my commit messages.
  • I have updated the documentation accordingly.
  • All new and existing tests passed, including Rubocop.

@MSP-Greg (Member) commented:

LGTM

I was pretty sure it would work, but a check would be easy to do.

I added the PR to my fork and then added the seed value from the failed macOS 2.5 job. All the MRI jobs failed except the Ruby master jobs, and one outlier, the Ubuntu 2.2 job. Also, all the 'required' non-MRI jobs passed. Interesting.

A quick look at the logs seems to show two different causes for the failures, I think.

See https://github.com/MSP-Greg/puma/runs/587110718

test/helper.rb (outdated diff):
@@ -11,6 +11,8 @@
end
end

$LOAD_PATH.unshift('test')
Review comment (Member):

Can we omit this? I prefer not to touch the load path and to leave it to the user to set it up correctly.

Reply from @wjordan (Contributor Author):

The load-path change was so the class would be auto-loaded by Minitest's load_plugins logic. Updated to load the plugin manually in e299737.
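For reference, a sketch of what manual loading could look like (the file path and plugin name are hypothetical, and this is not necessarily what e299737 does). Minitest normally discovers minitest/*_plugin.rb files on the load path; the manual route is to require the file yourself and add its name to the registry that Minitest.load_plugins would otherwise populate:

# In test/helper.rb, assuming the plugin file sits at test/minitest/verbose_parallel_plugin.rb
require_relative 'minitest/verbose_parallel_plugin' # hypothetical path

# Register the plugin so Minitest calls plugin_verbose_parallel_init at startup.
(Minitest.extensions ||= []) << 'verbose_parallel'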

@nateberkopec added the waiting-for-review label on Apr 14, 2020
@nateberkopec (Member) commented:

pinging @zenspider to get thoughts + opinion on approach + upstream potential

@zenspider commented:

Counter patch incoming...

@wjordan (Contributor Author) commented Apr 14, 2020

A direct patch of ProgressReporter could look something like this: wjordan/minitest@0fb53d3
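For readers without the commit handy, a rough illustration of that shape (sketched from the surrounding discussion, not copied from the linked commit) is to reopen the built-in reporter so its verbose output only ever emits self-contained lines:

# Sketch: in verbose mode, print a full line at test start and a full line
# (with the name repeated) at test end, instead of the default split
# "Name = " ... "0.12 s = ." pair that interleaves badly under parallel runs.
class Minitest::ProgressReporter
  def prerecord(klass, name)
    io.puts "#{klass}##{name} = ..." if options[:verbose]
  end

  def record(result)
    if options[:verbose]
      io.puts format("%s#%s = %.2f s = %s",
                     result.klass, result.name, result.time, result.result_code)
    else
      io.print result.result_code
    end
  end
end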

@zenspider commented:

I'm on irc and slack.seattlerb.org right now if you want to discuss realtime.

@zenspider commented Apr 17, 2020

I talked to Aaron a bit today and I no longer think that any change needs to be made (here or in minitest). Verbose mode is an opt-in flag and it just doesn't make sense when run in parallel. Instead, use MT_CPU=1 when you run your tests to serialize the run. Then there are no problems with IO, you still get to debug your slow test situation, and there doesn't need to be extra output or extra mutex wrangling.

ETA: if you do insist on such a thing (ie, tests only fail in parallel), then I would say verbose mode isn't doing you right and there are other better ways to figure out what's going on.
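As a concrete usage example of that suggestion (the rake target and TESTOPTS flag are assumptions about how the suite is driven, not taken from this thread):

# One-off serialized, verbose run for debugging a hang; MT_CPU controls the
# number of threads Minitest's parallel executor uses.
MT_CPU=1 bundle exec rake test TESTOPTS=-v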

@wjordan (Contributor Author) commented Apr 17, 2020

> Verbose mode is an opt-in flag and it just doesn't make sense when run in parallel.

If this is the case, that particular combination of options should probably raise an argument error instead of producing incorrect output.

> if you do insist on such a thing (ie, tests only fail in parallel), then I would say verbose mode isn't doing you right and there are other better ways to figure out what's going on.

If you have something in mind, can you be more specific? The custom verbose-parallel reporter in this PR is the 'best' approach I've been able to come up with without sacrificing performance by dropping down to a single thread; if something better already exists, that could be used instead.

@nateberkopec merged commit 24adaa0 into puma:master on Apr 20, 2020
Labels: maintenance, waiting-for-review

4 participants