Updated workflows to store pytest runtimes as test artifacts #2448

Merged
merged 8 commits into main on Jun 28, 2021

Conversation


@angela97lin angela97lin commented Jun 28, 2021

@angela97lin angela97lin self-assigned this Jun 28, 2021
codecov bot commented Jun 28, 2021

Codecov Report

Merging #2448 (694c7f0) into main (301c09f) will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff          @@
##            main   #2448   +/-   ##
=====================================
  Coverage   99.7%   99.7%           
=====================================
  Files        283     283           
  Lines      25500   25500           
=====================================
  Hits       25400   25400           
  Misses       100     100           

@angela97lin angela97lin changed the title Store pytest duration in junit xml file Store pytest runtimes as test artifacts Jun 28, 2021
@angela97lin angela97lin changed the title Store pytest runtimes as test artifacts Updated workflows to store pytest runtimes as test artifacts Jun 28, 2021
@@ -3,6 +3,7 @@ python_files = evalml/tests/*
filterwarnings =
ignore::DeprecationWarning
ignore::PendingDeprecationWarning
+ junit_duration_report = call
angela97lin (Contributor, Author):

This flag is set so that we don't report setup/teardown time, just the call time: https://docs.pytest.org/en/6.2.x/reference.html#confval-junit_duration_report
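For reference, a minimal sketch (in Python, standard library only) of how these call-only durations could be read back out of one of the generated junit XML reports. The report path and the "ten slowest" cutoff are just examples for illustration, not anything prescribed by this PR:

# Minimal sketch: read per-test call durations from a pytest junit XML report.
# With junit_duration_report = call, each <testcase> "time" attribute covers
# only the test call phase (setup/teardown excluded).
import xml.etree.ElementTree as ET

def load_durations(junit_path):
    """Return {"classname::name": seconds} for every recorded testcase."""
    tree = ET.parse(junit_path)
    return {
        f'{case.get("classname")}::{case.get("name")}': float(case.get("time", 0.0))
        for case in tree.getroot().iter("testcase")
    }

if __name__ == "__main__":
    # Example path; in CI this would be the downloaded test-reports artifact.
    durations = load_durations("test-reports/git-test-other-core-junit.xml")
    # Print the ten slowest tests, slowest first.
    for name, seconds in sorted(durations.items(), key=lambda kv: -kv[1])[:10]:
        print(f"{seconds:8.2f}s  {name}")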

Makefile Outdated

.PHONY: git-test-other-core
git-test-other-core:
- pytest evalml/tests --ignore evalml/tests/automl_tests/ --ignore evalml/tests/tuner_tests/ --ignore evalml/tests/model_understanding_tests/ -n 2 --durations 100 --timeout 300 --doctest-modules --cov=evalml --junitxml=test-reports/junit.xml --doctest-continue-on-failure --has-minimal-dependencies
+ pytest evalml/tests --ignore evalml/tests/automl_tests/ --ignore evalml/tests/tuner_tests/ --ignore evalml/tests/model_understanding_tests/ -n 2 --durations 100 --timeout 300 --doctest-modules --cov=evalml --junitxml=test-reports/git-test-other-core-junit.xml --doctest-continue-on-failure --has-minimal-dependencies
angela97lin (Contributor, Author):

Maybe we don't need it for core tests? 🤷‍♀️

Contributor:

Why not though?

angela97lin (Contributor, Author):

Hehe, I guess theoretically there shouldn't be too much of a difference in runtime for core tests that isn't captured in our non-core tests, unless catboost / xgboost / other dependencies are doing something fishy. But touché, why not when it doesn't cost us much? 😛

@chukarsten chukarsten (Collaborator) left a comment:

LGTM! Thanks for fixing my copypasta lol

@freddyaboulton freddyaboulton (Contributor) left a comment:

Super cool @angela97lin! Curious what you think the next step is? Maybe you could share some of the scripts you wrote a while ago to parse the XMLs into more human-readable formats?

Makefile Outdated

.PHONY: git-test-automl-core
git-test-automl-core:
- pytest evalml/tests/automl_tests evalml/tests/tuner_tests -n 2 --ignore=evalml/tests/automl_tests/dask_tests --durations 100 --timeout 300 --doctest-modules --cov=evalml --junitxml=test-reports/junit.xml --doctest-continue-on-failure --has-minimal-dependencies
+ pytest evalml/tests/automl_tests evalml/tests/tuner_tests -n 2 --ignore=evalml/tests/automl_tests/dask_tests --durations 100 --timeout 300 --doctest-modules --cov=evalml --junitxml=test-reports/git-test-automl-core-junit.xml --doctest-continue-on-failure --has-minimal-dependencies
Contributor:

Thoughts on setting durations to 0 so that we record every test? Before I knew we could do something cool like this, I limited durations to 100 because I didn't want to flood the console lol

angela97lin (Contributor, Author):

I like it! 😁

angela97lin (Contributor, Author):

@freddyaboulton For sure! I filed #2453 with some of my initial scrappy code, but it's definitely worth having the scripts in the repo so they're accessible to the whole team whenever we want to debug unit test runtimes. I also think it could be super cool if, during a job, we could calculate the difference between the runtime on that branch and the current artifact on main and raise a warning / flag if the runtime significantly increased, but that might be asking for a little bit more 😝
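For what it's worth, a rough sketch of what that branch-vs-main comparison could look like, again just reading the junit XML artifacts. The file names, the 1.5x slowdown factor, and the 1-second floor are all hypothetical choices for illustration, not anything agreed on in this PR:

# Rough sketch: compare per-test call durations from two pytest junit XML
# reports (e.g. this branch's report vs. the artifact downloaded from main)
# and flag tests that got noticeably slower. Thresholds are arbitrary examples.
import xml.etree.ElementTree as ET

def load_durations(junit_path):
    """Return {"classname::name": seconds} for every recorded testcase."""
    tree = ET.parse(junit_path)
    return {
        f'{case.get("classname")}::{case.get("name")}': float(case.get("time", 0.0))
        for case in tree.getroot().iter("testcase")
    }

def flag_slowdowns(main_xml, branch_xml, slowdown_factor=1.5, min_seconds=1.0):
    """Yield (test, main_seconds, branch_seconds) for tests that slowed down."""
    main = load_durations(main_xml)
    branch = load_durations(branch_xml)
    for test, branch_seconds in branch.items():
        main_seconds = main.get(test)
        if main_seconds is None or branch_seconds < min_seconds:
            continue  # new test, or too fast to be worth flagging
        if branch_seconds > slowdown_factor * main_seconds:
            yield test, main_seconds, branch_seconds

if __name__ == "__main__":
    # Example file names; in CI these would come from the uploaded artifacts.
    for test, before, after in flag_slowdowns("main-junit.xml", "branch-junit.xml"):
        print(f"WARNING: {test} went from {before:.2f}s to {after:.2f}s")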

@angela97lin angela97lin merged commit 3d9163d into main Jun 28, 2021
@angela97lin angela97lin deleted the 2396_junit branch June 28, 2021 23:20
@dsherry dsherry mentioned this pull request Jul 2, 2021

Successfully merging this pull request may close these issues.

Record runtime of each unit test as a test artifact