Refactor component test for different environments #2957
Codecov Report
@@ Coverage Diff @@
## main #2957 +/- ##
=======================================
+ Coverage 99.7% 99.7% +0.1%
=======================================
Files 307 307
Lines 29257 29265 +8
=======================================
+ Hits 29166 29174 +8
Misses 91 91
…lml into 2882_refactor_comp_env_tests
name: Nightly unit tests, linux
on:
  schedule:
This file was updated just for testing purposes. Will revert before merging! :)
name: Nightly unit tests, windows
on:
  schedule:
Same with this file, updated just for testing purposes. Will revert before merging! :)
.PHONY: git-test-automl-core
git-test-automl-core:
	pytest evalml/tests/automl_tests evalml/tests/tuner_tests -n 2 --ignore=evalml/tests/automl_tests/parallel_tests --durations 0 --timeout 300 --doctest-modules --cov=evalml --junitxml=test-reports/git-test-automl-core-junit.xml --has-minimal-dependencies
As mentioned in review, I found some more unnecessary calls to --doctest-modules. Cleaning them up here, though it's unrelated to this PR.
…lml into 2882_refactor_comp_env_tests
eccabay
left a comment
Love this, it's going to make adding/changing components so much easier!
expected_components = [
    component
    for component in minimum_dependencies_list + requirements_list
    if component not in not_supported_in_conda
]
Nitpick suggestion: what if we used sets for these instead of lists? Membership checks on a list are linear, so a set-based version would be faster (and hopefully no less readable), something like:
expected_components = minimum_dependencies_list.union(requirements_list).difference(not_supported_in_conda)
As a sanity check I did a quick timeit run comparing the two: with n=10000, the list version took 1.6 seconds and the set version took 0.1!
To make it even faster and more readable, we could also save minimum_dependencies_list + requirements_list (or minimum_dependencies_list.union(requirements_list)) as its own variable, say all_requirements_list or something.
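A minimal sketch of the set-based version, with the intermediate variable pulled out. The component names here are made up purely for illustration; in the real test these collections come from the requirements files:

```python
# Hypothetical component names, just to illustrate the set operations.
minimum_dependencies = {"Imputer", "OneHotEncoder", "RandomForestClassifier"}
requirements = {"RandomForestClassifier", "LightGBMClassifier", "XGBoostClassifier"}
not_supported_in_conda = {"LightGBMClassifier"}

# Union first, then drop what conda can't install.
# Set membership/difference is hash-based, so this avoids the O(n)
# list scans the original comprehension performs per element.
all_requirements = minimum_dependencies.union(requirements)
expected_components = all_requirements.difference(not_supported_in_conda)

print(sorted(expected_components))
```

Note that sets also deduplicate components that appear in both requirements lists, which the list-concatenation version does not.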
Yup yup, this is a great suggestion. Will update 😁
freddyaboulton
left a comment
@angela97lin Looks good to me! I think keeping track of the expected components with a list will make the error more informative if the test ever fails, so although it's more verbose I think it's worth it.
I think @eccabay's suggestion to use sets is worth considering, especially since there won't be duplicates.
Closes #2882.
Unfortunately, codecov/patch will majorly fail since none of our cases (conda/windows/py39) are covered by codecov and I've changed those lines :')