Fix conda test failing #3607
Conversation
…evalml into 3600-Undo-3.9-Testskips
Codecov Report
@@ Coverage Diff @@
## main #3607 +/- ##
=======================================
- Coverage 99.7% 99.7% -0.0%
=======================================
Files 335 335
Lines 33503 33505 +2
=======================================
+ Hits 33382 33383 +1
- Misses 121 122 +1
Continue to review full report at Codecov.
# No prophet, ARIMA, and vowpalwabbit
expected_components = all_requirements_set.difference(not_supported_in_conda)

elif is_using_windows and not is_running_py_39_or_above:
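The snippet above computes the components expected in a conda build by subtracting the packages without conda support. A minimal sketch of that logic follows; the helper name, flag, and package names are illustrative assumptions, not evalml's actual code:

```python
def expected_conda_components(all_requirements_set, is_running_conda):
    """Hypothetical helper: return components expected in the current env.

    Per the review discussion, prophet, ARIMA (pmdarima), and vowpalwabbit
    have no conda builds, so they are excluded during a conda test run.
    """
    # Illustrative exclusion set, mirroring the comment in the diff above
    not_supported_in_conda = {"prophet", "pmdarima", "vowpalwabbit"}
    if is_running_conda:
        return all_requirements_set.difference(not_supported_in_conda)
    return set(all_requirements_set)


reqs = {"sklearn", "prophet", "pmdarima", "vowpalwabbit", "lightgbm"}
print(sorted(expected_conda_components(reqs, is_running_conda=True)))
# → ['lightgbm', 'sklearn']
```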
if the issue was conda - can we test dropping these windows + py 39 cases?
Probably! I did it earlier but wasn't sure if that was the right approach. I'll change it.
If the purpose of your previous PR was to increase coverage for py39 (and these test cases still pass under those scenarios), I would say to drop those cases but keep the conda case, because that one is still needed. Thank you!
LGTM, thanks for the quick fix! @chukarsten will need you to merge due to coverage.
evalml/tests/conftest.py
Outdated
@@ -55,6 +55,10 @@ def pytest_configure(config):
    "markers",
    "skip_during_conda: mark test to be skipped if running during conda build",
)
config.addinivalue_line(
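The diff above registers a custom marker via `config.addinivalue_line`. For context, here is a self-contained sketch of how such a marker is typically registered and enforced in a `conftest.py`; the `CONDA_BUILD` environment check is an assumption for illustration, not evalml's exact detection logic:

```python
# conftest.py (sketch, not evalml's actual file)
import os

import pytest


def pytest_configure(config):
    # Register the marker so pytest (with --strict-markers) accepts it
    config.addinivalue_line(
        "markers",
        "skip_during_conda: mark test to be skipped if running during conda build",
    )


def pytest_collection_modifyitems(config, items):
    # Hypothetical detection: assume conda-build exposes CONDA_BUILD in the env
    if os.environ.get("CONDA_BUILD"):
        skip = pytest.mark.skip(reason="skipped during conda build")
        for item in items:
            if "skip_during_conda" in item.keywords:
                item.add_marker(skip)
```

A test would then opt in with `@pytest.mark.skip_during_conda` above its definition.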
do we need this still?
oops I should've caught that, thanks!
docs/source/release_notes.rst
Outdated
@@ -12,7 +12,7 @@ Release Notes
* Documentation Changes
* Testing Changes
    * Pinned GraphViz version for Windows CI Test :pr:`3596`
-   * Removed ``pytest.mark.skip_if_39`` pytest marker :pr:`3602`
+   * Removed ``pytest.mark.skip_if_39`` pytest marker :pr:`3607`
I would list both PRs for history: "Removed ``pytest.mark.skip_if_39`` pytest marker :pr:`3602`, :pr:`3607`"
Thanks for jumping on this!
Pull Request Description
I changed some things in component_tests/test_utils.py, and when pushed to main it made the conda build fail with this error:
"FAILED evalml/tests/component_tests/test_utils.py::test_all_components - Asse...
WARNING:conda_build.build:Tests failed for evalml-0.54.0-hd8ed1ab_2.tar.bz2 - moving package to /home/conda/feedstock_root/build_artifacts/broken
3300
TESTS FAILED: evalml-0.54.0-hd8ed1ab_2.tar.bz2".
So I just reverted the changes I made to test_utils.py and conftest.py, and hopefully that fixes it.
After creating the pull request: in order to pass the release_notes_updated check you will need to update the "Future Release" section of docs/source/release_notes.rst to include this pull request by adding :pr:`123`.