
Fix pipeline caching for older versions of joblib #687

Merged: 12 commits merged into scikit-learn-contrib:master on Jun 8, 2020

Conversation

@chkoar (Member) commented Feb 20, 2020

Reference Issue

Fixes #685
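
For context, a minimal sketch of what pipeline caching looks like from the user side, assuming the usual imblearn.pipeline.Pipeline API with a joblib Memory object passed via the memory parameter (the estimators, step names, and data below are illustrative, not taken from this PR):

```python
from tempfile import mkdtemp

from joblib import Memory
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline

X, y = make_classification(weights=[0.9, 0.1], random_state=0)

# Recent joblib expects Memory(location=...); older releases used a
# cachedir argument/attribute instead, which is the kind of version
# difference this fix presumably has to account for.
memory = Memory(location=mkdtemp(), verbose=0)

pipe = Pipeline(
    steps=[
        ("scale", StandardScaler()),
        ("smote", SMOTE(random_state=0)),
        ("clf", LogisticRegression()),
    ],
    memory=memory,  # fitted transformers/samplers are cached between fits
)
pipe.fit(X, y)
```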

@codecov (bot) commented Feb 20, 2020

Codecov Report

Merging #687 into master will increase coverage by 0.08%.
The diff coverage is 100.00%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #687      +/-   ##
==========================================
+ Coverage   96.37%   96.45%   +0.08%     
==========================================
  Files          82       82              
  Lines        4989     4999      +10     
==========================================
+ Hits         4808     4822      +14     
+ Misses        181      177       -4     
Impacted Files                                Coverage Δ
imblearn/utils/_show_versions.py              100.00% <ø> (ø)
imblearn/pipeline.py                          97.91% <100.00%> (+3.97%) ⬆️
imblearn/tests/test_pipeline.py               97.68% <100.00%> (+0.03%) ⬆️
imblearn/utils/tests/test_show_versions.py    100.00% <100.00%> (ø)
imblearn/ensemble/tests/test_forest.py        100.00% <0.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@pep8speaks commented Feb 21, 2020

Hello @chkoar! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-06-08 18:30:19 UTC

@@ -37,7 +37,6 @@ jobs:
   PANDAS_VERSION: '*'
   TEST_DOCSTRINGS: 'true'
   JOBLIB_VERSION: '*'
-  CHECK_WARNINGS: 'true'
@glemaitre (Member), Feb 21, 2020

Why this change?

@chkoar (Member, Author), Feb 21, 2020

As a test, because I was getting errors about FutureWarnings. I believe those checks should only run in cron-based tasks, because otherwise we get errors that are unrelated to the PRs. What do you think?

@glemaitre (Member), Feb 21, 2020

We run those in scikit-learn. It allows us to catch the warnings and ensure that you don't introduce new ones by calling deprecated stuff. If it happens in a test, you can always filter the specific warning.
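
For example (a minimal sketch, not code from this PR or from scikit-learn), pytest can silence one specific, known warning for a single test while the rest of the suite keeps treating warnings as errors; the warning message below is a placeholder:

```python
import warnings

import pytest


# Ignore one specific warning for this test only, via the marker.
# The message text is a placeholder, not a real warning from imblearn.
@pytest.mark.filterwarnings("ignore:this specific message:FutureWarning")
def test_touches_deprecated_behaviour():
    warnings.warn("this specific message", FutureWarning)


# Alternatively, filter it explicitly inside the test body.
def test_filters_inline():
    with warnings.catch_warnings():
        warnings.filterwarnings(
            "ignore", message="this specific message", category=FutureWarning
        )
        warnings.warn("this specific message", FutureWarning)
```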

@chkoar (Member, Author), Feb 22, 2020

So, what do you propose? I think the main issue is not treating deprecation and future warnings as errors per se. We should run those checks periodically against the latest version of scikit-learn; otherwise we get errors that are irrelevant to the PRs. So we could test against the latest released version of scikit-learn and even a previous one. The latest could be tested on a schedule. Or something like that.

imblearn/pipeline.py: outdated review thread (resolved)
@chkoar (Member, Author) commented Feb 28, 2020

This PR could be merged after #691

@glemaitre merged commit f21bdea into scikit-learn-contrib:master on Jun 8, 2020
3 of 13 checks passed