
DEPR: Change .ix DeprecationWarning -> FutureWarning #26438

Merged: 2 commits merged into pandas-dev:master on May 19, 2019

Conversation

@jorisvandenbossche (Member) commented May 17, 2019

closes #15152
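For context, the practical difference this PR targets can be sketched with the stdlib warnings module alone: CPython's default filters hide DeprecationWarning from code outside `__main__`, while FutureWarning is always shown to end users. A minimal, hypothetical illustration (the `simplefilter` calls simulate the interpreter defaults; the messages are made up):

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    # Simulate CPython's default behaviour: DeprecationWarning is hidden
    # for code running inside a library, FutureWarning is always shown.
    warnings.simplefilter("always", FutureWarning)
    warnings.simplefilter("ignore", DeprecationWarning)
    warnings.warn(".ix is deprecated", DeprecationWarning)  # end users never see this
    warnings.warn(".ix is deprecated", FutureWarning)       # end users do see this

# Only the FutureWarning survives the default-style filters.
assert [w.category for w in caught] == [FutureWarning]
```

This is why promoting the `.ix` warning to FutureWarning makes it visible to ordinary users, not just developers running with `-W` flags.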

@codecov codecov bot commented May 17, 2019

Codecov Report

Merging #26438 into master will decrease coverage by <.01%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master   #26438      +/-   ##
==========================================
- Coverage   91.73%   91.72%   -0.01%     
==========================================
  Files         174      174              
  Lines       50741    50741              
==========================================
- Hits        46548    46544       -4     
- Misses       4193     4197       +4
Flag Coverage Δ
#multiple 90.23% <ø> (ø) ⬆️
#single 41.7% <ø> (-0.09%) ⬇️
Impacted Files Coverage Δ
pandas/core/indexing.py 90.53% <ø> (ø) ⬆️
pandas/io/gbq.py 78.94% <0%> (-10.53%) ⬇️
pandas/core/frame.py 97.02% <0%> (-0.12%) ⬇️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1263e1a...3c30184. Read the comment docs.


@jorisvandenbossche (Member, Author) commented May 17, 2019

Checked some of the logs, and I don't think there are any remaining uncaught warnings.

@@ -255,6 +255,7 @@ Other API Changes
Deprecations
~~~~~~~~~~~~

- The deprecated ``.ix[]`` indexer now raises a more visible FutureWarning instead of DeprecationWarning (:issue:`26438`).
@jreback (Contributor) May 18, 2019

Use double backticks on ``FutureWarning`` and ``DeprecationWarning``. I would consider making this a sub-section note and/or adding it to the headlines to make it more visible.

@@ -1137,40 +1137,40 @@ def test_ix_align(self):
df = df_orig.copy()

with catch_warnings(record=True):
@jreback (Contributor) May 18, 2019
I think we should change these to assert_produces_warning(FutureWarning), as it's more consistent with our style (could be a follow-up too).
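For readers unfamiliar with the helper: pandas ships assert_produces_warning in its testing utilities. A simplified, stdlib-only sketch of what such a context manager does (this is an illustration, not pandas' actual implementation, which also checks stacklevel and messages):

```python
import warnings
from contextlib import contextmanager

@contextmanager
def assert_produces_warning(expected):
    """Sketch of a pandas-style warning assertion: fail unless the
    wrapped block emits at least one warning of the expected category."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")  # record everything, filters off
        yield
    if not any(issubclass(w.category, expected) for w in caught):
        raise AssertionError(f"did not see expected {expected.__name__}")

# Passes: the body emits the expected FutureWarning.
with assert_produces_warning(FutureWarning):
    warnings.warn(".ix is deprecated", FutureWarning)
```

The point of the suggestion above is that such a test fails loudly if the warning is later removed or its category changes, unlike a bare catch_warnings block that silently swallows whatever happens.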

@simonjayhawkins (Member) May 18, 2019

I'm not so sure. Shouldn't we only require assert_produces_warning for a test that specifically tests that the warning is raised (preferably with a message check), and use @pytest.mark.filterwarnings for all the tests where it's merely encountered?
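The pytest marker mentioned here applies the same filter specification syntax as Python's -W option, scoped to one test. A hypothetical sketch, with the pytest form shown in comments and the equivalent stdlib mechanics actually executed (the test name is made up):

```python
import warnings

# pytest form (not executed here):
#
#     @pytest.mark.filterwarnings("ignore::FutureWarning")
#     def test_something_that_touches_ix():
#         ...
#
# The marker installs a warning filter for the duration of that test,
# just like the stdlib filter below.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    warnings.filterwarnings("ignore", category=FutureWarning)
    warnings.warn(".ix is deprecated", FutureWarning)  # silenced

assert caught == []  # the FutureWarning was filtered out, not recorded
```

The trade-off under discussion: the marker silences a known warning without asserting anything about it, so it suits tests that merely *encounter* the warning, while assert_produces_warning suits tests that *verify* it.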

@jreback (Contributor) May 18, 2019

We are inconsistent in the code base for sure, but I think we mostly use assert_produces_warning, meaning that if we suddenly took out the warning (or changed it), these tests would fail. Checking the text of the message is maybe OK in a single case (otherwise it is overkill).

@simonjayhawkins (Member) May 18, 2019

Yeah, I think we do agree, i.e. that with catch_warnings should not be used and should be replaced with either assert_produces_warning or @pytest.mark.filterwarnings. Maybe just a different viewpoint on which, and how often, each is used?

@jorisvandenbossche (Member, Author) May 18, 2019

Whatever the discussion / decision about what is best here (for new code), I don't think it is worth updating code that we won't touch until we remove it in some months / next year.

On the actual discussion: I think the pytest filterwarnings marker is indeed the way to go for those nowadays (certainly if you need to apply it to a lot of existing cases), in addition to a few tests with assert_produces_warning that actually test the warning.

@simonjayhawkins (Member) May 18, 2019

> Whatever the discussion / decision what is best here (for new code), I don't think it is worth updating code that we won't touch until we remove it in some months / next year.

Agreed. As @jreback said, it could be a follow-up; I don't mind looking into that.

@jorisvandenbossche (Member, Author) May 18, 2019

> could be a follow-up. I don't mind looking into that.

You are free to spend your time how you want, of course ;-) but I personally don't think it is worth the time. We will be removing this code shortly anyhow. There are plenty of other testing-related things that are more useful IMO.

And I wanted to suggest something like documenting our standard for what to do in those cases, but it is actually already quite well documented: http://pandas-docs.github.io/pandas-docs-travis/development/contributing.html#testing-warnings (exactly as you suggested it above).

@simonjayhawkins (Member) May 18, 2019

I'll need to check, but from memory I think .ix is used in a few tests to generate the expected result for testing the other indexers.

So maybe the first step would be to eliminate these (it would need to be done before the .ix removal anyway). This would allow removing some of the catch_warnings blocks.

I would guess that in theory it should be possible to remove all .ix occurrences from test files outside the test_ix module.

@jreback (Contributor) May 18, 2019
Yep, that would be great; best to just use iloc for the expected.
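A sketch of that follow-up, using a small hypothetical DataFrame: wherever a test builds its expected value via the positional/label hybrid .ix, a purely positional .iloc expression can produce the same value without emitting any warning at all:

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

# Before (pandas < 1.0; after this PR the line emits a FutureWarning):
#     expected = df.ix[0, "a"]
# After: purely positional via iloc, no deprecated indexer involved.
expected = df.iloc[0, df.columns.get_loc("a")]
assert expected == 1
```

With the expected values built this way, the surrounding catch_warnings wrappers become unnecessary for those tests.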

@jreback jreback merged commit 5a286e4 into pandas-dev:master May 19, 2019
11 checks passed
@jreback (Contributor) commented May 19, 2019
