Avoid pytest.warns(None) #8718

Merged
jcrist merged 5 commits into dask:main on Mar 11, 2022

Conversation

@LSturtew (Contributor)

Addresses issue #8673.

@GPUtester (Collaborator)

Can one of the admins verify this patch?

@jcrist (Member) left a comment

Thanks for working on this. I've marked a few places in your changes where the effect of the test has been removed (the test was there to assert that no warnings were raised); we'll want to fix those before merging.

There are other places where you added a specific warning class to with pytest.warns (e.g. with pytest.warns(RuntimeWarning)). Unless it's a case where we're testing that a specific warning is raised, the usage of with pytest.warns(None) was to silence a warning (usually one raised by a library dependency). Switching this to e.g. pytest.warns(RuntimeWarning) changes the meaning of the test: it now asserts that that specific warning is raised, and it will fail if the library dependency changes its implementation. We'll want to revert these changes and use the following pattern instead:

# Replace
with pytest.warns(None) as record:
    ...
assert not record

# with this
with warnings.catch_warnings(record=True) as record:
    ...
assert not record

# Replace
with pytest.warns(None):
    ...

# with this
with warnings.catch_warnings(record=True):
    ...

This silences all warnings that would be shown in the block and doesn't require applying an explicit filter. It's also nice and uniform across all the places where with pytest.warns(None) was used before.
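
For anyone unfamiliar with the pattern, here's a minimal standalone sketch of both variants (the quiet/noisy helper functions and the warning category are illustrative, not from this PR):

import warnings

def quiet():
    pass  # stand-in for code that should not warn

def noisy():
    warnings.warn("third-party chatter", UserWarning)  # stand-in for a dependency's warning

# Assert that no warnings are raised:
with warnings.catch_warnings(record=True) as record:
    warnings.simplefilter("always")  # make sure nothing is filtered out before it is recorded
    quiet()
assert not record

# Silence warnings without asserting anything about them:
with warnings.catch_warnings(record=True):
    noisy()  # the warning is appended to an (unused) log instead of being shown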

Comment on lines 4990 to 5003
-with pytest.warns(None) as record:
+with warnings.catch_warnings():
     x = da.ones((3, 10, 10), chunks=(3, 2, 2))
     da.map_blocks(lambda y: np.mean(y, axis=0), x, dtype=x.dtype, drop_axis=0)
-assert not record

-with pytest.warns(None) as record:
+with warnings.catch_warnings():
     x = da.ones((15, 15), chunks=(5, 5))
     (x.dot(x.T + 1) - x.mean(axis=0)).std()
-assert not record

-with pytest.warns(None) as record:
+with warnings.catch_warnings():
     x = da.ones((1,), chunks=(1,))
     1 / x[0]
-assert not record

@jcrist (Member)

Tests like these will need to be changed differently: they were there to assert that no warning was raised, so dropping the asserts makes this test not test anything.

GitHub won't let me make a suggestion here, but I believe the following should work:

def test_no_warnings_from_blockwise():
    with warnings.catch_warnings(record=True) as record:
        x = da.ones((3, 10, 10), chunks=(3, 2, 2))
        da.map_blocks(lambda y: np.mean(y, axis=0), x, dtype=x.dtype, drop_axis=0)
    assert not record

    with warnings.catch_warnings(record=True) as record:
        x = da.ones((15, 15), chunks=(5, 5))
        (x.dot(x.T + 1) - x.mean(axis=0)).std()
    assert not record

    with warnings.catch_warnings(record=True) as record:
        x = da.ones((1,), chunks=(1,))
        1 / x[0]
    assert not record

     X[idx].compute()
-    assert len(rec) == 0

@jcrist (Member)

Same here.

     result = arr[indexer]
-    assert len(e) == 0

@jcrist (Member)

Same here.

     result = arr[indexer]
-    assert len(e) == 0  # no

@jcrist (Member)

Same here.

     b.take(7, warn=False)
-    assert len(rec) == 0

@jcrist (Member)

Same here.

     meta_nonempty(ser)

-    assert len(w) == 0

@jcrist (Member)

Same here.

     prof.visualize(show=False, save=False)

-    assert len(record) == 0

@jcrist (Member)

Same here.

     p = rprof.visualize(show=False, save=False)
-    assert len(record) == 0

@jcrist (Member)

Same here.

     cprof.visualize(show=False, save=False)

-    assert len(record) == 0

@jcrist (Member)

Same here.

@@ -56,15 +56,15 @@ def test_nanarg_reductions(dfunc, func):
     assert_eq(dfunc(a), func(x))
     assert_eq(dfunc(a, 0), func(x, 0))
     with pytest.raises(ValueError):
-        with pytest.warns(None):  # All NaN axis
+        with pytest.warns(RuntimeWarning):  # All NaN axis

@jcrist (Member)

This changes the meaning of the test to asserting that a RuntimeWarning is raised (which we don't actually care about). pytest.warns(None) was used before to silence all warnings, not to test anything specific about them. See my top-level comment for a recommended fix.
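
Concretely, the fix here would look something like the sketch below; allnan_argmax is a hypothetical stand-in for the dask call that both warns and raises, since the real body of the block is not shown in the diff above:

import warnings

import pytest

def allnan_argmax():
    # Hypothetical stand-in: warns like numpy's all-NaN reductions do, then raises.
    warnings.warn("All-NaN axis encountered", RuntimeWarning)
    raise ValueError("All-NaN slice encountered")

with pytest.raises(ValueError):
    with warnings.catch_warnings(record=True):  # silence the warning; assert nothing about it
        allnan_argmax()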

@LSturtew (Contributor, Author)

@jcrist Thank you for reviewing. I have implemented your requested changes.

@jsignell (Member)

It looks like this is really close @LSturtew! There is just one failing test: dask/dataframe/io/tests/test_hdf.py::test_to_fmt_warns

@jsignell (Member) left a comment

Based on the failure on CI, I suspect that this:

with warnings.catch_warnings():

doesn't actually do anything unless it is followed by a

warnings.simplefilter("ignore", ...)

To test this out, I just ran this locally:

import warnings
import dask.array as da

inc = lambda x: x + 1
x = da.ones(10, chunks=(5,))
with warnings.catch_warnings():
    y = da.atop(inc, "i", x, "i", dtype=x.dtype)

# /home/julia/dask/dask/array/blockwise.py:281: UserWarning: The da.atop function has moved to da.blockwise
#   warnings.warn("The da.atop function has moved to da.blockwise")

That behavior differs from pytest.warns(None), which just swallows all the warnings.
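
If it helps, the difference is easy to see in isolation (the warning messages here are purely illustrative):

import warnings

# catch_warnings() alone only saves and restores the filter state;
# the warning inside the block is still shown.
with warnings.catch_warnings():
    warnings.warn("still printed")

# Adding an explicit filter suppresses it entirely.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    warnings.warn("suppressed")

# record=True routes warnings into a list instead of showing them.
with warnings.catch_warnings(record=True) as record:
    warnings.warn("captured, not printed")
assert len(record) == 1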

@jsignell (Member)

There aren't actually that many warnings in the tests though, so I suspect we can get rid of lots of these warning catchers.

@jcrist (Member)

jcrist commented Feb 22, 2022

Yeah, you need to pass in record=True; see my comment above.

...

# Replace
with pytest.warns(None):
    ...

# with this
with warnings.catch_warnings(record=True):
    ...

...

@jsignell (Member)

Yeah, you need to pass in record=True; see my comment above.

🤦 totally missed that

@pavithraes (Member)

@LSturtew Looks like this PR is really close! Would you be able to address the final comments?

@jsignell (Member)

jsignell commented Mar 9, 2022

I'm actually taking a look now since I think this ended up being a little more involved than we had thought. I'm planning on pushing some little changes today.

@jsignell requested a review from jcrist on March 9, 2022 at 21:53
@jcrist merged commit e715a4d into dask:main on Mar 11, 2022
@pavithraes mentioned this pull request on May 23, 2022