
Ignore NumPy warnings in compute_meta #5103

Merged
jcrist merged 4 commits into dask:master from pentschev:ignore-compute-meta-warnings
Jul 29, 2019

Conversation

@pentschev
Member

        return None
    except Exception:
        np.seterr(**np_err)
        return None
Member


You might consider using the finally: block here

try:
    ...
except:
    ...
finally:
    np.seterr(**np_err)

The finally block is run regardless of whether or not there was an error.
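To illustrate the pattern being suggested, here is a runnable sketch (my illustration, not the PR's actual code) showing that `finally` restores the error settings whether the body succeeds or raises:

```python
import numpy as np

# Sketch: save the current floating-point error settings, then restore
# them in `finally` so they are reset even if an exception propagates
# out of the try block.
np_err = np.seterr(all="ignore")  # returns the *previous* settings
try:
    result = np.float64(1.0) / np.float64(0.0)  # would warn under defaults
except Exception:
    result = None
finally:
    np.seterr(**np_err)  # runs whether or not an exception was raised

assert np.geterr() == np_err  # original settings are back
```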

Member Author


Sure, but there's more code after this block and we still want to keep warnings disabled until just before the last return statement.

Member


Instead of handling this ourselves, we could also just use:

with np.errstate(all="ignore"):
    ...

which handles setting and resetting for you.
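As a concrete sketch of this suggestion (my illustration, not code from the PR): `np.errstate` is a context manager that applies the given settings on entry and restores the previous ones on exit, even if the body raises.

```python
import numpy as np

# np.errstate sets the floating-point error handling on entry and
# restores the previous settings on exit.
saved = np.geterr()
with np.errstate(all="ignore"):
    result = np.float64(1.0) / np.float64(0.0)  # silently returns inf

assert np.geterr() == saved  # previous settings restored automatically
```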

Member


+1

Member Author


Thanks for the suggestion @jcrist, it's indeed a much cleaner solution!

Fixed in latest commits.

@mrocklin
Member

Minor issue, but it looks like the black check failed. @pentschev you may want to add the pre-commit hooks to run black automatically when committing.

# from the root dask directory
pip install pre-commit
pre-commit install

@pentschev
Member Author

The error here happens because the failing test doesn't expect a RuntimeWarning, but one is raised in the __array_function__-enabled build only, and it's not coming from the _meta computation. I've spent quite some time trying to figure out why that happens, but I can't reproduce it locally no matter what.

@jrbourbeau
Member

@pentschev I wasn't able to reproduce the failure locally either, so I restarted the failing Travis build, which is now passing ¯\_(ツ)_/¯

@pentschev
Member Author

Damn, I've been very unlucky lately with non-reproducible behavior. Thanks for triggering the build again @jrbourbeau!

@jcrist
Member

jcrist commented Jul 29, 2019

LGTM, merging.

@jcrist jcrist merged commit aa5d4ac into dask:master Jul 29, 2019
@pentschev
Member Author

Thanks for the reviews everyone, and thanks for merging @jcrist!

@estebanag
Contributor

I've just noticed that the following generates a warning (using dask version 2.3.0):

import dask.array as da
import numpy as np

data0 = da.zeros((3, 10, 10), chunks=(3, 2, 2))
data1 = da.map_blocks(lambda x: np.mean(x, axis=0), data0, dtype=data0.dtype, drop_axis=0)

Output:

numpy/core/fromnumeric.py:3257: RuntimeWarning: Mean of empty slice

Was this supposed to be silenced as well? This is not a problem with version 1.2.2.

Thanks!

Details:

dask==2.3.0
numpy==1.17.0
pkg-resources==0.0.0
toolz==0.10.0
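For context, here is a minimal standalone reproduction of that warning outside dask (my sketch): `np.mean` over an axis of length zero emits "Mean of empty slice", which is presumably what happens when the mapped function is evaluated against an empty meta array.

```python
import warnings
import numpy as np

# np.mean reducing over an axis with 0 elements warns; an empty meta
# array passed to the lambda above would hit exactly this case.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    np.mean(np.empty((0, 2, 2)), axis=0)  # all-NaN result, shape (2, 2)

assert any("Mean of empty slice" in str(w.message) for w in caught)
```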



Development

Successfully merging this pull request may close these issues.

Bogus divide by zero warning when using dask array scalars

5 participants