
isel with 0d dask array index fails #4276

Closed
johnomotani opened this issue Jul 27, 2020 · 0 comments · Fixed by #5873

What happened:
If a 0d dask array is passed as an argument to isel(), an error occurs because dask arrays do not have a .item() method. I came across this when trying to use the result of argmax() on a dask-backed DataArray to select from that DataArray.

What you expected to happen:
isel() returns the value at the index contained in the 0d dask array.

Minimal Complete Verifiable Example:

import dask.array as daskarray
import numpy as np
import xarray as xr

a = daskarray.from_array(np.linspace(0., 1.))
da = xr.DataArray(a, dims="x")

x_selector = da.argmax(dim=...)

da_max = da.isel(x_selector)
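For anyone hitting this before a fix lands, a possible workaround (a sketch, assuming argmax(dim=...) returns a dict of 0d dask-backed DataArrays as in the example above) is to compute the indices before passing them to isel(), so that isel() sees numpy-backed 0d arrays, which do have .item():

```python
import dask.array as daskarray
import numpy as np
import xarray as xr

a = daskarray.from_array(np.linspace(0.0, 1.0))
da = xr.DataArray(a, dims="x")

# argmax(dim=...) returns a dict mapping each dimension name to a
# 0d dask-backed DataArray; computing each index first sidesteps
# the missing .item() on dask arrays.
x_selector = {dim: idx.compute() for dim, idx in da.argmax(dim=...).items()}

da_max = da.isel(x_selector)  # the maximum value, 1.0
```

Note this triggers a (small) compute of the index arrays, so it loses laziness for the selector itself but keeps the selected result dask-backed.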

Anything else we need to know?:

I think the problem is here

key = tuple(
    k.data.item() if isinstance(k, Variable) and k.ndim == 0 else k
    for k in key
)

Replacing it with k.values.item() or int(k.data) would fix my issue, but I don't know the reason for using .item() in the first place, so I'm not sure whether either of these would have some undesirable side-effect.
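For illustration, the int(k.data) variant can be seen on a bare 0d dask array (a sketch; at the time of this report dask arrays lacked .item(), while int() on a 0d dask array triggers a small compute and yields a plain Python int, which is what the indexing code ultimately needs):

```python
import dask.array as daskarray
import numpy as np

# A 0d dask array, like the index isel() receives after argmax(dim=...):
k = daskarray.from_array(np.array(49))

# int() computes the 0d array and returns a plain Python int.
idx = int(k)
```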

May be related to #2511, but from the code snippet above, I think this is a specific issue of 0d dask arrays rather than a generic dask-indexing issue like #2511.

I'd like to fix this because it breaks the nice new features of argmin() and argmax() if the DataArray is dask-backed.

Environment:

Output of xr.show_versions()

INSTALLED VERSIONS

commit: None
python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50)
[GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 5.4.0-42-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.4

xarray: 0.16.0
pandas: 1.0.5
numpy: 1.18.5
scipy: 1.4.1
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: None
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.19.0
distributed: 2.21.0
matplotlib: 3.2.2
cartopy: None
seaborn: None
numbagg: None
pint: 0.13
setuptools: 49.2.0.post20200712
pip: 20.1.1
conda: 4.8.3
pytest: 5.4.3
IPython: 7.15.0
sphinx: None

dcherian added a commit to bzah/xarray that referenced this issue Jun 24, 2022
Pass 0d dask arrays through for indexing.
dcherian added a commit that referenced this issue Mar 15, 2023
* Attempt to fix indexing for Dask

This is a naive attempt to make `isel` work with Dask

Known limitation: it triggers the computation.

* Works now.

* avoid importorskip

* More tests and fixes

* Raise nicer error when indexing with boolean dask array

* Annotate tests

* edit query tests

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fixes #4276

Pass 0d dask arrays through for indexing.

* Add xfail notes.

* backcompat: vendor np.broadcast_shapes

* Small improvement

* fix: Handle scalars properly.

* fix bad test

* Check computes with setitem

* Better error

* Cleanup

* Raise nice error with VectorizedIndexer and dask.

* Add whats-new

---------

Co-authored-by: dcherian <deepak@cherian.net>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Deepak Cherian <dcherian@users.noreply.github.com>