Can't read zero-size netCDF timedelta array #10310
Comments
@deepers Thanks for raising. Could you please add the output of …?
@deepers Thanks! Does it trigger the same error if you do an |
@kmuehlbauer It actually works with `xr.open_dataarray('/tmp/foo.nc', decode_timedelta=True)`.
@kmuehlbauer Actually, I think the above only works because of the lazy loading. `xr.open_dataarray('/tmp/foo.nc', decode_timedelta=True).values` has output:

---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[11], line 1
----> 1 xr.open_dataarray('/tmp/foo.nc', decode_timedelta=True).values
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/dataarray.py:815, in DataArray.values(self)
802 @property
803 def values(self) -> np.ndarray:
804 """
805 The array's data converted to numpy.ndarray.
806
(...) 813 to this array may be reflected in the DataArray as well.
814 """
--> 815 return self.variable.values
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/variable.py:516, in Variable.values(self)
513 @property
514 def values(self) -> np.ndarray:
515 """The variable's data as a numpy.ndarray"""
--> 516 return _as_array_or_item(self._data)
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/variable.py:302, in _as_array_or_item(data)
288 def _as_array_or_item(data):
289 """Return the given values as a numpy array, or as an individual item if
290 it's a 0d datetime64 or timedelta64 array.
291
(...) 300 TODO: remove this (replace with np.asarray) once these issues are fixed
301 """
--> 302 data = np.asarray(data)
303 if data.ndim == 0:
304 kind = data.dtype.kind
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/indexing.py:511, in ExplicitlyIndexed.__array__(self, dtype, copy)
506 def __array__(
507 self, dtype: np.typing.DTypeLike = None, /, *, copy: bool | None = None
508 ) -> np.ndarray:
509 # Leave casting to an array up to the underlying array type.
510 if Version(np.__version__) >= Version("2.0.0"):
--> 511 return np.asarray(self.get_duck_array(), dtype=dtype, copy=copy)
512 else:
513 return np.asarray(self.get_duck_array(), dtype=dtype)
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/indexing.py:837, in MemoryCachedArray.get_duck_array(self)
836 def get_duck_array(self):
--> 837 self._ensure_cached()
838 return self.array.get_duck_array()
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/indexing.py:834, in MemoryCachedArray._ensure_cached(self)
833 def _ensure_cached(self):
--> 834 self.array = as_indexable(self.array.get_duck_array())
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/indexing.py:791, in CopyOnWriteArray.get_duck_array(self)
790 def get_duck_array(self):
--> 791 return self.array.get_duck_array()
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/core/indexing.py:661, in LazilyIndexedArray.get_duck_array(self)
656 # self.array[self.key] is now a numpy array when
657 # self.array is a BackendArray subclass
658 # and self.key is BasicIndexer((slice(None, None, None),))
659 # so we need the explicit check for ExplicitlyIndexed
660 if isinstance(array, ExplicitlyIndexed):
--> 661 array = array.get_duck_array()
662 return _wrap_numpy_scalars(array)
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/coding/common.py:76, in _ElementwiseFunctionArray.get_duck_array(self)
75 def get_duck_array(self):
---> 76 return self.func(self.array.get_duck_array())
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/xarray/coding/times.py:664, in decode_cf_timedelta(num_timedeltas, units, time_unit)
662 with warnings.catch_warnings():
663 warnings.filterwarnings("ignore", "All-NaN slice encountered", RuntimeWarning)
--> 664 _check_timedelta_range(np.nanmin(num_timedeltas), unit, time_unit)
665 _check_timedelta_range(np.nanmax(num_timedeltas), unit, time_unit)
667 timedeltas = _numbers_to_timedelta(
668 num_timedeltas, unit, "s", "timedeltas", target_unit=time_unit
669 )
File ~/.venv/xarray-bug/lib64/python3.13/site-packages/numpy/lib/_nanfunctions_impl.py:357, in nanmin(a, axis, out, keepdims, initial, where)
352 kwargs['where'] = where
354 if type(a) is np.ndarray and a.dtype != np.object_:
355 # Fast, but not safe for subclasses of ndarray, or object arrays,
356 # which do not implement isnan (gh-9009), or fmin correctly (gh-8975)
--> 357 res = np.fmin.reduce(a, axis=axis, out=out, **kwargs)
358 if np.isnan(res).any():
359 warnings.warn("All-NaN slice encountered", RuntimeWarning,
360 stacklevel=2)
ValueError: zero-size array to reduction operation fmin which has no identity
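The last two frames show the root cause: `decode_cf_timedelta` range-checks the raw numbers with `np.nanmin`/`np.nanmax`, and NumPy's `fmin` reduction has no identity element, so it raises on zero-size input instead of returning NaN. A minimal sketch of that failure with plain NumPy, independent of xarray:

```python
import numpy as np

# np.nanmin on a float ndarray dispatches to np.fmin.reduce; fmin has no
# identity element, so a zero-size input raises ValueError rather than NaN.
empty = np.array([], dtype='float64')
try:
    np.nanmin(empty)
except ValueError as e:
    print(e)  # zero-size array to reduction operation fmin which has no identity
```

Presumably any fix needs to guard this range check when the input array is empty.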
@kmuehlbauer My original example runs fine with your fix, but a slightly different case fails with a different error. Note the shape of the array below. This particular error seems specific to the SciPy netCDF backend; if I install netCDF4, this runs fine.

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.array([[]], dtype='timedelta64[ns]'))
da.to_netcdf('/tmp/foo.nc')
xr.load_dataarray('/tmp/foo.nc', decode_timedelta=True)
```

Output:

---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/file_manager.py:211, in CachingFileManager._acquire_with_cache_info(self, needs_lock)
210 try:
--> 211 file = self._cache[self._key]
212 except KeyError:
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/lru_cache.py:56, in LRUCache.__getitem__(self, key)
55 with self._lock:
---> 56 value = self._cache[key]
57 self._cache.move_to_end(key)
KeyError: [<function _open_scipy_netcdf at 0x7f133492a7a0>, ('/tmp/foo.nc',), 'r', (('mmap', None), ('version', 2)), 'bc5300d8-acbf-4f4e-bc0d-a878d7accb71']
During handling of the above exception, another exception occurred:
TypeError Traceback (most recent call last)
File ~/xarray-bug.py:6
4 da = xr.DataArray(np.array([[]], dtype='timedelta64[ns]'))
5 da.to_netcdf('/tmp/foo.nc')
----> 6 xr.load_dataarray('/tmp/foo.nc', decode_timedelta=True)
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/api.py:335, in load_dataarray(filename_or_obj, **kwargs)
332 if "cache" in kwargs:
333 raise TypeError("cache has no effect in this context")
--> 335 with open_dataarray(filename_or_obj, **kwargs) as da:
336 return da.load()
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/api.py:881, in open_dataarray(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
710 def open_dataarray(
711 filename_or_obj: str | os.PathLike[Any] | ReadBuffer | AbstractDataStore,
712 *,
(...) 731 **kwargs,
732 ) -> DataArray:
733 """Open an DataArray from a file or file-like object containing a single
734 data variable.
735
(...) 878 open_dataset
879 """
--> 881 dataset = open_dataset(
882 filename_or_obj,
883 decode_cf=decode_cf,
884 mask_and_scale=mask_and_scale,
885 decode_times=decode_times,
886 concat_characters=concat_characters,
887 decode_coords=decode_coords,
888 engine=engine,
889 chunks=chunks,
890 cache=cache,
891 drop_variables=drop_variables,
892 inline_array=inline_array,
893 chunked_array_type=chunked_array_type,
894 from_array_kwargs=from_array_kwargs,
895 backend_kwargs=backend_kwargs,
896 use_cftime=use_cftime,
897 decode_timedelta=decode_timedelta,
898 **kwargs,
899 )
901 if len(dataset.data_vars) != 1:
902 if len(dataset.data_vars) == 0:
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/api.py:687, in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, inline_array, chunked_array_type, from_array_kwargs, backend_kwargs, **kwargs)
675 decoders = _resolve_decoders_kwargs(
676 decode_cf,
677 open_backend_dataset_parameters=backend.open_dataset_parameters,
(...) 683 decode_coords=decode_coords,
684 )
686 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 687 backend_ds = backend.open_dataset(
688 filename_or_obj,
689 drop_variables=drop_variables,
690 **decoders,
691 **kwargs,
692 )
693 ds = _dataset_from_backend_dataset(
694 backend_ds,
695 filename_or_obj,
(...) 705 **kwargs,
706 )
707 return ds
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/scipy_.py:333, in ScipyBackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, mode, format, group, mmap, lock)
331 store_entrypoint = StoreBackendEntrypoint()
332 with close_on_error(store):
--> 333 ds = store_entrypoint.open_dataset(
334 store,
335 mask_and_scale=mask_and_scale,
336 decode_times=decode_times,
337 concat_characters=concat_characters,
338 decode_coords=decode_coords,
339 drop_variables=drop_variables,
340 use_cftime=use_cftime,
341 decode_timedelta=decode_timedelta,
342 )
343 return ds
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/store.py:44, in StoreBackendEntrypoint.open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta)
30 def open_dataset(
31 self,
32 filename_or_obj: str | os.PathLike[Any] | ReadBuffer | AbstractDataStore,
(...) 40 decode_timedelta=None,
41 ) -> Dataset:
42 assert isinstance(filename_or_obj, AbstractDataStore)
---> 44 vars, attrs = filename_or_obj.load()
45 encoding = filename_or_obj.get_encoding()
47 vars, attrs, coord_names = conventions.decode_cf_variables(
48 vars,
49 attrs,
(...) 56 decode_timedelta=decode_timedelta,
57 )
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/common.py:312, in AbstractDataStore.load(self)
293 def load(self):
294 """
295 This loads the variables and attributes simultaneously.
296 A centralized loading function makes it easier to create
(...) 309 are requested, so care should be taken to make sure its fast.
310 """
311 variables = FrozenDict(
--> 312 (_decode_variable_name(k), v) for k, v in self.get_variables().items()
313 )
314 attributes = FrozenDict(self.get_attrs())
315 return variables, attributes
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/scipy_.py:199, in ScipyDataStore.get_variables(self)
197 def get_variables(self):
198 return FrozenDict(
--> 199 (k, self.open_store_variable(k, v)) for k, v in self.ds.variables.items()
200 )
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/scipy_.py:188, in ScipyDataStore.ds(self)
186 @property
187 def ds(self):
--> 188 return self._manager.acquire()
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/file_manager.py:193, in CachingFileManager.acquire(self, needs_lock)
178 def acquire(self, needs_lock=True):
179 """Acquire a file object from the manager.
180
181 A new file is only opened if it has expired from the
(...) 191 An open file object, as returned by ``opener(*args, **kwargs)``.
192 """
--> 193 file, _ = self._acquire_with_cache_info(needs_lock)
194 return file
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/file_manager.py:217, in CachingFileManager._acquire_with_cache_info(self, needs_lock)
215 kwargs = kwargs.copy()
216 kwargs["mode"] = self._mode
--> 217 file = self._opener(*self._args, **kwargs)
218 if self._mode == "w":
219 # ensure file doesn't get overridden when opened again
220 self._mode = "a"
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/xarray/backends/scipy_.py:127, in _open_scipy_netcdf(filename, mode, mmap, version)
124 filename = io.BytesIO(filename)
126 try:
--> 127 return scipy.io.netcdf_file(filename, mode=mode, mmap=mmap, version=version)
128 except TypeError as e: # netcdf3 message is obscure in this case
129 errmsg = e.args[0]
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/scipy/io/_netcdf.py:279, in netcdf_file.__init__(self, filename, mode, mmap, version, maskandscale)
276 self._attributes = {}
278 if mode in 'ra':
--> 279 self._read()
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/scipy/io/_netcdf.py:611, in netcdf_file._read(self)
609 self._read_dim_array()
610 self._read_gatt_array()
--> 611 self._read_var_array()
File ~/.venv/xarray-bug-fix/lib64/python3.13/site-packages/scipy/io/_netcdf.py:692, in netcdf_file._read_var_array(self)
689 data = None
690 else: # not a record variable
691 # Calculate size to avoid problems with vsize (above)
--> 692 a_size = reduce(mul, shape, 1) * size
693 if self.use_mmap:
694 data = self._mm_buf[begin_:begin_+a_size].view(dtype=dtype_)
TypeError: unsupported operand type(s) for *: 'int' and 'NoneType'
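The final frame multiplies the variable's element count by `size`, scipy's per-item byte size, which evidently came back as `None` for this variable, so the byte-count computation fails before any data is read. A minimal sketch of just that failing expression (the concrete values are assumptions inferred from the traceback, not taken from scipy):

```python
from functools import reduce
from operator import mul

shape = (1, 0)  # hypothetical shape of the (1, 0)-shaped variable
size = None     # scipy's per-item byte size lookup apparently yields None here
try:
    a_size = reduce(mul, shape, 1) * size  # mirrors the expression in scipy's _read_var_array
except TypeError as e:
    print(e)  # unsupported operand type(s) for *: 'int' and 'NoneType'
```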
Please open a new issue for that. That's a problem with scipy and netCDF3 Classic.
Thanks. For the record, the following tests now pass with the fix.

```python
import numpy as np
import xarray as xr

da1 = xr.DataArray(np.array([], dtype='timedelta64[ns]'))
da2 = xr.DataArray(np.array([[]], dtype='timedelta64[ns]'))
da3 = xr.DataArray(np.empty((0, 0), dtype='timedelta64[ns]'))
filename = '/tmp/foo.nc'
for da in da1, da2, da3:
    for engine in 'netcdf4', 'h5netcdf':
        da.to_netcdf(filename, engine=engine)
        xr.load_dataarray(filename, decode_timedelta=True)
        xr.open_dataarray(filename, decode_timedelta=True).values
```
What happened?
When I write a zero-sized timedelta DataArray to a netCDF file, I can't read it back again.
What did you expect to happen?
I should be able to read back the array that was serialized to disk.
Minimal Complete Verifiable Example
MVCE confirmation
Relevant log output
Anything else we need to know?
No response
Environment
INSTALLED VERSIONS
commit: None
python: 3.13.3 (main, Apr 22 2025, 00:00:00) [GCC 15.0.1 20250418 (Red Hat 15.0.1-0)]
python-bits: 64
OS: Linux
OS-release: 6.14.4-300.fc42.x86_64
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: en_CA.UTF-8
LOCALE: ('en_CA', 'UTF-8')
libhdf5: None
libnetcdf: None
xarray: 2025.4.0
pandas: 2.2.3
numpy: 2.2.5
scipy: 1.15.3
netCDF4: None
pydap: None
h5netcdf: None
h5py: None
zarr: None
cftime: None
nc_time_axis: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: None
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: None
flox: None
numpy_groupies: None
setuptools: None
pip: 24.3.1
conda: None
pytest: None
mypy: None
IPython: 9.2.0
sphinx: None