Spatial subsets may raise out-of-memory error #442

Closed
forman opened this issue Apr 30, 2021 · 1 comment

forman commented Apr 30, 2021

Describe the bug

Core function select_spatial_subset may raise an ArrayMemoryError for large spatial images, e.g. 129600 x 64800 pixels (ESA Land Cover CCI), even though the dataset is chunked:

  File "D:\Projects\xcube\test\core\test_select.py", line 156, in test_xy_bbox_with_large_dataset
    ds_subset = select_spatial_subset(ds, xy_bbox=(0., 0., 5.0, 2.5))
  File "D:\Projects\xcube\xcube\core\select.py", line 99, in select_spatial_subset
    geo_coding = geo_coding if geo_coding is not None else GeoCoding.from_xy((x, y), xy_names=xy_names)
  File "D:\Projects\xcube\xcube\core\geocoding.py", line 172, in from_xy
    x, is_lon_normalized = _maybe_normalise_2d_lon(x)
  File "D:\Projects\xcube\xcube\core\geocoding.py", line 468, in _maybe_normalise_2d_lon
    if _is_crossing_antimeridian(lon_var):
  File "D:\Projects\xcube\xcube\core\geocoding.py", line 463, in _is_crossing_antimeridian
    return abs(lon_var.diff(dim=dim_x)).max() > 180.0 or \
  File "D:\Miniconda3\envs\xcube\lib\site-packages\xarray\core\dataarray.py", line 3107, in diff
    ds = self._to_temp_dataset().diff(n=n, dim=dim, label=label)
  File "D:\Miniconda3\envs\xcube\lib\site-packages\xarray\core\dataset.py", line 5489, in diff
    variables[name] = var.isel(**kwargs_end) - var.isel(**kwargs_start)
  File "D:\Miniconda3\envs\xcube\lib\site-packages\xarray\core\variable.py", line 2301, in func
    f(self_data, other_data)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 62.6 GiB for an array with shape (64800, 129599) and data type float64

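For context, here is a minimal sketch of the kind of dataset that triggers this. It only mimics the ESA CCI Land Cover geometry (a regular global grid of 129600 x 64800 pixels with a dask-chunked data variable); the variable name, chunk sizes, and coordinate construction are illustrative assumptions and not the actual test in test_select.py. Only select_spatial_subset, its import location, and the xy_bbox value are taken from the traceback above.

```python
# Illustrative sketch only; the real test in test_select.py may build the
# dataset differently. It mimics the ESA CCI Land Cover grid shape.
import dask.array as da
import numpy as np
import xarray as xr

from xcube.core.select import select_spatial_subset

width, height = 129600, 64800
res = 360.0 / width  # ~1/360 degree cell size

ds = xr.Dataset(
    {
        # Data variable is dask-chunked, so it never needs to fit in memory.
        "lccs_class": (
            ("lat", "lon"),
            da.zeros((height, width), chunks=(2025, 2025), dtype="uint8"),
        ),
    },
    coords={
        # 1-D cell-center coordinates; tiny in memory despite the huge grid.
        "lon": ("lon", -180.0 + res * (np.arange(width) + 0.5)),
        "lat": ("lat", 90.0 - res * (np.arange(height) + 0.5)),
    },
)

# With xcube 0.8.0 this call may raise numpy's ArrayMemoryError while the
# GeoCoding antimeridian check differences the (broadcast, 2-D) longitudes,
# even though the data variable itself is chunked.
ds_subset = select_spatial_subset(ds, xy_bbox=(0.0, 0.0, 5.0, 2.5))
```
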
Expected behavior

The spatial subset dataset should be created without errors and should be chunked in the same way as the source dataset.
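
Not claiming this is what went into 0.8.1, but for illustration, one way the check from the traceback could avoid allocating the full 62.6 GiB difference array is to make sure the 2-D longitudes are dask-backed and only compute scalar reductions. The function name below mirrors _is_crossing_antimeridian from the traceback, but the signature is simplified (dim names passed in explicitly) and the body is a sketch, not the actual xcube implementation.

```python
# Sketch under assumptions; not the actual xcube 0.8.1 change.
import dask.array as da
import xarray as xr


def _is_crossing_antimeridian(lon_var: xr.DataArray, dim_x: str, dim_y: str) -> bool:
    # Ensure the longitude array is dask-backed so the differences below are
    # evaluated block-wise rather than as one huge float64 numpy array.
    if not isinstance(lon_var.data, da.Array):
        lon_var = lon_var.chunk({dim_y: 2048, dim_x: 2048})
    # Build lazy scalar reductions first, then compute only the scalars.
    max_dx = abs(lon_var.diff(dim=dim_x)).max()
    max_dy = abs(lon_var.diff(dim=dim_y)).max()
    return bool(max_dx.compute() > 180.0) or bool(max_dy.compute() > 180.0)
```
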

Additional context

xcube 0.8.0

@forman forman added the bug Something isn't working label Apr 30, 2021
@forman forman self-assigned this Apr 30, 2021
@forman forman added this to the 0.8.1 milestone May 4, 2021

forman commented May 4, 2021

Should be fixed in 0.8.1.

@forman forman closed this as completed May 4, 2021