Error when creating forcing using datetime noleap type #138

Closed · Jaapel opened this issue Dec 19, 2022 · 0 comments · Fixed by #142
Assignees: hboisgon
Labels: bug (Something isn't working) · readers / writers (issue linked to readers/writters)
Milestone: Release 0.3.0

Comments

Jaapel commented Dec 19, 2022

Hi all!

While preparing a forcing from a data source that uses a noleap calendar, and with pandas > 1.4 (1.5.2 in my case), I encounter the following error:

Traceback (most recent call last):
  File "/root/work/.snakemake/scripts/tmp9je2e4p2.downscale_climate_forcing.py", line 97, in <module>
    mod.update(opt=update_options)
  File "/opt/conda/lib/python3.10/site-packages/hydromt/models/model_api.py", line 291, in update
    self._run_log_method(method, **opt[method])
  File "/opt/conda/lib/python3.10/site-packages/hydromt/models/model_api.py", line 153, in _run_log_method
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/hydromt_wflow/wflow.py", line 1982, in write_forcing
    ds = ds.sel({"time": slice(start, end)})
  File "/opt/conda/lib/python3.10/site-packages/xarray/core/dataset.py", line 2565, in sel
    query_results = map_index_queries(
  File "/opt/conda/lib/python3.10/site-packages/xarray/core/indexing.py", line 183, in map_index_queries
    results.append(index.sel(labels, **options))
  File "/opt/conda/lib/python3.10/site-packages/xarray/core/indexes.py", line 442, in sel
    indexer = _query_slice(self.index, label, coord_name, method, tolerance)
  File "/opt/conda/lib/python3.10/site-packages/xarray/core/indexes.py", line 207, in _query_slice
    indexer = index.slice_indexer(
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/indexes/base.py", line 6602, in slice_indexer
    start_slice, end_slice = self.slice_locs(start, end, step=step)
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/indexes/base.py", line 6810, in slice_locs
    start_slice = self.get_slice_bound(start, "left")
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/indexes/base.py", line 6726, in get_slice_bound
    return self._searchsorted_monotonic(label, side)
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/indexes/base.py", line 6672, in _searchsorted_monotonic
    return self.searchsorted(label, side=side)
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/base.py", line 1298, in searchsorted
    return algorithms.searchsorted(
  File "/opt/conda/lib/python3.10/site-packages/pandas/core/algorithms.py", line 1665, in searchsorted
    return arr.searchsorted(value, side=side, sorter=sorter)  # type: ignore[arg-type]
  File "src/cftime/_cftime.pyx", line 1432, in cftime._cftime.datetime.__richcmp__
TypeError: cannot compare cftime.DatetimeNoLeap(2039, 1, 2, 0, 1, 0, 0, has_year_zero=True) and Timestamp('2030-01-01 00:00:00') (different calendars)

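For completeness, here is a minimal sketch (simplified, not my actual workflow) that should trigger the same comparison error: a dataset on a noleap calendar sliced with plain pandas Timestamps, as write_forcing does via ds.sel.

    # Minimal sketch (assumed setup, not the actual hydromt_wflow code):
    # a noleap-calendar time axis sliced with standard-calendar Timestamps.
    import numpy as np
    import pandas as pd
    import xarray as xr

    times = xr.cftime_range("2030-01-01", periods=10, freq="D", calendar="noleap")
    ds = xr.Dataset({"precip": ("time", np.arange(10))}, coords={"time": times})

    # With pandas >= 1.5 this raises the TypeError shown above:
    # cftime.DatetimeNoLeap vs. Timestamp (different calendars).
    ds.sel(time=slice(pd.Timestamp("2030-01-01"), pd.Timestamp("2030-01-05")))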
This is likely due to pandas becoming more restrictive about comparing datetimes from different calendars, so the noleap cftime values in the dataset can no longer be compared against the standard-calendar Timestamps used for the slice.
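As a workaround on my side (just a sketch, not necessarily how this should be fixed in hydromt_wflow), building the slice bounds in the same calendar as the time index avoids the mismatch:

    # Sketch of a possible workaround (assumes "time" is backed by a CFTimeIndex):
    # construct the slice bounds in the dataset's own calendar instead of Timestamps.
    import cftime

    calendar = ds.indexes["time"].calendar  # "noleap" here
    start = cftime.datetime(2030, 1, 1, calendar=calendar)
    end = cftime.datetime(2030, 1, 5, calendar=calendar)
    ds.sel(time=slice(start, end))

    # Slicing with plain ISO date strings also works, since the CFTimeIndex
    # parses them into its own calendar:
    ds.sel(time=slice("2030-01-01", "2030-01-05"))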

Thanks for all the effort!
Jaap

@Jaapel Jaapel added the bug Something isn't working label Dec 19, 2022
@hboisgon hboisgon added the readers / writers issue linked to readers/writters label Dec 19, 2022
@hboisgon hboisgon linked a pull request Jan 10, 2023 that will close this issue
@hboisgon hboisgon added this to the Release 0.3.0 milestone Mar 1, 2023
@hboisgon hboisgon self-assigned this Mar 14, 2023
laurenebouaziz added a commit that referenced this issue Mar 23, 2023