
satpy_cf_nc reader cannot read FCI file written with cf writer #2680

Closed
gerritholl opened this issue Dec 12, 2023 · 13 comments · Fixed by #2688

@gerritholl (Collaborator)
Describe the bug

Loading FCI data, writing it to a NetCDF file with the cf writer, and then trying to read that file with the satpy_cf_nc reader results in a ValueError.

To Reproduce

import hdf5plugin
from satpy import Scene
from glob import glob
from satpy.utils import debug_on; debug_on()
filenames = glob("/media/nas/x23352/MTG/FCI/L1c/2023/12/04/W_XX-EUMETSAT-Darmstadt,IMG+SAT,MTI1+FCI-1C-RRAD-FDHSI-FD--CHK-BODY--DIS-NC4E_C_EUMT_20231204*_IDPFI_OPE_20231204*_20231204*_N_JLS_C_0050_*.nc")
sc = Scene(filenames=filenames, reader="fci_l1c_nc")
sc.load(["vis_06"], calibration="counts")
sc.save_dataset(
        "vis_06",
        "/media/nas/x21308/scratch/FCI/{platform_name}-{sensor}-{start_time:%Y%m%d%H%M%S}-{end_time:%Y%m%d%H%M%S}.nc",
        encoding={"vis_06": {"dtype": "int16", "zlib": True}},
        include_lonlats=False)
sc2 = Scene(filenames=["/media/nas/x21308/scratch/FCI/MTG-I1-fci-20231204081000-20231204082000.nc"], reader=["satpy_cf_nc"])
sc2.load(["vis_06"])

Expected behavior

I expect that sc2 loads the dataset correctly.

Actual results

I've skipped the verbose reading part in the log:

[DEBUG: 2023-12-12 11:45:38 : satpy.readers.yaml_reader] Requested orientation for Dataset None is 'native' (default). No flipping is applied.
[DEBUG: 2023-12-12 11:45:38 : satpy.writers] Reading ['/data/gholl/checkouts/satpy/satpy/etc/writers/cf.yaml']
[INFO: 2023-12-12 11:45:38 : satpy.writers.cf_writer] Saving datasets to NetCDF4/CF.
/data/gholl/checkouts/satpy/satpy/cf/attrs.py:212: DeprecationWarning: datetime.datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.datetime.now(datetime.UTC).
  _history_create = "Created by pytroll/satpy on {}".format(datetime.datetime.utcnow())
/data/gholl/checkouts/satpy/satpy/writers/cf_writer.py:274: UserWarning: dtype uint16 not compatible with CF-1.7.
  grouped_datasets, header_attrs = collect_cf_datasets(list_dataarrays=datasets,  # list of xr.DataArray
/data/gholl/mambaforge/envs/py312/lib/python3.12/site-packages/xarray/backends/netCDF4_.py:510: DeprecationWarning: NumPy will stop allowing conversion of out-of-bound Python integers to integer arrays.  The conversion of 65535 to int16 will fail in the future.
For the old behavior, usually:
    np.array(value).astype(dtype)
will give the desired result (the cast overflows).
  nc4_var = self.ds.createVariable(
[DEBUG: 2023-12-12 11:45:51 : satpy.readers.yaml_reader] Reading ('/data/gholl/checkouts/satpy/satpy/etc/readers/satpy_cf_nc.yaml',)
[DEBUG: 2023-12-12 11:45:51 : satpy.readers.yaml_reader] Assigning to satpy_cf_nc: ['/media/nas/x21308/scratch/FCI/MTG-I1-fci-20231204081000-20231204082000.nc']
/data/gholl/checkouts/satpy/satpy/readers/satpy_cf_nc.py:240: DeprecationWarning: The truth value of an empty array is ambiguous. Returning False, but in future this will result in an error. Use `array.size > 0` to check that an array is not empty.
  if "modifiers" in ds_info and not ds_info["modifiers"]:
[DEBUG: 2023-12-12 11:45:51 : satpy.readers.satpy_cf_nc] Getting data for: vis_06
/data/gholl/mambaforge/envs/py312/lib/python3.12/site-packages/xarray/core/dataset.py:278: UserWarning: The specified chunks separate the stored chunks along dimension "y" starting at index 4096. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
/data/gholl/mambaforge/envs/py312/lib/python3.12/site-packages/xarray/core/dataset.py:278: UserWarning: The specified chunks separate the stored chunks along dimension "x" starting at index 4096. This could degrade performance. Instead, consider rechunking after loading.
  warnings.warn(
[ERROR: 2023-12-12 11:45:51 : satpy.readers.yaml_reader] Could not load dataset 'DataID(name='vis_06', wavelength=WavelengthRange(min=0.59, central=0.64, max=0.69, unit='µm'), resolution=1000, calibration=<5>, modifiers=())': dictionary update sequence element #0 has length 1; 2 is required
Traceback (most recent call last):
  File "/data/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 823, in _load_dataset_with_area
    ds = self._load_dataset_data(file_handlers, dsid, **kwargs)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 723, in _load_dataset_data
    proj = self._load_dataset(dsid, ds_info, file_handlers, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/gholl/checkouts/satpy/satpy/readers/yaml_reader.py", line 715, in _load_dataset
    combined_info = file_handlers[0].combine_info(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/gholl/checkouts/satpy/satpy/readers/file_handlers.py", line 118, in combine_info
    new_dict.update(self._combine_time_parameters(all_infos))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/gholl/checkouts/satpy/satpy/readers/file_handlers.py", line 155, in _combine_time_parameters
    time_params_comb.update(d)
ValueError: dictionary update sequence element #0 has length 1; 2 is required
[WARNING: 2023-12-12 11:45:51 : satpy.scene] The following datasets were not created and may require resampling to be generated: DataID(name='vis_06', wavelength=WavelengthRange(min=0.59, central=0.64, max=0.69, unit='µm'), resolution=1000, calibration=<5>, modifiers=())
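The traceback points at the root cause: the cf writer serializes the `time_parameters` dict to a string attribute, and `_combine_time_parameters` then passes that string to `dict.update()`. A string iterates character by character, so the first "sequence element" is a single character rather than a key/value pair. A minimal sketch of the failure mode, independent of Satpy:

```python
# dict.update() accepts an iterable of (key, value) pairs:
combined = {}
combined.update([("observation_start_time", "2023-12-04T08:10:00")])

# A string-encoded dict, however, iterates character by character,
# so the first "element" is the single character "{" (length 1):
try:
    combined.update("{'observation_start_time': '2023-12-04T08:10:00'}")
except ValueError as err:
    print(err)  # dictionary update sequence element #0 has length 1; 2 is required
```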

Environment Info:

  • OS: openSUSE 15.3
  • Satpy Version: v0.45.0-46-g338817df8
@gerritholl gerritholl added the bug label Dec 12, 2023
@djhoese (Member) commented Dec 12, 2023

I don't have time to find it now but I could have sworn someone (@sfinkens) has a PR to fix this already. I don't think it is even specific to FCI, but rather any data with time_parameters.

@sfinkens (Member)
Doesn't ring a bell 🤔

@sfinkens (Member)
So the CF reader needs to convert the string-encoded time_parameters back to a dict, like this: https://github.com/pytroll/satpy/blob/main/satpy/readers/satpy_cf_nc.py#L314
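For illustration, such a conversion could look roughly like the sketch below. The helper name is hypothetical (the actual fix landed in #2688), and it assumes the writer stored the dict via `repr()` with ISO-formatted timestamps:

```python
import ast
import datetime as dt

def decode_time_parameters(value):
    """Hypothetical helper: turn a string-encoded time_parameters
    attribute back into a dict of datetime objects.

    Assumes the string is a repr() of a dict mapping names to
    ISO-formatted timestamps, e.g.
    "{'observation_start_time': '2023-12-04T08:10:00'}".
    """
    if isinstance(value, str):
        # Safely parse the Python-literal string back into a dict.
        value = ast.literal_eval(value)
    return {key: dt.datetime.fromisoformat(ts) for key, ts in value.items()}

params = decode_time_parameters(
    "{'observation_start_time': '2023-12-04T08:10:00', "
    "'observation_end_time': '2023-12-04T08:20:00'}")
print(params["observation_start_time"])  # 2023-12-04 08:10:00
```

With the attribute decoded to a real dict, `dict.update()` in `_combine_time_parameters` receives key/value pairs as it expects.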

@sfinkens (Member)
I'll make a PR

@djhoese (Member) commented Dec 14, 2023

> Doesn't ring a bell 🤔

That seems to be the case 90% of the time I think you've participated in something in the past. 🤦‍♂️ Sorry.

@sfinkens (Member)
@djhoese Well, that just hurts. I'm aware that humans, including me, keep forgetting things, so I do my best to write good PR descriptions and document issues so that we can look things up in the future. Also, I wonder what data your "90%" estimate is based on...

@mraspaud (Member)
> > Doesn't ring a bell 🤔
>
> That seems to be the case 90% of the time I think you've participated in something in the past. 🤦‍♂️ Sorry.

I think this comment was unnecessary, especially towards @sfinkens, who always takes responsibility and fixes anything he might have broken.

@sfinkens (Member) commented Dec 18, 2023

Just noticed that the same problem has also been reported in #2609

@gerritholl (Collaborator, Author)
> Just noticed that the same problem has also been reported in #2609

My apologies! I missed that one. Closing as duplicate.

@gerritholl closed this as not planned (duplicate) Dec 18, 2023
@djhoese (Member) commented Dec 18, 2023

Oh no. I'm so sorry. This was taken completely the opposite way from how I meant it. Oh my gosh. I'm so sorry. I meant that I assume you've done everything because you've done so much, not that you don't do enough. Lately, whenever I think "who was it that worked on this issue again?", I think of you and tag you, only to be reminded that you didn't work on this particular problem. It has happened at least 3 times in the last 6 months. I keep doing this because you do so much, not the opposite.

I'm surprised that you both would think I'd say something so hurtful.

@djhoese (Member) commented Dec 18, 2023

To be clear, my "90%" was referring to how often I make a mistake. My whole comment was self-deprecation and was not trying to say anything bad about you.

@sfinkens (Member)
Ooooh, what a misunderstanding 🙈 Now that you've explained it, it's actually pretty clear and I wonder why I read it differently... Thanks for the clarification. All good 🤗

> I'm surprised that you both would think I'd say something so hurtful.

Probably because I feel bad since I would like to work more on Pytroll but don't have enough time. Sorry for not thinking twice what you might have meant.

@sfinkens (Member)
And please don't hesitate to ping me on anything :)
