
Mitiff writer is broken #369

Closed
pnuu opened this issue Jul 16, 2018 · 17 comments

Comments

pnuu (Member) commented Jul 16, 2018

Describe the bug

Just noticed while writing unit tests for my PR (#363) that the mitiff writer doesn't work at the moment.

To Reproduce

scn.save_dataset(composite_name, writer='mitiff', filename='/tmp/mitiff.tif')

Expected behavior
A file mitiff.tif should be saved in the /tmp/ directory.

Actual results

scn.save_dataset('natural', writer='mitiff', filename='/tmp/mitiff.tif')
[DEBUG: 2018-07-16 12:29:43 : satpy.writers] Reading ['/home/lahtinep/Software/miniconda3/lib/python3.6/site-packages/satpy-0.9.1a0.dev0-py3.6.egg/satpy/etc/writers/mitiff.yaml']
[DEBUG: 2018-07-16 12:29:43 : satpy.writers.mitiff] Starting in mitiff save_dataset ... 
[WARNING: 2018-07-16 12:29:43 : satpy.writers.mitiff] Unset save_dir. Use: ./
/home/lahtinep/Software/miniconda3/lib/python3.6/site-packages/pyresample/kd_tree.py:924: RuntimeWarning: invalid value encountered in sqrt
  mask=mask)
[DEBUG: 2018-07-16 12:29:45 : satpy.writers.mitiff] create_opts: ./
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-48-762780f9f91a> in <module>()
----> 1 lcl.save_dataset('natural', writer='mitiff', filename='/tmp/mitiff.tif')

~/Software/miniconda3/lib/python3.6/site-packages/satpy-0.9.1a0.dev0-py3.6.egg/satpy/scene.py in save_dataset(self, dataset_id, filename, writer, overlay, compute, **kwargs)
   1008         return writer.save_dataset(self[dataset_id], filename=filename,
   1009                                    overlay=overlay, compute=compute,
-> 1010                                    **save_kwargs)
   1011 
   1012     def save_datasets(self, writer="geotiff", datasets=None, compute=True,

~/Software/miniconda3/lib/python3.6/site-packages/satpy-0.9.1a0.dev0-py3.6.egg/satpy/writers/mitiff.py in save_dataset(self, dataset, filename, fill_value, compute, base_dir, **kwargs)
    116 
    117         if compute:
--> 118             return delayed.compute()
    119         return delayed
    120 

~/Software/miniconda3/lib/python3.6/site-packages/dask/base.py in compute(self, **kwargs)
    154         dask.base.compute
    155         """
--> 156         (result,) = compute(self, traverse=False, **kwargs)
    157         return result
    158 

~/Software/miniconda3/lib/python3.6/site-packages/dask/base.py in compute(*args, **kwargs)
    400     keys = [x.__dask_keys__() for x in collections]
    401     postcomputes = [x.__dask_postcompute__() for x in collections]
--> 402     results = schedule(dsk, keys, **kwargs)
    403     return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    404 

~/Software/miniconda3/lib/python3.6/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, **kwargs)
     73     results = get_async(pool.apply_async, len(pool._pool), dsk, result,
     74                         cache=cache, get_id=_thread_get_id,
---> 75                         pack_exception=pack_exception, **kwargs)
     76 
     77     # Cleanup pools associated to dead threads

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
    519                         _execute_task(task, data)  # Re-execute locally
    520                     else:
--> 521                         raise_exception(exc, tb)
    522                 res, worker_id = loads(res_info)
    523                 state['cache'][key] = res

~/Software/miniconda3/lib/python3.6/site-packages/dask/compatibility.py in reraise(exc, tb)
     67         if exc.__traceback__ is not tb:
     68             raise exc.with_traceback(tb)
---> 69         raise exc
     70 
     71 else:

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
    288     try:
    289         task, data = loads(task_info)
--> 290         result = _execute_task(task, data)
    291         id = get_id()
    292         result = dumps((result, id))

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in _execute_task(arg, cache, dsk)
    269         func, args = arg[0], arg[1:]
    270         args2 = [_execute_task(a, cache) for a in args]
--> 271         return func(*args2)
    272     elif not ishashable(arg):
    273         return arg

~/Software/miniconda3/lib/python3.6/site-packages/satpy-0.9.1a0.dev0-py3.6.egg/satpy/writers/mitiff.py in _delayed_create(create_opts, dataset)
     76                     kwargs['name'] = dataset.attrs['name']
     77                 if 'start_time' not in kwargs:
---> 78                     kwargs['start_time'] = dataset.attrs['start_time']
     79                 if 'sensor' not in kwargs:
     80                     kwargs['sensor'] = dataset.attrs['sensor']

KeyError: 'start_time'
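For context, the crash comes from indexing dataset.attrs directly inside a delayed dask task, so a missing key surfaces as a bare KeyError far from the call site. Below is a minimal sketch of a defensive lookup that fails early with a clear message; fill_required_metadata is a hypothetical helper for illustration, not the actual satpy code.

```python
from datetime import datetime

def fill_required_metadata(attrs, kwargs):
    # Hypothetical helper (not the real mitiff writer): copy the
    # metadata the writer needs from dataset.attrs into kwargs, and
    # raise a descriptive error up front instead of a bare KeyError
    # deep inside a delayed task.
    for key in ('name', 'start_time', 'sensor'):
        if key not in kwargs:
            if key not in attrs:
                raise ValueError(
                    "mitiff writer needs %r in dataset.attrs" % key)
            kwargs[key] = attrs[key]
    return kwargs

# Complete attrs pass through unchanged:
attrs = {'name': 'natural', 'sensor': 'seviri',
         'start_time': datetime(2018, 7, 16, 12, 29)}
print(fill_required_metadata(attrs, {})['name'])

# Missing 'start_time' now fails with an explanation:
try:
    fill_required_metadata({'name': 'natural', 'sensor': 'seviri'}, {})
except ValueError as err:
    print(err)
```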

Environment Info:

  • OS: Linux
  • SatPy Version: 0.9.1a0.dev0
  • Python 3.6.4 (Conda 4.5.8)
pnuu (Member, Author) commented Jul 30, 2018

@TAlonglong you might want to take a look at this, too.

TAlonglong (Collaborator) commented:

@pnuu ah, ok. I had assumed filename is just a filename without a directory, and that the directory is set by save_dir. It seemed to work for me when I submitted the PR.

I will try to have a look

djhoese (Member) commented Aug 4, 2018

@TAlonglong To match other writers this should use base_dir. Also, you may want to merge with master, where I recently fixed a bug for passing arguments to writers.

djhoese (Member) commented Oct 29, 2018

@pnuu @TAlonglong can you double-check this with current master? I think that with some edits I've made and some fixes from @TAlonglong this can be closed now. I also added better handling for base_dir and the output filename pattern in #478.

pnuu (Member, Author) commented Oct 30, 2018

Still fails, with another error. I used Meteosat data to test. Also, it seems that the directory I was going to save to was discarded from the filename.

[DEBUG: 2018-10-30 07:05:50 : satpy.writers] Reading ['/home/lahtinep/Software/pytroll/packages/satpy/satpy/etc/writers/mitiff.yaml']
[DEBUG: 2018-10-30 07:05:51 : satpy.writers.mitiff] Starting in mitiff save_dataset ... 
[WARNING: 2018-10-30 07:05:51 : satpy.writers.mitiff] Unset save_dir. Use: ./
/home/lahtinep/Software/pytroll/packages/pyresample/pyresample/kd_tree.py:924: RuntimeWarning: invalid value encountered in sqrt
  mask=mask)
[DEBUG: 2018-10-30 07:05:52 : satpy.writers.mitiff] create_opts: ./
[WARNING: 2018-10-30 07:05:52 : satpy.writers.mitiff] Something went wrong with assigning to various dicts: 'metadata_requirements'
[WARNING: 2018-10-30 07:05:52 : satpy.writers.mitiff] Something went wrong with assigning to translate: 'metadata_requirements'
[DEBUG: 2018-10-30 07:05:52 : satpy.writers.mitiff] earliest start_time: 2018-04-11 11:30:10.927000
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-11-6639b069cb8b> in <module>()
----> 1 lcl.save_dataset(10.8, writer='mitiff', filename='/tmp/mitiff.tif')

~/Software/pytroll/packages/satpy/satpy/scene.py in save_dataset(self, dataset_id, filename, writer, overlay, compute, **kwargs)
   1052         return writer.save_dataset(self[dataset_id],
   1053                                    overlay=overlay, compute=compute,
-> 1054                                    **save_kwargs)
   1055 
   1056     def save_datasets(self, writer="geotiff", datasets=None, compute=True,

~/Software/pytroll/packages/satpy/satpy/writers/mitiff.py in save_dataset(self, dataset, filename, fill_value, compute, base_dir, **kwargs)
    116 
    117         if compute:
--> 118             return delayed.compute()
    119         return delayed
    120 

~/Software/miniconda3/lib/python3.6/site-packages/dask/base.py in compute(self, **kwargs)
    154         dask.base.compute
    155         """
--> 156         (result,) = compute(self, traverse=False, **kwargs)
    157         return result
    158 

~/Software/miniconda3/lib/python3.6/site-packages/dask/base.py in compute(*args, **kwargs)
    400     keys = [x.__dask_keys__() for x in collections]
    401     postcomputes = [x.__dask_postcompute__() for x in collections]
--> 402     results = schedule(dsk, keys, **kwargs)
    403     return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
    404 

~/Software/miniconda3/lib/python3.6/site-packages/dask/threaded.py in get(dsk, result, cache, num_workers, **kwargs)
     73     results = get_async(pool.apply_async, len(pool._pool), dsk, result,
     74                         cache=cache, get_id=_thread_get_id,
---> 75                         pack_exception=pack_exception, **kwargs)
     76 
     77     # Cleanup pools associated to dead threads

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in get_async(apply_async, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, **kwargs)
    503                         _execute_task(task, data)  # Re-execute locally
    504                     else:
--> 505                         raise_exception(exc, tb)
    506                 res, worker_id = loads(res_info)
    507                 state['cache'][key] = res

~/Software/miniconda3/lib/python3.6/site-packages/dask/compatibility.py in reraise(exc, tb)
     67         if exc.__traceback__ is not tb:
     68             raise exc.with_traceback(tb)
---> 69         raise exc
     70 
     71 else:

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in execute_task(key, task_info, dumps, loads, get_id, pack_exception)
    272     try:
    273         task, data = loads(task_info)
--> 274         result = _execute_task(task, data)
    275         id = get_id()
    276         result = dumps((result, id))

~/Software/miniconda3/lib/python3.6/site-packages/dask/local.py in _execute_task(arg, cache, dsk)
    253         func, args = arg[0], arg[1:]
    254         args2 = [_execute_task(a, cache) for a in args]
--> 255         return func(*args2)
    256     elif not ishashable(arg):
    257         return arg

~/Software/pytroll/packages/satpy/satpy/writers/mitiff.py in _delayed_create(create_opts, dataset)
     93                     LOG.warning("Something went wrong with assigning to translate: %s", ke)
     94 
---> 95                 image_description = self._make_image_description(dataset, **kwargs)
     96                 LOG.debug("File pattern %s", self.file_pattern)
     97                 self.filename_parser = self.create_filename_parser(create_opts)

~/Software/pytroll/packages/satpy/satpy/writers/mitiff.py in _make_image_description(self, datasets, **kwargs)
    538             _image_description += str(len(datasets))
    539         else:
--> 540             LOG.debug("len datasets: %s", datasets.sizes['bands'])
    541             _image_description += str(datasets.sizes['bands'])
    542 

~/Software/miniconda3/lib/python3.6/site-packages/xarray/core/utils.py in __getitem__(self, key)
    308 
    309     def __getitem__(self, key):
--> 310         return self.mapping[key]
    311 
    312     def __iter__(self):

KeyError: 'bands'

TAlonglong (Collaborator) commented:

Sorry for this. I have not tested with Meteosat data.

I will try to look into this later.

pnuu (Member, Author) commented Oct 30, 2018

Tested with the 'natural_color' composite, and that works. So it seems that single-channel datasets don't have the 'bands' information available.
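That matches the traceback above: an RGB composite carries a 'bands' dimension, while a single channel typically has only (y, x), so datasets.sizes['bands'] raises KeyError. A minimal sketch of the fallback, assuming sizes behaves like a plain mapping (as xarray's DataArray.sizes does); count_bands is a hypothetical name, not the real fix:

```python
def count_bands(sizes):
    # Hypothetical sketch: a single-channel dataset has no 'bands'
    # dimension, so treat it as one band instead of indexing
    # sizes['bands'] unconditionally.
    return sizes.get('bands', 1)

# An RGB composite has a 'bands' dimension of length 3 ...
print(count_bands({'bands': 3, 'y': 1024, 'x': 1024}))
# ... while a plain channel like 10.8 has only (y, x) dimensions:
print(count_bands({'y': 1024, 'x': 1024}))
```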

TAlonglong (Collaborator) commented:

@pnuu can I ask which composite you used to get the error mentioned above?

pnuu (Member, Author) commented Oct 30, 2018

I just loaded channel 10.8 and tried to save that. So, yeah, it wasn't actually a composite.

TAlonglong (Collaborator) commented:

@pnuu I think I have solved this ('bands') problem now.

The next issue is 'base_dir', @djhoese. As far as I can see, 'base_dir' is not propagated into save_dataset in the mitiff writer. If I call it like this: ql_re.save_dataset(composite_name, output_filename, writer='mitiff', base_dir='/data/pytroll/testdata'), should base_dir be visible in save_dataset in the mitiff writer?

djhoese (Member) commented Oct 30, 2018

@TAlonglong You should be able to do filename = filename or self.get_filename(**data.attrs) which I think is how it is currently done. I think I updated the writer at one point to do this.
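As a sketch of that fallback pattern: only the filename = filename or self.get_filename(...) line mirrors the suggestion; the class, method names, and pattern string below are made up for illustration and are not the satpy Writer API.

```python
from datetime import datetime

class WriterSketch:
    # Toy stand-in for a writer, just to illustrate the filename
    # fallback described above.
    def __init__(self, file_pattern):
        self.file_pattern = file_pattern

    def get_filename(self, **attrs):
        # Build a filename from the pattern and the dataset attrs.
        return self.file_pattern.format(**attrs)

    def save_dataset(self, attrs, filename=None):
        # An explicit filename wins; otherwise derive one from attrs.
        filename = filename or self.get_filename(**attrs)
        return filename

w = WriterSketch('{name}_{start_time:%Y%m%d_%H%M}.mitiff')
attrs = {'name': 'natural', 'start_time': datetime(2018, 10, 30, 7, 5)}
print(w.save_dataset(attrs))                     # derived from attrs
print(w.save_dataset(attrs, '/tmp/mitiff.tif'))  # explicit filename wins
```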

djhoese (Member) commented Oct 30, 2018

@pnuu When you say that the directory you were going to save to was discarded, do you mean base_dir?

pnuu (Member, Author) commented Oct 30, 2018

I just saw this [WARNING: 2018-10-30 07:05:51 : satpy.writers.mitiff] Unset save_dir. Use: ./ in my failed 10.8 test. But in the natural_color test the image was saved where it was intended, so I don't really know what that warning means.

TAlonglong (Collaborator) commented:

@pnuu, this warning is removed now. Sorry for the confusion.

djhoese (Member) commented Oct 30, 2018

So @TAlonglong you need to make a PR for the original issue and then we can close this?

TAlonglong (Collaborator) commented:

@djhoese hm, how do I create a PR for this issue? Do I create a normal PR and somehow refer to this issue?

djhoese (Member) commented Oct 30, 2018

Yes, the latter. You create a PR and say "Fixes #XXX" where XXX is the number of this issue. The default PR template already has this as one of the check boxes.
