Merge pull request #404 from tomasstolker/ifs_data
Compatibility of dataio and FitsReadingModule with IFS data
Tomas Stolker committed Feb 27, 2020
2 parents a7f7ebe + d951c58 commit 6be597d
Showing 19 changed files with 1,008 additions and 687 deletions.
9 changes: 8 additions & 1 deletion .readthedocs.yml
@@ -1,5 +1,12 @@
version: 2

sphinx:
configuration: docs/conf.py

build:
image: latest

python:
version: 3.6
version: 3.7
install:
- requirements: requirements.txt
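Assembled from the hunk above, the updated `.readthedocs.yml` after this commit should read as follows (a reconstruction from the diff, not a copy of the file):

```yaml
version: 2

sphinx:
  configuration: docs/conf.py

build:
  image: latest

python:
  version: 3.7
  install:
    - requirements: requirements.txt
```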
11 changes: 6 additions & 5 deletions docs/overview.rst
@@ -15,8 +15,9 @@ Reading Modules

* :class:`~pynpoint.readwrite.fitsreading.FitsReadingModule`: Import FITS files and relevant header information into the database.
* :class:`~pynpoint.readwrite.hdf5reading.Hdf5ReadingModule`: Import datasets and attributes from an HDF5 file (as created by PynPoint).
* :class:`~pynpoint.readwrite.textreading.ParangReadingModule`: Import a list of parallactic angles as dataset attribute.
* :class:`~pynpoint.readwrite.textreading.AttributeReadingModule`: Import a list of values as dataset attribute.
* :class:`~pynpoint.readwrite.attr_reading.AttributeReadingModule`: Import a list of values as dataset attribute.
* :class:`~pynpoint.readwrite.attr_reading.ParangReadingModule`: Import a list of parallactic angles as dataset attribute.
* :class:`~pynpoint.readwrite.attr_reading.WavelengthReadingModule`: Import a list of calibrated wavelengths as dataset attribute.
* :class:`~pynpoint.readwrite.nearreading.NearReadingModule` (CPU): Import VLT/VISIR data for the NEAR experiment.

.. _writemodule:
@@ -27,8 +28,8 @@ Writing Modules
* :class:`~pynpoint.readwrite.fitswriting.FitsWritingModule`: Export a dataset from the database to a FITS file.
* :class:`~pynpoint.readwrite.hdf5writing.Hdf5WritingModule`: Export part of the database to a new HDF5 file.
* :class:`~pynpoint.readwrite.textwriting.TextWritingModule`: Export a dataset to an ASCII file.
* :class:`~pynpoint.readwrite.textwriting.ParangWritingModule`: Export the parallactic angles of a dataset to an ASCII file.
* :class:`~pynpoint.readwrite.textwriting.AttributeWritingModule`: Export a list of attribute values to an ASCII file.
* :class:`~pynpoint.readwrite.attr_writing.AttributeWritingModule`: Export a list of attribute values to an ASCII file.
* :class:`~pynpoint.readwrite.attr_writing.ParangWritingModule`: Export the parallactic angles of a dataset to an ASCII file.

.. _procmodule:

@@ -158,4 +159,4 @@ Stacking
* :class:`~pynpoint.processing.stacksubset.CombineTagsModule`: Combine multiple database tags into a single dataset.

.. note::
The pipeline modules with multiprocessing functionalities are indicated with "CPU" in parentheses. The number of parallel processes can be set with the ``CPU`` parameter in the central configuration file and the number of images that is simultaneously loaded into the memory with the ``MEMORY`` parameter. Pipeline modules that apply (in parallel) a function to subsets of images use a number of images per subset equal to ``MEMORY`` divided by ``CPU``.
The pipeline modules with multiprocessing functionalities are indicated with "CPU" in parentheses. The number of parallel processes can be set with the ``CPU`` parameter in the central configuration file and the number of images that is simultaneously loaded into the memory with the ``MEMORY`` parameter. Pipeline modules that apply (in parallel) a function to subsets of images use a number of images per subset equal to ``MEMORY`` divided by ``CPU``.
24 changes: 16 additions & 8 deletions docs/pynpoint.readwrite.rst
@@ -4,6 +4,22 @@ pynpoint.readwrite package
Submodules
----------

pynpoint.readwrite.attr\_reading module
---------------------------------------

.. automodule:: pynpoint.readwrite.attr_reading
:members:
:undoc-members:
:show-inheritance:

pynpoint.readwrite.attr\_writing module
---------------------------------------

.. automodule:: pynpoint.readwrite.attr_writing
:members:
:undoc-members:
:show-inheritance:

pynpoint.readwrite.fitsreading module
-------------------------------------

@@ -44,14 +60,6 @@ pynpoint.readwrite.nearreading module
:undoc-members:
:show-inheritance:

pynpoint.readwrite.textreading module
-------------------------------------

.. automodule:: pynpoint.readwrite.textreading
:members:
:undoc-members:
:show-inheritance:

pynpoint.readwrite.textwriting module
-------------------------------------

14 changes: 8 additions & 6 deletions pynpoint/__init__.py
@@ -78,6 +78,13 @@
WaveletTimeDenoisingModule, \
TimeNormalizationModule

from pynpoint.readwrite.attr_reading import AttributeReadingModule, \
ParangReadingModule, \
WavelengthReadingModule

from pynpoint.readwrite.attr_writing import AttributeWritingModule, \
ParangWritingModule

from pynpoint.readwrite.fitsreading import FitsReadingModule

from pynpoint.readwrite.fitswriting import FitsWritingModule
@@ -86,12 +93,7 @@

from pynpoint.readwrite.hdf5writing import Hdf5WritingModule

from pynpoint.readwrite.textwriting import AttributeWritingModule, \
ParangWritingModule, \
TextWritingModule

from pynpoint.readwrite.textreading import ParangReadingModule, \
AttributeReadingModule
from pynpoint.readwrite.textwriting import TextWritingModule

from pynpoint.readwrite.nearreading import NearReadingModule

4 changes: 4 additions & 0 deletions pynpoint/core/attributes.py
@@ -89,6 +89,10 @@ def get_attributes():
'config': 'header',
'value': 'None',
'type': 'float'},
'WAVELENGTH': {'attribute': 'non-static',
'config': 'header',
'value': 'None',
'type': 'float'},
'STAR_POSITION': {'attribute': 'non-static',
'config': None,
'value': None,
49 changes: 37 additions & 12 deletions pynpoint/core/dataio.py
@@ -689,7 +689,7 @@ def _initialize_database(self,
tag : str
Database tag.
data_dim : int
Number of dimensions. The dimensions of *first_data* is used if set to None.
Number of dimensions. The dimensions of ``first_data`` is used if set to None.
Returns
-------
@@ -698,12 +698,12 @@
"""

def _ndim_check(data_dim, first_dim):
if first_dim > 3 or first_dim < 1:
raise ValueError('Output port can only save numpy arrays from 1D to 3D. Use Port '
if first_dim > 4 or first_dim < 1:
raise ValueError('Output port can only save numpy arrays from 1D to 4D. Use Port '
'attributes to save as int, float, or string.')

if data_dim > 3 or data_dim < 1:
raise ValueError('The data dimensions should be 1D, 2D, or 3D.')
if data_dim > 4 or data_dim < 1:
raise ValueError('The data dimensions should be 1D, 2D, 3D, or 4D.')

if data_dim < first_dim:
raise ValueError('The dimensions of the data should be equal to or larger than the '
@@ -720,24 +720,31 @@ def _ndim_check(data_dim, first_dim):
_ndim_check(data_dim, first_data.ndim)

if data_dim == first_data.ndim:
if first_data.ndim == 1: # case (1,1)
if first_data.ndim == 1: # 1D
data_shape = (None, )

elif first_data.ndim == 2: # case (2,2)
elif first_data.ndim == 2: # 2D
data_shape = (None, first_data.shape[1])

elif first_data.ndim == 3: # case (3,3)
elif first_data.ndim == 3: # 3D
data_shape = (None, first_data.shape[1], first_data.shape[2])

elif first_data.ndim == 4: # 4D
data_shape = (first_data.shape[0], None, first_data.shape[2], first_data.shape[3])

else:
if data_dim == 2: # case (2,1)
if data_dim == 2: # 1D -> 2D
data_shape = (None, first_data.shape[0])
first_data = first_data[np.newaxis, :]

elif data_dim == 3: # case (3,2)
elif data_dim == 3: # 2D -> 3D
data_shape = (None, first_data.shape[0], first_data.shape[1])
first_data = first_data[np.newaxis, :, :]

elif data_dim == 4: # 3D -> 4D
data_shape = (first_data.shape[0], None, first_data.shape[1], first_data.shape[2])
first_data = first_data[:, np.newaxis, :, :]

if first_data.size == 0:
warnings.warn(f'The new dataset that is stored under the tag name \'{tag}\' is empty.')
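The shape bookkeeping in this hunk can be sketched as a standalone function (a simplified paraphrase of `_initialize_database`; `init_shape` is a hypothetical name). `None` marks the resizable HDF5 axis: the first (DIT) axis for 1D to 3D data, and axis 1 for 4D IFS cubes of shape (wavelength, DIT, y, x).

```python
def init_shape(data_dim, first_shape):
    # Return the dataset shape used when creating the HDF5 dataset:
    # None marks the axis that grows when more frames are appended.
    first_dim = len(first_shape)

    if not 1 <= data_dim <= 4 or not 1 <= first_dim <= 4:
        raise ValueError('The data dimensions should be 1D, 2D, 3D, or 4D.')

    if data_dim < first_dim:
        raise ValueError('The dimensions of the data should be equal to or '
                         'larger than the dimensions of the input data.')

    if data_dim == first_dim:
        if data_dim == 1:
            return (None, )
        if data_dim == 2:
            return (None, first_shape[1])
        if data_dim == 3:
            return (None, first_shape[1], first_shape[2])
        # 4D IFS data: the wavelength axis is fixed, the DIT axis grows
        return (first_shape[0], None, first_shape[2], first_shape[3])

    # The first chunk has one dimension less than the target dataset,
    # so a new axis is inserted (cf. the np.newaxis lines in the diff)
    if data_dim == 2:   # 1D -> 2D
        return (None, first_shape[0])
    if data_dim == 3:   # 2D -> 3D
        return (None, first_shape[0], first_shape[1])
    return (first_shape[0], None, first_shape[1], first_shape[2])  # 3D -> 4D
```

For example, a first 3D cube of shape (39, 64, 64) stored as 4D data yields (39, None, 64, 64): 39 fixed wavelengths with an extendable DIT axis.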

@@ -847,6 +854,8 @@ def _append_key(self,
data = data[np.newaxis, :]
elif data_dim == 3:
data = data[np.newaxis, :, :]
elif data_dim == 4:
data = data[:, np.newaxis, :, :]

def _type_check():
check_result = False
@@ -855,12 +864,21 @@ def _type_check():

if tmp_dim == 1:
check_result = True

elif tmp_dim == 2:
check_result = tmp_shape[1] == data.shape[1]

elif tmp_dim == 3:
# check if the spatial shape is the same
check_result = (tmp_shape[1] == data.shape[1]) and \
(tmp_shape[2] == data.shape[2])

elif tmp_dim == 4:
# check if the spectral and spatial shape is the same
check_result = (tmp_shape[0] == data.shape[0]) and \
(tmp_shape[2] == data.shape[2]) and \
(tmp_shape[3] == data.shape[3])

return check_result

if _type_check():
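The `_type_check` logic above reduces to a comparison of shape tuples, sketched here with a hypothetical helper name: 2D and 3D stacks must agree on everything except the first (DIT) axis, while 4D IFS stacks must agree on the wavelength axis (0) and the spatial axes (2, 3), since the DIT axis (1) is the one being extended.

```python
def shapes_compatible(stored_shape, new_shape):
    # Decide whether new data may be appended to an existing dataset.
    ndim = len(new_shape)

    if ndim == 1:
        return True

    if ndim == 2:
        return stored_shape[1] == new_shape[1]

    if ndim == 3:
        # the spatial shape must be the same
        return stored_shape[1:] == new_shape[1:]

    # 4D: the spectral and spatial shape must be the same
    return (stored_shape[0] == new_shape[0]
            and stored_shape[2:] == new_shape[2:])
```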
@@ -876,8 +894,15 @@ def _type_check():
if isinstance(data[0], str):
data = np.array(data, dtype='|S')

self._m_data_storage.m_data_bank[tag].resize(tmp_shape[0] + data.shape[0], axis=0)
self._m_data_storage.m_data_bank[tag][tmp_shape[0]::] = data
if data.ndim == 4:
# IFS data: (n_wavelength, n_dit, y_pos, x_pos)
self._m_data_storage.m_data_bank[tag].resize(tmp_shape[1] + data.shape[1], axis=1)
self._m_data_storage.m_data_bank[tag][:, tmp_shape[1]:, :, :] = data

else:
# Other data: n_dit is the first dimension
self._m_data_storage.m_data_bank[tag].resize(tmp_shape[0] + data.shape[0], axis=0)
self._m_data_storage.m_data_bank[tag][tmp_shape[0]:, ] = data

return None
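The resize-and-assign branch above changes which axis grows depending on the dimensionality. A minimal sketch of the resulting shape arithmetic (`appended_shape` is a hypothetical name, not part of the module):

```python
def appended_shape(stored_shape, new_shape):
    # 4D IFS cubes (n_wavelength, n_dit, y_pos, x_pos) grow along the
    # DIT axis (axis 1); all other data grow along the first axis.
    axis = 1 if len(new_shape) == 4 else 0
    out = list(stored_shape)
    out[axis] += new_shape[axis]
    return tuple(out)
```

So appending a (39, 2, 64, 64) chunk to a (39, 5, 64, 64) dataset gives (39, 7, 64, 64), whereas a (2, 64, 64) chunk appended to a (5, 64, 64) stack gives (7, 64, 64).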

13 changes: 13 additions & 0 deletions pynpoint/core/pypeline.py
@@ -3,9 +3,11 @@
"""

import os
import json
import warnings
import configparser
import collections
import urllib.request
import multiprocessing

import h5py
@@ -63,6 +65,17 @@ def __init__(self,
print(pynpoint_version)
print(len(pynpoint_version) * '=' + '\n')

try:
contents = urllib.request.urlopen('https://pypi.org/pypi/pynpoint/json').read()
data = json.loads(contents)
latest_version = data['info']['version']

except urllib.error.URLError:
latest_version = None

if latest_version is not None and pynpoint.__version__ != latest_version:
print(f'A new version ({latest_version}) is available!\n')

self._m_working_place = working_place_in
self._m_input_place = input_place_in
self._m_output_place = output_place_in
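The update check added to `Pypeline.__init__` can be sketched as two helpers (hypothetical names, restructured from the inline code above; the parsing is split out so it can be exercised without network access). It queries the PyPI JSON API and silently gives up when offline, mirroring the `try`/`except urllib.error.URLError` in the diff.

```python
import json
import urllib.error
import urllib.request


def parse_latest_version(payload):
    # The PyPI JSON API stores the latest release under info -> version.
    return json.loads(payload)['info']['version']


def check_for_update(current_version, package='pynpoint'):
    # Return the latest PyPI version if it differs from the installed
    # one, or None when offline or already up to date.
    url = f'https://pypi.org/pypi/{package}/json'

    try:
        contents = urllib.request.urlopen(url, timeout=5.).read()
        latest_version = parse_latest_version(contents)

    except urllib.error.URLError:
        return None

    return latest_version if latest_version != current_version else None
```

Note that the sketch imports `urllib.error` explicitly rather than relying on `urllib.request` pulling it in as a side effect.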
