Merge remote-tracking branch 'origin/PACO' into PACO
Evert Nasedkin committed Apr 29, 2020
2 parents 0ee0b2e + 270e8e8 commit 08c1a67
Showing 9 changed files with 134 additions and 79 deletions.
2 changes: 2 additions & 0 deletions docs/coding.rst
@@ -5,6 +5,8 @@ Coding a New Module

.. _constructor:

There are three different types of pipeline modules: :class:`~pynpoint.core.processing.ReadingModule`, :class:`~pynpoint.core.processing.WritingModule`, and :class:`~pynpoint.core.processing.ProcessingModule`. The concept is similar for these three types of modules, so here we will only explain how to code a processing module.
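
As a rough sketch of this structure (the module name, tags, and the exact port methods used below are illustrative assumptions, not a prescribed implementation), a minimal processing module could look like this::

    from pynpoint.core.processing import ProcessingModule


    class ExampleModule(ProcessingModule):
        """
        Hypothetical module that multiplies all images by a constant factor.
        """

        def __init__(self,
                     name_in: str = 'example',
                     image_in_tag: str = 'science',
                     image_out_tag: str = 'science_scaled',
                     scaling: float = 2.) -> None:

            super().__init__(name_in)

            # Ports connect the module to the central database
            self.m_image_in_port = self.add_input_port(image_in_tag)
            self.m_image_out_port = self.add_output_port(image_out_tag)

            self.m_scaling = scaling

        def run(self) -> None:
            # Read the images, apply the (trivial) processing step, write the result
            images = self.m_image_in_port.get_all()
            self.m_image_out_port.set_all(self.m_scaling*images)

            # Copy the attributes, store a history entry, and close the database
            self.m_image_out_port.copy_attributes(self.m_image_in_port)
            self.m_image_out_port.add_history('ExampleModule', f'scaling = {self.m_scaling}')
            self.m_image_out_port.close_port()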

Class Constructor
-----------------

2 changes: 1 addition & 1 deletion docs/pynpoint.processing.rst
@@ -83,7 +83,7 @@ pynpoint.processing.limits module
:members:
:undoc-members:
:show-inheritance:

pynpoint.processing.pacoModule module
-------------------------------------

24 changes: 21 additions & 3 deletions docs/python.rst
@@ -26,8 +26,6 @@ The modular architecture of PynPoint allows for easy implementation of new pipel

<a href="https://docs.python.org/3/library/abc.html" target="_blank">Abstract classes</a>

There are three different types of pipeline modules: :class:`~pynpoint.core.processing.ReadingModule`, :class:`~pynpoint.core.processing.WritingModule`, and :class:`~pynpoint.core.processing.ProcessingModule`. The concept is similar for the three types of modules so here we will explain only how to code a processing module.

.. _conventions:

Conventions
@@ -52,4 +50,24 @@ Before we start writing a new PynPoint module, please take notice of the followi

<a href="https://pypi.org/project/pycodestyle/" target="_blank">pycodestyle</a>

Now we are ready to code!
Unit tests
----------

PynPoint is a robust pipeline package with 95% of the code covered by |unittest|. Testing of the package is done by running ``make test`` in the cloned repository. This requires the installation of:

* |pytest|
* |pytest-cov|

The unit tests ensure that the output from existing functionalities will not change whenever new code is added. With these things in mind, we are now ready to code!
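
As a hedged illustration of the pattern (the pipeline, module, and tag names below are assumptions for the sketch, and the real test suite first creates synthetic images in a setup method, so this is structural rather than copy-paste ready), a pytest-style test of a pipeline module might be organized as follows::

    from pynpoint import Pypeline, CropImagesModule


    def test_crop_images(tmp_path) -> None:
        # The pipeline needs a working, input, and output place; in the actual
        # test suite, synthetic data are written under the 'science' tag first
        pipeline = Pypeline(working_place_in=str(tmp_path),
                            input_place_in=str(tmp_path),
                            output_place_in=str(tmp_path))

        module = CropImagesModule(size=0.3,
                                  center=None,
                                  name_in='crop',
                                  image_in_tag='science',
                                  image_out_tag='science_crop')

        pipeline.add_module(module)
        pipeline.run_module('crop')

        data = pipeline.get_data('science_crop')
        assert data.ndim == 3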

.. |unittest| raw:: html

<a href="https://docs.python.org/3/library/unittest.html" target="_blank">unit tests</a>

.. |pytest| raw:: html

<a href="https://docs.pytest.org/en/latest/getting-started.html" target="_blank">pytest</a>

.. |pytest-cov| raw:: html

<a href="https://pytest-cov.readthedocs.io/en/latest/readme.html" target="_blank">pytest-cov</a>
8 changes: 5 additions & 3 deletions pull_request_template.md
@@ -3,7 +3,9 @@ Thank you for your contribution to the PynPoint repo! Before submitting this PR,
- [ ] To read the documentation page on the [Python guidelines](https://pynpoint.readthedocs.io/en/latest/python.html).
- [ ] That your branch of the PR is up to date with the master branch of the upstream PynPoint repo.
- [ ] To run both `pycodestyle` and `pylint` on the code that has been added and/or changed.
- [ ] That the documentation is successfully build after running `make docs` in your local repo folder.
- [ ] That all unit tests are finishing after running `make test` in your local repo folder.
- [ ] That the documentation is successfully built after running `make docs` in your local repo folder. This requires the installation of `sphinx` and `sphinx_rtd_theme`.
- [ ] That all unit tests pass after running `make test` in your local repo folder. This requires the installation of `pytest` and `pytest-cov`.
- [ ] To add unit tests in case there are new pipeline modules and/or functionalities added.
- [ ] Make sure that only text files have been added and changed in the commits of the PR. Binary files will clutter up the repo because even after removing such files they will remain in the repo history.
- [ ] That only text files have been added and changed in the commits of the PR. Binary files will clutter up the repo because even after removing such files they will remain in the repo history.
- [ ] To add and/or update the docstrings.
- [ ] To add and/or update the typehints and typechecks.
103 changes: 57 additions & 46 deletions pynpoint/processing/fluxposition.py
@@ -19,7 +19,7 @@
from photutils.aperture import Aperture

from pynpoint.core.processing import ProcessingModule
from pynpoint.util.analysis import fake_planet, merit_function, false_alarm, gaussian_noise
from pynpoint.util.analysis import fake_planet, merit_function, false_alarm, pixel_variance
from pynpoint.util.image import create_mask, polar_to_cartesian, cartesian_to_polar, \
center_subpixel, rotate_coordinates
from pynpoint.util.mcmc import lnprob
@@ -158,8 +158,8 @@ def run(self) -> None:

class SimplexMinimizationModule(ProcessingModule):
"""
Pipeline module to measure the flux and position of a planet by injecting negative fake planets
and minimizing a figure of merit.
Pipeline module to retrieve the contrast and position of a planet by injecting negative
artificial planets and using a downhill simplex method.
"""

__author__ = 'Tomas Stolker'
@@ -213,11 +213,11 @@ def __init__(self,
Additional scaling factor of the planet flux (e.g., to correct for a neutral density
filter). Should be negative in order to inject negative fake planets.
merit : str
Figure of merit for the minimization. Can be 'hessian', to minimize the sum of the
absolute values of the determinant of the Hessian matrix, or 'poisson', to minimize the
sum of the absolute pixel values, assuming a Poisson distribution for the noise
(Wertz et al. 2017), or 'gaussian', to minimize the ratio of the squared pixel values
and the variance of the pixels within an annulus but excluding the aperture area.
Figure of merit for the minimization ('hessian', 'gaussian', or 'poisson'). Either the
determinant of the Hessian matrix is minimized ('hessian') or the flux of each pixel
('gaussian' or 'poisson'). For the latter case, the estimated noise is assumed to follow
a Poisson (see Wertz et al. 2017) or Gaussian distribution (see Wertz et al. 2017 and
Stolker et al. 2020).
aperture : float
Aperture radius (arcsec) at the position specified at *position*.
sigma : float
@@ -309,8 +309,8 @@ def __init__(self,
def run(self) -> None:
"""
Run method of the module. The position and contrast of a planet is measured by injecting
negative copies of the PSF template and applying a simplex method (Nelder-Mead) for
minimization of a figure of merit at the planet location.
negative copies of the PSF template and applying a downhill simplex method (Nelder-Mead)
for minimization of a figure of merit at the planet location.
Returns
-------
@@ -351,7 +351,7 @@ def _objective(arg: np.ndarray,
count: int,
n_components: int,
sklearn_pca: Optional[PCA],
noise: Optional[float]) -> float:
var_noise: Optional[float]) -> float:

pos_y = arg[0]
pos_x = arg[1]
@@ -406,7 +406,7 @@ def _objective(arg: np.ndarray,
merit=self.m_merit,
aperture=aperture,
sigma=self.m_sigma,
noise=noise)
var_noise=var_noise)

position = rotate_coordinates(center, (pos_y, pos_x), -self.m_extra_rot)

@@ -464,21 +464,23 @@ def _objective(arg: np.ndarray,

sklearn_pca.components_ = q_ortho.T

if self.m_merit in ('poisson', 'hessian'):
noise = None
if self.m_merit == 'poisson':
var_noise = None

elif self.m_merit == 'gaussian':
noise = gaussian_noise(images=images,
parang=parang,
cent_size=self.m_cent_size,
edge_size=self.m_edge_size,
pca_number=n_components,
residuals=self.m_residuals,
aperture=aperture)
elif self.m_merit in ['gaussian', 'hessian']:
var_noise = pixel_variance(var_type=self.m_merit,
images=images,
parang=parang,
cent_size=self.m_cent_size,
edge_size=self.m_edge_size,
pca_number=n_components,
residuals=self.m_residuals,
aperture=aperture,
sigma=self.m_sigma)

minimize(fun=_objective,
x0=np.array([pos_init[0], pos_init[1], self.m_magnitude]),
args=(i, n_components, sklearn_pca, noise),
args=(i, n_components, sklearn_pca, var_noise),
method='Nelder-Mead',
tol=None,
options={'xatol': self.m_tolerance, 'fatol': float('inf')})
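
For reference, the minimize call above follows the standard SciPy Nelder-Mead pattern. The following standalone sketch mirrors that structure on toy data with a simplified 'gaussian'-style figure of merit (the PSF model, image size, and all numbers are made up for illustration):

    import numpy as np
    from scipy.optimize import minimize

    # Toy data: a point source on a noisy background
    rng = np.random.default_rng(1)
    grid_y, grid_x = np.mgrid[0:41, 0:41]

    def psf_model(pos_y, pos_x, flux):
        return flux*np.exp(-0.5*((grid_y - pos_y)**2 + (grid_x - pos_x)**2)/2.**2)

    image = psf_model(22.3, 18.7, 150.) + rng.normal(0., 1., size=(41, 41))
    var_noise = 1.  # in the module, this variance is estimated beforehand

    def objective(arg):
        pos_y, pos_x, flux = arg
        # Subtract (i.e. inject the negative of) the model and evaluate the sum
        # of squared residual pixels normalized by the noise variance
        residuals = image - psf_model(pos_y, pos_x, flux)
        return float(np.sum(residuals**2)/var_noise)

    result = minimize(fun=objective,
                      x0=np.array([20., 20., 100.]),
                      method='Nelder-Mead',
                      options={'xatol': 0.01, 'fatol': float('inf')})

    print(result.x)  # approximately (22.3, 18.7, 150.)

Setting fatol to infinity means that convergence is governed by xatol alone, which matches how the module exposes a single tolerance parameter for the position and contrast.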
@@ -620,7 +622,7 @@ def _snr_optimize(arg: np.ndarray) -> float:
size=self.m_aperture,
ignore=self.m_ignore)

return -snr
return -1.*snr

pixscale = self.m_image_in_port.get_attribute('PIXSCALE')
self.m_aperture /= pixscale
@@ -741,12 +743,11 @@ def __init__(self,
extra_rot : float
Additional rotation angle of the images (deg).
merit : str
Figure of merit that is used for the likelihood function ('gaussian' or 'poisson').
Pixels are assumed to be independent measurements which are expected to be equal to
zero in case the best-fit negative PSF template is injected. With 'gaussian', the
variance is estimated from the pixel values within an annulus at the separation of
the aperture (but excluding the pixels within the aperture). With 'poisson', a Poisson
distribution is assumed for the variance of each pixel value (see Wertz et al. 2017).
Figure of merit for the minimization ('hessian', 'gaussian', or 'poisson'). Either the
determinant of the Hessian matrix is minimized ('hessian') or the flux of each pixel
('gaussian' or 'poisson'). For the latter case, the estimated noise is assumed to follow
a Poisson (see Wertz et al. 2017) or Gaussian distribution (see Wertz et al. 2017 and
Stolker et al. 2020).
residuals : str
Method used for combining the residuals ('mean', 'median', 'weighted', or 'clipped').
@@ -854,17 +855,28 @@ def run(self) -> None:
elif isinstance(self.m_aperture, tuple):
aperture = (self.m_aperture[1], self.m_aperture[0], self.m_aperture[2]/pixscale)

print(f'Aperture position [x, y]: [{aperture[1]}, {aperture[0]}]')
print(f'Aperture radius (pix): {aperture[2]:.2f}')
print(f'Figure of merit: {self.m_merit}')

if self.m_merit == 'poisson':
noise = None
var_noise = None

elif self.m_merit in ['gaussian', 'hessian']:
var_noise = pixel_variance(var_type=self.m_merit,
images=images,
parang=parang,
cent_size=self.m_mask[0],
edge_size=self.m_mask[1],
pca_number=self.m_pca_number,
residuals=self.m_residuals,
aperture=aperture,
sigma=0.)

elif self.m_merit == 'gaussian':
noise = gaussian_noise(images=images,
parang=parang,
cent_size=self.m_mask[0],
edge_size=self.m_mask[1],
pca_number=self.m_pca_number,
residuals=self.m_residuals,
aperture=aperture)
if self.m_merit == 'gaussian':
print(f'Gaussian standard deviation (counts): {np.sqrt(var_noise):.2e}')
if self.m_merit == 'hessian':
print(f'Hessian standard deviation: {np.sqrt(var_noise):.2e}')
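
The 'gaussian' case above normalizes the residual flux by a noise variance estimated away from the planet with pixel_variance. A simplified stand-in for such an estimate (not the actual pynpoint.util.analysis.pixel_variance implementation; the frame, position, and radius below are hypothetical) could look like this:

    import numpy as np

    def annulus_variance(residual_frame, planet_pos, aperture_radius):
        # Variance of the pixels in an annulus at the separation of the planet,
        # excluding an aperture around the planet position itself
        npix = residual_frame.shape[0]
        center = (npix - 1)/2.
        yy, xx = np.mgrid[0:npix, 0:npix]

        sep_pixels = np.sqrt((yy - center)**2 + (xx - center)**2)
        sep_planet = np.sqrt((planet_pos[0] - center)**2 + (planet_pos[1] - center)**2)
        dist_planet = np.sqrt((yy - planet_pos[0])**2 + (xx - planet_pos[1])**2)

        annulus = np.abs(sep_pixels - sep_planet) < aperture_radius
        exclude = dist_planet < aperture_radius

        return float(np.var(residual_frame[annulus & ~exclude]))

    # Hypothetical usage with a 101x101 derotated and stacked residual frame
    frame = np.random.default_rng(2).normal(0., 1., (101, 101))
    var_noise = annulus_variance(frame, planet_pos=(60., 45.), aperture_radius=4.)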

initial = np.zeros((self.m_nwalkers, ndim))

@@ -892,7 +904,7 @@ def run(self) -> None:
indices,
self.m_merit,
self.m_residuals,
noise]))
var_noise]))

sampler.run_mcmc(initial, self.m_nsteps, progress=True)
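
For context, the run_mcmc call above follows the usual emcee EnsembleSampler pattern. A self-contained sketch with a Gaussian log-likelihood (the toy data, prior bounds, and walker setup below are illustrative and not the module's actual lnprob) might look like this:

    import emcee
    import numpy as np

    rng = np.random.default_rng(3)
    data = rng.normal(loc=5., scale=2., size=200)  # toy measurements
    var_noise = 4.                                 # assumed known variance

    def lnprob_toy(param, data, var_noise):
        # Flat prior with hard bounds, Gaussian likelihood for independent samples
        if not 0. < param[0] < 20.:
            return -np.inf
        return -0.5*np.sum((data - param[0])**2)/var_noise

    nwalkers, ndim, nsteps = 32, 1, 500
    initial = 5. + 0.1*rng.normal(size=(nwalkers, ndim))

    sampler = emcee.EnsembleSampler(nwalkers, ndim, lnprob_toy, args=[data, var_noise])
    sampler.run_mcmc(initial, nsteps, progress=True)

    samples = sampler.get_chain(discard=100, flat=True)  # burn-in removed, walkers flattened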

@@ -1089,12 +1101,11 @@ def __init__(self,
Additional scaling factor of the planet flux (e.g., to correct for a neutral density
filter). Should be a positive value.
merit : str
Figure of merit that is used for the likelihood function ('gaussian' or 'poisson').
Pixels are assumed to be independent measurements which are expected to be equal to
zero in case the best-fit negative PSF template is injected. With 'gaussian', the
variance is estimated from the pixel values within an annulus at the separation of
the aperture (but excluding the pixels within the aperture). With 'poisson', a Poisson
distribution is assumed for the variance of each pixel value (see Wertz et al. 2017).
Figure of merit for the minimization ('hessian', 'gaussian', or 'poisson'). Either the
determinant of the Hessian matrix is minimized ('hessian') or the flux of each pixel
('gaussian' or 'poisson'). For the latter case, the estimated noise is assumed to follow
a Poisson (see Wertz et al. 2017) or Gaussian distribution (see Wertz et al. 2017 and
Stolker et al. 2020).
aperture : float
Aperture radius (arcsec) that is used for measuring the figure of merit.
tolerance : float
7 changes: 6 additions & 1 deletion pynpoint/processing/resizing.py
@@ -83,7 +83,7 @@ def run(self) -> None:

# Convert size parameter from arcseconds to pixels
pixscale = self.m_image_in_port.get_attribute('PIXSCALE')
print(f'New image size (arcsec) = {self.m_size}')
self.m_size = int(math.ceil(self.m_size / pixscale))
print(f'New image size (pixels) = {self.m_size}')

if self.m_center is not None:
print(f'New image center (x, y) = {self.m_center}')

# Crop images chunk by chunk
start_time = time.time()
@@ -100,7 +105,7 @@
self.m_image_out_port.append(images, data_dim=3)

# Save history and copy attributes
history = f'image size [pix] = {self.m_size}'
history = f'image size (pix) = {self.m_size}'
self.m_image_out_port.add_history('CropImagesModule', history)
self.m_image_out_port.copy_attributes(self.m_image_in_port)
self.m_image_out_port.close_port()
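
As a quick check of the arcsec-to-pixel conversion printed above (the pixel scale below is only an example value; in the module it is read from the PIXSCALE attribute):

    import math

    pixscale = 0.027     # arcsec per pixel (example value)
    size_arcsec = 0.5    # requested image size (arcsec)

    size_pixels = int(math.ceil(size_arcsec/pixscale))
    print(size_pixels)   # 19, because 0.5/0.027 = 18.52 is rounded up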
