Merge pull request #32 from HealthyPear/issue-23
Updated documentation (Closes #23)
HealthyPear committed Jan 14, 2020
2 parents e543885 + 4f6be8d commit 733d2bf
Showing 6 changed files with 427 additions and 177 deletions.
18 changes: 12 additions & 6 deletions docs/conf.py
@@ -49,6 +49,10 @@
"sphinx_automodapi.smart_resolver",
]

# sphinx_automodapi: avoid having methods and attributes of classes being shown
# multiple times.
numpydoc_show_class_members = False

# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]

@@ -94,13 +98,15 @@
todo_include_todos = True

autoclass_content = "both" # include both class docstring and __init__
autodoc_default_flags = [
    "members",
    "inherited-members",
    "private-members",
    "show-inheritance",
]

autodoc_default_options = {
    # Make sure that any autodoc declarations show the right members
    "members": True,
    "inherited-members": True,
    "private-members": True,
    "show-inheritance": True,
}

autosummary_generate = True # Make _autosummary files and include them
napoleon_numpy_docstring = False # Force consistency, leave only Google
napoleon_use_rtype = False # More legible
98 changes: 89 additions & 9 deletions docs/index.rst
@@ -9,46 +9,126 @@ What is protopipe?
`Protopipe` is a pipeline prototype for the `Cherenkov Telescope Array
<https://www.cta-observatory.org/>`_ (CTA) based on the `ctapipe
<https://cta-observatory.github.io/ctapipe/>`_ library.
The package is currently developed and tested at the CEA in the department
of astrophysics.

The pipeline provides scripts to:

* Process simtelarray files and write DL1 or DL2 tables
* Build regression or classification models with diagnostic plots
* Estimate the best cutoffs which give the minimal sensitivity
reachable in a given amount of time
* Produce instrument response functions (IRF), including sensitivity

In order to process a significant amount of events, using the GRID quickly
becomes mandatory. Utility scripts to submit jobs on the GRID are provided in
the `GRID repository <https://drf-gitlab.cea.fr/CTA-Irfu/grid>`_.

.. warning::

   | For the moment *protopipe* supports (and is tested with) simtel Monte Carlo files obtained with prod3b with only LSTCam and NectarCam cameras.
   | Any other kind of camera could lead to a crash (see e.g. `this <https://github.com/cta-observatory/protopipe/issues/22>`_ open issue).
   | Note that some generic La Palma files can contain FlashCam cameras.

.. warning::

   This is not yet stable code, so expect large and rapid changes.

Installation
============
It is recommended to build a new environment with conda.
You can follow the instructions on the `ctapipe installation`_ page.
In order to use protopipe the following modules are required:

* the `ctapipe`_ library
* the `gammapy`_ library
* the `pywi-cta`_ module

To install protopipe itself, run ``python setup.py install`` in your conda environment.

Requirements
------------

The only requirement is an Anaconda installation that supports Python 3.


.. Note::

   For faster startup, edit your preferred login script (e.g. ``.bashrc`` or
   ``.profile``) with a function that initializes the environment. The
   following is a minimal example using Bash.

   .. code-block:: bash

      function protopipe_init() {
          conda activate protopipe                      # activate the protopipe environment
          export PROTOPIPE=$WHEREISPROTOPIPE/protopipe  # shortcut to the scripts folder
      }

Instructions for basic users
----------------------------

If you are a basic user with no interest in developing *protopipe*, you can
use the latest released version, available
`here <https://github.com/cta-observatory/protopipe/releases>`__ as a compressed archive.

Steps for installation:

1. uncompress the archive, which is named *protopipe-X.Y.Z* according to the version,
2. enter the folder with ``cd protopipe-X.Y.Z``,
3. create a dedicated environment with ``conda env create -f protopipe_environment.yml``,
4. activate it with ``conda activate protopipe``,
5. install *protopipe* itself with ``python setup.py install``.

Instructions for advanced users
-------------------------------

If you want to use *protopipe* and also contribute to its development, follow these steps:

1. fork the official `repository <https://github.com/cta-observatory/protopipe>`_ as explained `here <https://help.github.com/en/articles/fork-a-repo>`__ (follow all the instructions)
2. your local copy is now linked to your remote repository (**origin**) and the official one (**upstream**)
3. execute points 3 and 4 of the instructions for basic users
4. install *protopipe* itself in developer mode with ``python setup.py develop``

When you want to fix a bug or develop something new:

1. update your **local** *master* branch with ``git pull upstream master``
2. create and move to a new **local** branch from your **local** *master* with ``git checkout -b your_branch``
3. develop inside it
4. push it to *origin*, thereby creating a copy of your branch also there
5. continue to develop and push until you feel ready
6. start a *pull request* using the web interface from *origin/your_branch* to *upstream/master*
7. wait for an outcome
8. if necessary, you can update or fix things in your branch because now everything is traced (**local/your_branch** --> **origin/your_branch** --> **pull request**)
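
The workflow above can be rehearsed end-to-end in a throwaway sandbox, using two local bare repositories to play the roles of **upstream** and **origin**. This is an illustrative sketch only: the temporary paths, file names, and commit messages are made up, and only a working ``git`` installation is assumed.

```shell
# Toy sandbox: two local bare repos stand in for the official repo
# ("upstream") and your fork ("origin"). Illustrative only.
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/upstream.git"     # stand-in for the official repo
git init -q --bare "$tmp/origin.git"       # stand-in for your fork
git clone -q "$tmp/upstream.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
git config user.email "you@example.com" && git config user.name "You"
git remote rename origin upstream          # the clone source is the official repo
git remote add origin "$tmp/origin.git"    # your fork becomes "origin"
git symbolic-ref HEAD refs/heads/master    # make sure the first branch is "master"
echo "protopipe" > README && git add README && git commit -qm "initial commit"
git push -q upstream master                # seed the official repo
git pull -q upstream master                # 1. update your local master
git checkout -qb your_branch               # 2. create and move to a new branch
echo "fix" > fix.txt && git add fix.txt && git commit -qm "develop something"  # 3. develop
git push -q -u origin your_branch          # 4. push the branch to your fork
git ls-remote --heads origin               # your fork now carries your_branch
```

In real life both remotes live on GitHub, but the remote names and commands are the same.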

.. Note::

   If your developments take a relatively long time, consider updating your
   **local** *master* branch periodically.

   If in doing so you see that the files you are working on have been modified
   *upstream*:

   * move into your **local** branch,
   * merge the new master into your branch with ``git merge master``,
   * resolve any conflicts,
   * push to *origin*.

   In this way, your pull request will be up-to-date with the master branch into
   which you want to merge your changes. If your changes are relatively small and
   `you know what you are doing <https://www.atlassian.com/git/tutorials/merging-vs-rebasing>`_,
   you can use ``git rebase master`` instead of merging.
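
The update-and-merge sequence can be tried safely in a scratch repository before touching real work. The following toy demo (file names and messages are invented; only ``git`` is assumed) lets *master* move on while a branch is in progress, then merges the new master in.

```shell
# Scratch-repo rehearsal: master advances while you work on your_branch,
# then you merge the new master into your branch. Illustrative only.
set -e
demo=$(mktemp -d) && cd "$demo"
git init -q repo && cd repo
git symbolic-ref HEAD refs/heads/master    # force the branch name "master"
git config user.email "you@example.com" && git config user.name "You"
echo "v1" > shared.txt && git add shared.txt && git commit -qm "initial"
git checkout -qb your_branch               # start working on a branch
echo "feature" > feature.txt && git add feature.txt && git commit -qm "feature"
git checkout -q master                     # meanwhile, master receives changes
echo "v2" > shared.txt && git commit -qam "upstream change"
git checkout -q your_branch
git merge -q master -m "merge new master"  # bring your branch up to date
cat shared.txt                             # the upstream change is now in your branch
```

Because the two branches touched different files, the merge completes without conflicts; with overlapping edits git would stop and ask you to resolve them first.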

How to?
=======
For this pipeline prototype, in order to build an analysis and estimate
the performance of the instruments, a user follows these steps:

1. Energy estimator

* produce a table with gamma-ray image information with pipeline utilities (:ref:`pipeline`)
* build a model with mva utilities (:ref:`mva`)

2. Gamma hadron classifier

* produce tables of gamma-rays and hadrons with image information with pipeline utilities (:ref:`pipeline`)
* build a model with mva utilities (:ref:`mva`)

3. DL2 production

* produce tables of gamma-rays, hadrons and electrons with event information with pipeline utilities (:ref:`pipeline`)

4. Estimate performance of the instrument

* find the best cutoff in gammaness/score, to discriminate between signal
and background, as well as the angular cut to obtain the best sensitivity
for a given amount of observation time and a given template for the
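The cutoff search mentioned in step 4 can be sketched as a toy optimization. This is illustrative only, not protopipe's actual code: the Gaussian score distributions, the cut grid, and the simplified S/sqrt(B) figure of merit are all assumptions made for the example.

```python
# Illustrative sketch of a best-cutoff scan (NOT protopipe's implementation):
# scan candidate "gammaness" cuts on toy score distributions and keep the cut
# that maximizes a simplified S/sqrt(B) significance.
import random
from math import sqrt

random.seed(42)
# Toy scores: signal (gammas) peaks near 1, background (hadrons) near 0.
gammas = [min(1.0, max(0.0, random.gauss(0.7, 0.15))) for _ in range(10_000)]
hadrons = [min(1.0, max(0.0, random.gauss(0.3, 0.15))) for _ in range(50_000)]

def best_cutoff(signal, background, n_cuts=101):
    """Return the score cut maximizing S/sqrt(B) over a grid of cuts."""
    best_cut, best_z = 0.0, float("-inf")
    for i in range(n_cuts):
        cut = i / (n_cuts - 1)
        s = sum(v >= cut for v in signal)      # surviving signal events
        b = sum(v >= cut for v in background)  # surviving background events
        if b == 0:   # no background left: this simple significance is undefined
            continue
        z = s / sqrt(b)
        if z > best_z:
            best_cut, best_z = cut, z
    return best_cut

cut = best_cutoff(gammas, hadrons)
print(f"best gammaness cut: {cut:.2f}")
```

In the real pipeline the figure of merit also folds in observation time and spectral templates, but the scan-and-maximize structure is the same idea.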
1 change: 1 addition & 0 deletions docs/mva/index.rst
@@ -257,6 +257,7 @@ Reference/API

.. automodapi:: protopipe.mva
:no-inheritance-diagram:
:skip: auc, roc_curve

.. _scikit-learn: https://scikit-learn.org/
.. _GridSearchCV: https://scikit-learn.org/stable/modules/grid_search.html
10 changes: 5 additions & 5 deletions docs/perf/index.rst
@@ -263,14 +263,14 @@ Responses
---------

Effective area
~~~~~~~~~~~~~~
^^^^^^^^^^^^^^
The collection area, which is proportional to the gamma-ray detection
efficiency, is computed as a function of the true energy. The events
considered are the ones passing the threshold of the best cutoff plus
the angular cuts.

Energy migration matrix
~~~~~~~~~~~~~~~~~~~~~~~
^^^^^^^^^^^^^^^^^^^^^^^
The migration matrix, i.e. the ratio of the reconstructed energy to the
true energy as a function of the true energy, is computed with the events
passing the threshold of the best cutoff plus the angular cuts.
@@ -279,7 +279,7 @@ the sensitivity we artificially created fake offset bins.
Presumably, Gammapy_ should be able to read IRFs with a single offset.

Background
~~~~~~~~~~
^^^^^^^^^^
The question is whether the background is an IRF or not. Since it is needed
here to estimate the sensitivity of the instrument, we consider it part of
the IRFs.
@@ -289,7 +289,7 @@ The events considered are the ones passing the threshold of
the best cutoff and the angular cuts.

Point spread function
~~~~~~~~~~~~~~~~~~~~~
^^^^^^^^^^^^^^^^^^^^^
Here we do not really need the PSF to compute the sensitivity, since the angular
cuts are already applied to the effective area, the energy migration matrix
and the background.
@@ -303,7 +303,7 @@ there are multiple solutions
(see `here <https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/full_enclosure/psf/index.html>`_).

Angular cut values
~~~~~~~~~~~~~~~~~~
^^^^^^^^^^^^^^^^^^
To be implemented: `<https://gamma-astro-data-formats.readthedocs.io/en/latest/irfs/point_like/index.html>`_

Sensitivity
