Merge pull request #457 from cdeil/obs
Improve data and observation handling
cdeil committed Mar 15, 2016
2 parents 625642f + a0930c0 commit 4a92be9
Showing 38 changed files with 1,041 additions and 773 deletions.
4 changes: 3 additions & 1 deletion docs/background/make_models.rst
@@ -119,7 +119,9 @@ The following plots are produced with a modified version of the
:download:`wip_bg_cube_model_comparison.py
<../../examples/wip_bg_cube_model_comparison.py>` example script:

.. plot:: background/plot_bgcube_true_reco.py
TODO: remove or fix these examples

.. .. plot:: background/plot_bgcube_true_reco.py
The input counts spectrum is a power-law with an index of 1.5, in
order to have some counts at high energies with a reasonable amount
2 changes: 1 addition & 1 deletion docs/background/plot_bgcube.py
@@ -6,7 +6,7 @@
from gammapy.datasets import gammapy_extra

filename = gammapy_extra.filename('test_datasets/background/bg_cube_model_test1.fits')
bg_cube_model = Cube.read(filename, format='table', scheme='bg_cube')
bg_cube_model = Cube.read(filename, format='table', scheme='bg_cube', hdu='BACKGROUND')

fig, axes = plt.subplots(nrows=1, ncols=3)
fig.set_size_inches(16, 5., forward=True)
16 changes: 11 additions & 5 deletions docs/data/dm.rst
@@ -52,26 +52,32 @@ Exploring the data using the DataStore class works like this
base_dir: hess-crab4
observations: 4
files: 16
>>> data_store.filename(obs_id=23592, filetype='events')
>>> data_store.obs(obs_id=23592).location(hdu_class='events').path(abs_path=False)
'hess-crab4/hess_events_simulated_023592.fits'
In addition, the DataStore class has convenience properties and methods that
actually load the data and IRFs and return objects of the appropriate class:

.. code-block:: python
>>> aeff2d = data_store.load(obs_id=23592, filetype='aeff')
>>> event_list = data_store.obs(obs_id=23592).events
>>> type(event_list)
TODO
>>> aeff2d = data_store.obs(obs_id=23592).aeff
>>> type(aeff2d)
<class 'gammapy.irf.effective_area_table.EffectiveAreaTable2D'>
>>> event_list = data_store.load(obs_id=23592, filetype='events')
>>> event_list.target_radec
>>> obs.target_radec
<SkyCoord (FK5: equinox=J2000.000): (ra, dec) in deg
(83.63333333, 22.01444444)>
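
The same ``obs`` handle exposes the other per-observation IRFs as well. The
following sketch reuses the ``data_store`` from above; the ``edisp`` and
``psf`` accessors are taken from the example scripts updated in this commit
(e.g. ``examples/example_iact_data_peek.py``), so only the grouping here is
new.

.. code-block:: python

    >>> obs = data_store.obs(obs_id=23592)
    >>> obs.events   # event list for this observation
    >>> obs.aeff     # effective area IRF
    >>> obs.edisp    # energy dispersion IRF
    >>> obs.psf      # point spread function IRF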
Data Manager
++++++++++++

The data access is even more convenient with a DataManager. It is based one a data registry config file (YAML format) that specifies where data and index files are located on the user's machine. In other words, the data registry is a list of datastores that can be accessed by name. By default, Gammapy looks for data registry config files called ``data-register.yaml`` in the ``~/.gammapy`` folder. Thus, put the following in ``~/.gammapy/data-register.yaml`` in order to proceed with the example.
The data access is even more convenient with a DataManager. It is based on a data registry config file (YAML format)
that specifies where data and index files are located on the user's machine. In other words, the data registry is
a list of data stores that can be accessed by name. By default, Gammapy looks for data registry config files called
``data-register.yaml`` in the ``~/.gammapy`` folder. Thus, put the following in ``~/.gammapy/data-register.yaml``
in order to proceed with the example.

.. include:: ./example-data-register.yaml
:code: yaml
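
With that config file in place, a DataManager can hand out data stores by
name. The snippet below is only a sketch: the ``DataManager()`` constructor
reading the default config file and the dict-style lookup by store name
(``'hess-crab4'`` is a hypothetical name here) are assumptions about the API,
not shown in this diff.

.. code-block:: python

    >>> from gammapy.data import DataManager
    >>> data_manager = DataManager()             # assumed: reads ~/.gammapy/data-register.yaml
    >>> data_store = data_manager['hess-crab4']  # assumed: look up a data store by name
    >>> obs = data_store.obs(obs_id=23592)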
63 changes: 14 additions & 49 deletions docs/data/index.rst
@@ -11,64 +11,29 @@ Data and observation handling (`gammapy.data`)
Introduction
============

`gammapy.data` contains classes to represent gamma-ray data.
`gammapy.data` currently contains the `~gammapy.data.EventList` class,
as well as classes for IACT data and observation handling.

We follow the Fermi data model and FITS formats as much as possible.

The data format used is FITS and we define one container class to represent
each FITS extension. In addition we define two high-level dataset classes
that group all the data and metadata usually given for a single
observation:

* Unbinned data is represented by a `~gammapy.data.EventListDataset`, which contains:

- `~gammapy.data.EventList` - table with time, position and energy for each event
- `~gammapy.data.GoodTimeIntervals` - table of good time intervals.
Used for livetime and exposure computation.
- `~gammapy.data.TelescopeArray` - some info about the array that took the data.
Optional, not used at the moment.

* For most analysis the first step is to bin the data, which turns the
`~gammapy.data.EventListDataset` into one of:

- `~gammapy.data.CountsCubeDataset` (lon, lat, energy)
- `~gammapy.data.CountsSpectrum` (energy)
- `~gammapy.data.CountsImageDataset` (lon, lat)
- `~gammapy.data.CountsLightCurveDataset` (time)

* TODO: add IRFs to the dataset classes?

* TODO: do we need the ``*Dataset`` wrappers or can we only have
``EventList``, ``CountsCube``, ``CountsSpectrum``, ``CountsImage``, ...?

* We'll have to see if we want to copy the IRF and / or GTI and / or TelescopeArray info
over to the binned dataset classes ... at the moment it's not clear if we need that info.

Energy binning
--------------
Getting Started
===============

* `~gammapy.spectrum.Energy` and FITS "ENERGIES" extensions.
* `~gammapy.spectrum.EnergyBounds` and FITS "EBOUNDS" extensions.
You can use the `~gammapy.data.EventList` class to load gamma-ray event lists:

Spatial binning
---------------
.. code-block:: python
TODO: Should we define a "SpatialGrid" or "SpatialBinning" class that wraps
the 2d image FITS WCS and add convenience methods like generating them from scratch
e.g. centered on the target or aligned to a survey map grid?
>>> from gammapy.data import EventListDataset
>>> filename = '$GAMMAPY_EXTRA/datasets/vela_region/events_vela.fits'
>>> events = EventListDataset.read(filename)
Getting Started
===============
TODO: ``events.info()`` gives a ``KeyError: 'ONTIME'``.
Should we introduce a sub-class ``EventListIACT``?

.. code-block:: python
>>> from gammapy.data import EventListDataset
>>> events_ds = EventListDataset.read('events.fits')
>>> events_ds.info()
>>> from gammapy.data import CountsCubeDataset
# This is just an idea ... not implemented!
>>> counts_ds = CountsCubeDataset.from_events(events_ds, ...)
>>> counts_ds = events_ds.make_counts_cube(...)
>>> filename = '$GAMMAPY_EXTRA/datasets/hess-crab4-hd-hap-prod2/run023400-023599/run023523/hess_events_023523.fits.gz'
>>> events = EventListDataset.read(filename)
>>> events.info()
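
Once loaded, the event data can be explored interactively. The sketch below
assumes ``events`` is (or wraps) a `~gammapy.data.EventList`: the ``radec``
property is the one reproduced in ``docs/development/howto.rst`` below, and
``target_radec`` appears in the ``DataStore`` example in ``docs/data/dm.rst``
above.

.. code-block:: python

    >>> len(events)          # number of events (an event list is a table)
    >>> events.radec         # per-event sky coordinates as a SkyCoord
    >>> events.target_radec  # observation target position, also a SkyCoord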
Using `gammapy.data`
====================
2 changes: 1 addition & 1 deletion docs/development/howto.rst
@@ -604,7 +604,7 @@ As an example see `gammapy.data.EventList.radec`, which is reproduced here:
"""Event RA / DEC sky coordinates (`~astropy.coordinates.SkyCoord`).
"""
lon, lat = self['RA'], self['DEC']
return SkyCoord(lon, lat, unit='deg', frame='fk5')
return SkyCoord(lon, lat, unit='deg', frame='icrs')
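
For orientation, the excerpt sits inside the ``EventList`` class roughly as
sketched below. The ``@property`` decorator, the method signature and the
``Table`` base class are inferred from the excerpt and its column access; they
are not part of this diff.

.. code-block:: python

    from astropy.coordinates import SkyCoord
    from astropy.table import Table


    class EventList(Table):
        """Event list table (sketch)."""

        @property
        def radec(self):
            """Event RA / DEC sky coordinates (`~astropy.coordinates.SkyCoord`).
            """
            lon, lat = self['RA'], self['DEC']
            return SkyCoord(lon, lat, unit='deg', frame='icrs')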
Class attributes
6 changes: 3 additions & 3 deletions examples/example_cube_npred.py
@@ -1,10 +1,10 @@
"""Test npred model image computation.
"""
from astropy.units import Quantity
from astropy.coordinates import Angle
from gammapy.datasets import FermiGalacticCenter
from gammapy.utils.energy import EnergyBounds
from gammapy.irf import EnergyDependentTablePSF
from gammapy.data import (SpectralCube,
from gammapy.cube import (SpectralCube,
compute_npred_cube,
convolve_cube)

@@ -15,7 +15,7 @@

spectral_cube = spectral_cube.reproject_to(exposure_cube)

energy_bounds = Quantity([10, 30, 100, 500], 'GeV')
energy_bounds = EnergyBounds([10, 30, 100, 500], 'GeV')
npred_cube = compute_npred_cube(spectral_cube,
exposure_cube,
energy_bounds)
26 changes: 12 additions & 14 deletions examples/example_iact_data_peek.py
@@ -12,23 +12,22 @@
fig, axes = plt.subplots(2, 4, figsize=(20, 8))
axes = axes.flat

ds = DataStore.from_dir('$GAMMAPY_EXTRA/datasets/hess-crab4-hd-hap-prod2')
data_store = DataStore.from_dir('$GAMMAPY_EXTRA/datasets/hess-crab4-hd-hap-prod2')
obs = data_store.obs(obs_id=23523)

events = ds.load(obs_id=23523, filetype='events')
events.peek()
obs.events.peek()

aeff2d = ds.load(obs_id=23523, filetype='aeff')
aeff2d.plot_energy_dependence(ax=next(axes))
aeff2d.plot_offset_dependence(ax=next(axes))
obs.aeff.plot_energy_dependence(ax=next(axes))
obs.aeff.plot_offset_dependence(ax=next(axes))

aeff = aeff2d.to_effective_area_table(offset='1 deg')
aeff.plot_area_vs_energy(ax=next(axes))
aeff = obs.aeff.to_effective_area_table(offset='1 deg')
# import IPython; IPython.embed()
aeff.plot(ax=next(axes))

edisp2d = ds.load(obs_id=23523, filetype='edisp')
edisp2d.plot_bias(ax=next(axes))
edisp2d.plot_migration(ax=next(axes))
obs.edisp.plot_bias(ax=next(axes))
obs.edisp.plot_migration(ax=next(axes))

edisp = edisp2d.to_energy_dispersion(offset='1 deg')
edisp = obs.edisp.to_energy_dispersion(offset='1 deg')
edisp.plot_matrix(ax=next(axes))
# TODO: bias plot not implemented yet
# edisp.plot_bias(ax=next(axes))
@@ -38,8 +37,7 @@
# psf = irf.EnergyDependentMultiGaussPSF.read(filename)
# psf.plot_containment(ax=next(axes))

psf = ds.load(obs_id=23523, filetype='psf')
psf.plot_containment(0.68, show_safe_energy=False, ax=next(axes))
obs.psf.plot_containment(0.68, show_safe_energy=False, ax=next(axes))

# TODO: hess_bkg_offruns_023523.fits.gz

52 changes: 0 additions & 52 deletions examples/example_obs_select.py

This file was deleted.

25 changes: 12 additions & 13 deletions examples/test_curve.py
@@ -15,13 +15,13 @@
from gammapy.region import SkyCircleRegion
from gammapy.stats import significance
# from gammapy.detect import compute_ts_map
import pylab as pt
import matplotlib.pyplot as plt

pt.ion()
plt.ion()


def make_excluded_sources():
#centers = SkyCoord([84, 82], [23, 21], unit='deg')
# centers = SkyCoord([84, 82], [23, 21], unit='deg')
centers = SkyCoord([83.63, 83.63], [22.01, 22.01], unit='deg', frame='icrs')
radius = Angle('0.3 deg')
sources = SkyCircleRegion(pos=centers, radius=radius)
@@ -43,7 +43,7 @@ def make_model():

multi_array = EnergyOffsetBackgroundModel(ebounds, offset)
multi_array.fill_obs(obs_table, data_store, excluded_sources)
#multi_array.fill_obs(obs_table, data_store)
# multi_array.fill_obs(obs_table, data_store)
multi_array.compute_rate()
bgarray = multi_array.bg_rate
energy_range = Energy([1, 10], 'TeV')
@@ -70,7 +70,7 @@ def make_image():

counts_image.data += bin_events_in_image(events, counts_image).data

#interp_param = dict(bounds_error=False, fill_value=None)
# interp_param = dict(bounds_error=False, fill_value=None)

acc_hdu = fill_acceptance_image(bkg_image.header, center, table["offset"], table["Acceptance"])
acc = Quantity(acc_hdu.data, table["Acceptance"].unit) * solid_angle * livetime
@@ -94,20 +94,19 @@ def make_significance_image():
s_image.writeto("significance_image.fits", clobber=True)



def plot_model():
multi_array = EnergyOffsetBackgroundModel.read('energy_offset_array.fits')
table = Table.read('acceptance_curve.fits')
pt.figure(1)
plt.figure(1)
multi_array.counts.plot_image()
pt.figure(2)
plt.figure(2)
multi_array.livetime.plot_image()
pt.figure(3)
plt.figure(3)
multi_array.bg_rate.plot_image()
pt.figure(4)
pt.plot(table["offset"], table["Acceptance"])
pt.xlabel("offset (deg)")
pt.ylabel("Acceptance (s-1 sr-1)")
plt.figure(4)
plt.plot(table["offset"], table["Acceptance"])
plt.xlabel("offset (deg)")
plt.ylabel("Acceptance (s-1 sr-1)")

input()

