Merge branch 'planck_gnilc_2016' into develop
gregreen committed Apr 14, 2021
2 parents b0336ae + 6e55c44 commit b8df181
Showing 8 changed files with 202 additions and 52 deletions.
24 changes: 14 additions & 10 deletions README.md
@@ -13,15 +13,16 @@ The currently supported dust maps are:

1. Burstein & Heiles (1982; BH'82)
2. Chen et al. (2014)
3. Green, Schlafly, Finbeiner et al. (2015,2018,2019; Bayestar)
3. Green, Schlafly, Finkbeiner et al. (2015,2018,2019; Bayestar)
4. Marshall et al. (2006)
5. Planck Collaboration (2013)
6. Sale et al. (2014; IPHAS)
7. Schlegel, Finkbeiner & Davis (1998; SFD'98)
8. Lenz, Hensley & Doré (2017)
9. Peek & Graves (2010)
10. Leike & Enßlin (2019)
11. Leike, Glatzle & Enßlin (2020)
6. Planck Collaboration (2016; GNILC)
7. Sale et al. (2014; IPHAS)
8. Schlegel, Finkbeiner & Davis (1998; SFD'98)
9. Lenz, Hensley & Doré (2017)
10. Peek & Graves (2010)
11. Leike & Enßlin (2019)
12. Leike, Glatzle & Enßlin (2020)

To request addition of another dust map in this package, [file an issue on
GitHub](https://github.com/gregreen/dustmaps/issues), or submit a pull request.
@@ -47,9 +48,9 @@ To fetch the data for the SFD dust map, run:

python setup.py fetch --map-name=sfd

You can download the other dust maps by changing "sfd" to "planck", "bayestar",
"iphas", "marshall", "chen2014", "lenz2017", "pg2010", "leikeensslin2019",
"leike2020" or "bh".
You can download the other dust maps by changing "sfd" to "planck",
"planckGNILC", "bayestar", "iphas", "marshall", "chen2014", "lenz2017",
"pg2010", "leikeensslin2019", "leike2020" or "bh".

Alternatively, if you have used `pip` to install `dustmaps`, then you can
configure the data directory and download the data by opening up a python
@@ -64,6 +65,9 @@ interpreter and running:
>>> import dustmaps.planck
>>> dustmaps.planck.fetch()
>>>
>>> import dustmaps.planck
>>> dustmaps.planck.fetch(which='GNILC')
>>>
>>> import dustmaps.bayestar
>>> dustmaps.bayestar.fetch()
>>>
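
Once the downloads above have finished, the maps can be queried in the same interpreter. A minimal sketch using the two Planck maps (the coordinates are arbitrary example values, not taken from the documentation):

>>> from astropy.coordinates import SkyCoord
>>> from dustmaps.planck import PlanckQuery, PlanckGNILCQuery
>>>
>>> coords = SkyCoord(180., 30., unit='deg', frame='galactic')
>>> PlanckQuery()(coords)        # Planck (2013) E(B-V), in magnitudes
>>> PlanckGNILCQuery()(coords)   # Planck (2016) GNILC E(B-V), in magnitudes
>>>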
4 changes: 4 additions & 0 deletions docs/installation.rst
@@ -29,6 +29,9 @@ Start up a python interpreter and type:
import dustmaps.planck
dustmaps.planck.fetch()
import dustmaps.planck
dustmaps.planck.fetch(which='GNILC')
import dustmaps.bayestar
dustmaps.bayestar.fetch()
@@ -102,6 +105,7 @@ to only download those you think you'll need:
python setup.py fetch --map-name=sfd
python setup.py fetch --map-name=planck
python setup.py fetch --map-name=planckGNILC
python setup.py fetch --map-name=bayestar
python setup.py fetch --map-name=iphas
python setup.py fetch --map-name=marshall
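
If ``dustmaps`` was installed with ``pip`` (so there is no ``setup.py`` checkout to run), the same selective downloads can be done from Python. A sketch, in which '/path/to/store/maps/in' is a placeholder to replace with your own data directory:

from dustmaps.config import config
config['data_dir'] = '/path/to/store/maps/in'  # placeholder path

import dustmaps.sfd
import dustmaps.planck

dustmaps.sfd.fetch()
dustmaps.planck.fetch()                # Planck Collaboration (2013) map
dustmaps.planck.fetch(which='GNILC')   # Planck Collaboration (2016) GNILC map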
17 changes: 15 additions & 2 deletions docs/maps.rst
@@ -37,8 +37,8 @@ the HI4PI Survey. This map reports E(B-V) in magnitudes.
* **See also**: `GitHub page <https://github.com/daniellenz/ebv_tools>`_.


Planck
~~~~~~
Planck Collaboration (2013)
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Two-dimensional maps of dust reddening across the entire sky. The
`Planck Collaboration (2013) <http://adsabs.harvard.edu/abs/2014A%26A...571A..11P>`_
@@ -58,6 +58,19 @@ are based on:
* **Website**: `Planck Explanatory Supplement <https://wiki.cosmos.esa.int/planckpla/index.php/CMB_and_astrophysical_component_maps#The_.5Bmath.5DE.28B-V.29.5B.2Fmath.5D_map_for_extra-galactic_studies>`_
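
For reference, a short sketch of the component selection described above, assuming the map has already been fetched (the coordinates are arbitrary example values):

from astropy.coordinates import SkyCoord
from dustmaps.planck import PlanckQuery

coords = SkyCoord(90., -10., unit='deg', frame='galactic')

ebv = PlanckQuery()(coords)                   # default 'extragalactic' component: E(B-V), in mag
tau = PlanckQuery(component='tau')(coords)    # tau_353, rescaled to E(B-V)-equivalent magnitudes
beta = PlanckQuery(component='beta')(coords)  # dust emission spectral index (unitless)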


Planck Collaboration (2016; "GNILC")
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Two-dimensional maps of dust reddening across the entire sky, using the
generalized needlet internal linear combination (GNILC) method to separate
out Galactic dust emission from CIB anisotropies.

This map contains both reddening estimates and estimated uncertainties.

* **Reference**: `Planck Collaboration (2016) <https://ui.adsabs.harvard.edu/abs/2016A%26A...596A.109P/abstract>`_
* **Website**: `Planck Explanatory Supplement <https://wiki.cosmos.esa.int/planck-legacy-archive/index.php/Foreground_maps#GNILC_thermal_dust_and_CIB_products>`_
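
Since this map ships uncertainty estimates alongside the reddening, a brief usage sketch of querying both, assuming the GNILC map has already been fetched (the coordinates are arbitrary example values):

from astropy.coordinates import SkyCoord
from dustmaps.planck import PlanckGNILCQuery

gnilc = PlanckGNILCQuery(load_errors=True)  # load E(B-V) and its uncertainty
coords = SkyCoord(120., 45., unit='deg', frame='galactic')
result = gnilc(coords)                      # structured array with 'EBV' and 'EBV_err' fields
print(result['EBV'], result['EBV_err'])     # both in magnitudes of E(B-V)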


Peek & Graves (2010)
~~~~~~~~~~~~~~~~~~~~

4 changes: 2 additions & 2 deletions docs/modules.rst
@@ -68,8 +68,8 @@ pg2010 (Peek & Graves 2010)
:special-members:
:show-inheritance:

planck (Planck Collaboration 2013)
----------------------------------
planck (Planck Collaboration 2013, 2016)
----------------------------------------
.. automodule:: dustmaps.planck
:members:
:special-members:
24 changes: 19 additions & 5 deletions dustmaps/healpix_map.py
@@ -23,6 +23,7 @@
from __future__ import print_function, division
import six

import numpy as np
import healpy as hp
import astropy.io.fits as fits

@@ -71,7 +72,8 @@ class HEALPixFITSQuery(HEALPixQuery):
A HEALPix map class that is initialized from a FITS file.
"""

def __init__(self, fname, coord_frame, hdu=0, field=None, dtype='f8'):
def __init__(self, fname, coord_frame, hdu=0, field=None,
dtype='f8', scale=None):
"""
Args:
fname (str, HDUList, TableHDU or BinTableHDU): The filename, HDUList
@@ -85,8 +87,11 @@ def __init__(self, fname, coord_frame, hdu=0, field=None, dtype='f8'):
the map from. Defaults to ``None``, meaning that ``hdu.data[:]``
is used.
dtype (Optional[str or type]): The data will be coerced to this
datatype. Can be any type specification that numpy understands.
Defaults to ``'f8'``, for IEEE754 double precision.
datatype. Can be any type specification that numpy understands,
including a structured datatype, if multiple fields are to be
loaded. Defaults to ``'f8'``, for IEEE754 double precision.
scale (Optional[:obj:`float`]): Scale factor to be multiplied into
the data.
"""
close_file = False

@@ -105,9 +110,18 @@ def __init__(self, fname, coord_frame, hdu=0, field=None, dtype='f8'):
'`BinTableHDU`.')

if field is None:
pix_val = hdu.data[:].ravel().astype(dtype)
pix_val = np.array(hdu.data[:].ravel().astype(dtype))
else:
pix_val = hdu.data[field][:].ravel().astype(dtype)
pix_val = np.array(hdu.data[field][:].ravel().astype(dtype))

if scale is not None:
names = pix_val.dtype.names
if names is None:
pix_val *= scale
else:
for n in names:
pix_val[n] *= scale

nest = hdu.header.get('ORDERING', 'NESTED').strip() == 'NESTED'

if close_file:
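
The per-field scaling added above is needed because a structured numpy array (used when both a value and its error are loaded) has no array-wide multiplication. A standalone sketch of the same pattern, with made-up pixel values:

import numpy as np

# Two example pixels with a value field and an error field.
pix_val = np.array([(1.0e-5, 2.0e-6), (3.0e-5, 4.0e-6)],
                   dtype=[('EBV', 'f4'), ('EBV_err', 'f4')])
scale = 1.49e4  # same tau_353 -> E(B-V) factor the GNILC loader passes in

# Scale each named field individually, mirroring the loop in HEALPixFITSQuery.
for name in pix_val.dtype.names:
    pix_val[name] *= scale

print(pix_val['EBV'], pix_val['EBV_err'])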
161 changes: 129 additions & 32 deletions dustmaps/planck.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python
#
# planck.py
# Reads the Planck Collaboration dust reddening map.
# Reads the Planck Collaboration dust reddening maps.
#
# Copyright (C) 2016 Gregory M. Green
#
@@ -42,40 +42,43 @@ class PlanckQuery(HEALPixFITSQuery):
def __init__(self, map_fname=None, component='extragalactic'):
"""
Args:
map_fname (Optional[:obj:`str`]): Filename of the Planck map. Defaults to
```None``, meaning that the default location is used.
component (Optional[str]): Which measure of reddening to use. There
map_fname (Optional[:obj:`str`]): Filename of the Planck map.
Defaults to ``None``, meaning that the default location is
used.
component (Optional[:obj:`str`]): Which measure of reddening to use. There
are seven valid components. Three denote reddening measures:
``'extragalactic'``, ``'tau'`` and ``'radiance'``. Four refer to dust
properties: ``'temperature'``, ``'beta'``, ``'err_temp'`` and ``'err_beta'``.
Defaults to ``'extragalactic'``.
``'extragalactic'``, ``'tau'`` and ``'radiance'``. Four refer
to dust properties: ``'temperature'``, ``'beta'``,
``'err_temp'`` and ``'err_beta'``. Defaults to
``'extragalactic'``.
"""

if map_fname is None:
map_fname = os.path.join(
data_dir(),
'planck',
'HFI_CompMap_ThermalDustModel_2048_R1.20.fits')
'HFI_CompMap_ThermalDustModel_2048_R1.20.fits'
)

if component.lower() in ('ebv', 'extragalactic'):
if component.lower() in ('ebv','extragalactic'):
field = 'EBV'
self._scale = 1.
elif component.lower() in ('tau', 'tau353', 'tau_353', 'optical depth'):
elif component.lower() in ('tau','tau353','tau_353','optical depth'):
field = 'TAU353'
self._scale = 1.49e4
elif component.lower() in ('radiance', 'r'):
elif component.lower() in ('radiance','r'):
field = 'RADIANCE'
self._scale = 5.4e5
elif component.lower() in ('temperature', 'temp', 't'):
elif component.lower() in ('temperature','temp','t'):
field = 'TEMP'
self._scale = units.Kelvin
elif component.lower() in ('sigma_temp', 'sigma_t', 'err_temp', 'err_t'):
elif component.lower() in ('sigma_temp','sigma_t','err_temp','err_t'):
field = 'ERR_TEMP'
self._scale = units.Kelvin
elif component.lower() in ('beta', 'b'):
elif component.lower() in ('beta','b'):
field = 'BETA'
self._scale = 1.
elif component.lower() in ('sigma_beta', 'sigma_b', 'err_beta', 'err_b'):
elif component.lower() in ('sigma_beta','sigma_b','err_beta','err_b'):
field = 'ERR_BETA'
self._scale = 1.
else:
@@ -91,7 +94,10 @@ def __init__(self, map_fname=None, component='extragalactic'):
super(PlanckQuery, self).__init__(
hdulist, 'galactic',
hdu='COMP-MAP',
field=field)
field=field,
dtype='f4',
scale=self._scale
)
except IOError as error:
print(dustexceptions.data_missing_message('planck',
'Planck Collaboration'))
@@ -103,33 +109,124 @@ def query(self, coords, **kwargs):
the class was initialized) at the specified location(s) on the sky.
Args:
coords (:obj:`astropy.coordinates.SkyCoord`): The coordinates to query.
coords (:obj:`astropy.coordinates.SkyCoord`): The coordinates to
query.
Returns:
A float array of the selected Planck component, at the given
coordinates. The shape of the output is the same as the shape of the
coordinates stored by ``coords``. If extragalactic E(B-V), tau_353
input coordinate array, ``coords``. If extragalactic E(B-V), tau_353
or radiance was chosen, then the output has units of magnitudes of
E(B-V). If the selected Planck component is temperature (or
temperature error), then an :obj:`astropy.Quantity` is returned, with
units of Kelvin. If beta (or beta error) was chosen, then the output
is unitless.
temperature error), then an :obj:`astropy.Quantity` is returned,
with units of Kelvin. If beta (or beta error) was chosen, then the
output is unitless.
"""
return self._scale * super(PlanckQuery, self).query(coords, **kwargs)
return super(PlanckQuery, self).query(coords, **kwargs)


def fetch():
class PlanckGNILCQuery(HEALPixFITSQuery):
"""
Downloads the Planck Collaboration (2013) dust map, placing it in the
default ``dustmaps`` data directory.
Queries the Planck Collaboration (2016) GNILC dust map.
"""
url = 'http://pla.esac.esa.int/pla/aio/product-action?MAP.MAP_ID=HFI_CompMap_ThermalDustModel_2048_R1.20.fits'
md5 = '8d804f4e64e709f476a63f0dfed1fd11'
fname = os.path.join(
data_dir(),
'planck',
'HFI_CompMap_ThermalDustModel_2048_R1.20.fits')
fetch_utils.download_and_verify(url, md5, fname=fname)
def __init__(self, map_fname=None, load_errors=False):
"""
Args:
map_fname (Optional[:obj:`str`]): Filename of the Planck map.
Defaults to ``None``, meaning that the default location is
used.
load_errors (Optional[:obj:`bool`]): If ``True``, then the error
estimates will be loaded as well, and returned with any query.
If ``False`` (the default), then queries will only return
the reddening estimate, without any error estimate.
"""

if load_errors:
self._has_errors = True
field = None
dtype = [('EBV','f4'), ('EBV_err','f4')]
else:
self._has_errors = False
field = 'TAU353'
dtype = 'f4'

if map_fname is None:
map_fname = os.path.join(
data_dir(),
'planck',
'COM_CompMap_Dust-GNILC-Model-Opacity_2048_R2.01.fits'
)

try:
with fits.open(map_fname) as hdulist:
super(PlanckGNILCQuery, self).__init__(
hdulist, 'galactic',
hdu=1,
field=field,
dtype=dtype,
scale=1.49e4
)
except IOError as error:
print(dustexceptions.data_missing_message('planck',
'Planck GNILC'))
raise error

def has_errors(self):
"""
Returns ``True`` if the error estimates have been loaded.
"""
return self._has_errors

def query(self, coords, **kwargs):
"""
Returns E(B-V) at the specified location(s) on the sky.
Args:
coords (:obj:`astropy.coordinates.SkyCoord`): The coordinates to
query.
Returns:
If the error estimates have been loaded, then a structured array
containing ``'EBV'`` and ``'EBV_err'`` is returned. Otherwise,
returns a float array of E(B-V), at the given coordinates. The
shape of the output is the same as the shape of the input
coordinate array, ``coords``.
"""
return super(PlanckGNILCQuery, self).query(coords, **kwargs)


def fetch(which='2013'):
"""
Downloads the Planck dust maps, placing them in the default ``dustmaps``
directory. There are two different Planck dust maps that can be
downloaded: the Planck Collaboration (2013) map and the "GNILC" (Planck
Collaboration 2016) map.
Args:
which (Optional[:obj:`str`]): The name of the dust map to download.
Should be either ``2013`` (the default) or ``GNILC``.
"""
planck_maps = {
'2013': {
'url': 'http://pla.esac.esa.int/pla/aio/product-action?MAP.MAP_ID=HFI_CompMap_ThermalDustModel_2048_R1.20.fits',
'md5': '8d804f4e64e709f476a63f0dfed1fd11',
'fname': 'HFI_CompMap_ThermalDustModel_2048_R1.20.fits'
},
'GNILC': {
'url': 'http://pla.esac.esa.int/pla/aio/product-action?MAP.MAP_ID=COM_CompMap_Dust-GNILC-Model-Opacity_2048_R2.01.fits',
'md5': 'fc385c2ee5e82edf039cbca6e82d6872',
'fname': 'COM_CompMap_Dust-GNILC-Model-Opacity_2048_R2.01.fits'
}
}
if which not in planck_maps:
raise ValueError(
'Unknown map: "{}". Must be one of {}.'.format(
which, tuple(planck_maps.keys())
)
)
props = planck_maps[which]
fname = os.path.join(data_dir(), 'planck', props['fname'])
fetch_utils.download_and_verify(props['url'], props['md5'], fname=fname)


def main():
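
To tie the new pieces together, a brief end-to-end sketch of the fetch() dispatch and the GNILC query class defined above (the which='2015' call is deliberately invalid, to show the error path):

import dustmaps.planck
from dustmaps.planck import PlanckGNILCQuery

# Download both Planck maps into the 'planck' subdirectory of the data directory.
dustmaps.planck.fetch()                # Planck Collaboration (2013) map
dustmaps.planck.fetch(which='GNILC')   # Planck Collaboration (2016) GNILC map

# Any other name raises a ValueError listing the valid choices.
try:
    dustmaps.planck.fetch(which='2015')
except ValueError as err:
    print(err)

# has_errors() reports whether the uncertainty estimates were loaded.
gnilc = PlanckGNILCQuery(load_errors=True)
print(gnilc.has_errors())  # True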
