
Commit

update docs
apetri committed May 4, 2016
1 parent 0b5624a commit 3372283
Showing 6 changed files with 116 additions and 4 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -34,7 +34,7 @@ before_install:

#Install packages
install:
- conda install --yes python=$TRAVIS_PYTHON_VERSION atlas numpy scipy matplotlib nose dateutil pandas statsmodels astropy
- conda install --yes python=$TRAVIS_PYTHON_VERSION atlas numpy scipy sqlalchemy matplotlib nose dateutil pandas statsmodels astropy

#Coverage packages are on Dan Blanchard binstar channel
- conda install --yes -c dan_blanchard python-coveralls nose-cov
25 changes: 25 additions & 0 deletions docs/source/examples/parameter_priors.rst
@@ -0,0 +1,25 @@
A quick and dirty way to incorporate parameter priors
=====================================================

Suppose you carried your weak lensing analysis all the way to the parameter constraints, and you were able to estimate your parameter covariance matrix :math:`\Sigma_{lens}` (either from simulated or real data). Now suppose you want to understand how these constraints change when you add prior information from, say, CMB observations from Planck. The prior comes in the form of its own parameter covariance matrix :math:`\Sigma_{CMB}`, which may or may not have the same dimensions and parametrization as :math:`\Sigma_{lens}`. Applying the prior to the parameters considered in the weak lensing analysis, while fixing all the others, is equivalent to taking the appropriate parameter slice of :math:`\Sigma_{CMB}^{-1}` and adding the Fisher matrices

.. math:: \Sigma_{lens+CMB} = (\Sigma_{lens}^{-1}+\Sigma_{CMB}^{-1})^{-1}

This can be readily done with the functionality provided by the :py:class:`~lenstools.statistics.ensemble.SquareMatrix` class, as in the following code

::

    from lenstools.statistics.ensemble import SquareMatrix

    #Read in parameter covariances
    lens_pcov = SquareMatrix.read("lenscov.pkl")
    cmb_cov = SquareMatrix.read("cmbcov.pkl")

    #Parametrization
    parameters = ["Om","w","sigma8"]

    #Add the Fisher matrices
    fisher_lens_cmb = lens_pcov.invert()[parameters] + cmb_cov.invert()[parameters]

    #pcov_lens_cmb is the parameter covariance subject to the prior
    pcov_lens_cmb = fisher_lens_cmb.invert()
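
The same algebra can be checked with plain numpy; this is only a sketch of the formula above, and the covariance entries below are made up purely for illustration (parameter ordering assumed to be Om, w, sigma8)::

    import numpy as np

    #Toy 3x3 parameter covariances for (Om,w,sigma8); illustrative numbers only
    sigma_lens = np.diag([1.0e-3,4.0e-2,2.0e-3])
    sigma_cmb = np.diag([5.0e-4,1.0e-2,1.0e-3])

    #Add the Fisher (inverse covariance) matrices...
    fisher_lens_cmb = np.linalg.inv(sigma_lens) + np.linalg.inv(sigma_cmb)

    #...and invert back to get the combined parameter covariance
    sigma_lens_cmb = np.linalg.inv(fisher_lens_cmb)

    #Combined 1-sigma marginalized errors
    print(np.sqrt(np.diag(sigma_lens_cmb)))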
84 changes: 84 additions & 0 deletions docs/source/examples/sampling.rst
@@ -0,0 +1,84 @@
Three different ways to do parameter sampling
=============================================

This code snippet shows how to use LensTools to perform cosmological parameter sampling with three different methods: direct evaluation of the likelihood, the Fisher matrix approximation and MCMC.

::


    from lenstools.statistics.ensemble import Series,Ensemble
    from lenstools.statistics.constraints import Emulator
    from lenstools.statistics.contours import ContourPlot
    import numpy as np
    import matplotlib.pyplot as plt


    def lt_sample(emulator,test_data,covariance,p_value=0.68):

        #Check that the data types are correct
        assert isinstance(emulator,Emulator)
        assert isinstance(test_data,Series)
        assert isinstance(covariance,Ensemble)

        #Plot setup
        fig,ax = plt.subplots(figsize=(8,8))

        #Map the likelihood in the OmegaM-sigma8 plane, fix w to -1
        p = Ensemble.meshgrid({
            "Om":np.linspace(0.2,0.5,50),
            "sigma8":np.linspace(0.6,0.9,50)
        })

        p["w"] = -1.

        #Compute the chi squared scores of the test data on a variety of parameter points
        scores = emulator.score(p,test_data,covariance,correct=1000,method="chi2")
        scores["likelihood"] = np.exp(-0.5*scores[emulator.feature_names[0]])

        contour = ContourPlot.from_scores(scores,parameters=["Om","sigma8"],
                                          feature_names=["likelihood"],
                                          plot_labels=[r"$\Omega_m$",r"$\sigma_8$"],
                                          fig=fig,ax=ax)

        contour.getLikelihoodValues([p_value],precision=0.01)
        contour.plotContours(colors=["red"])
        contour.labels()

        #Approximate the emulator linearly around the maximum (Fisher matrix)
        fisher = emulator.approximate_linear(center=(0.26,-1.,0.8))

        #Consider (OmegaM,sigma8) only
        fisher.pop(("parameters","w"))
        fisher = fisher.iloc[[0,1,3]]

        #Fisher confidence ellipse
        ellipse = fisher.confidence_ellipse(covariance,
                                            correct=1000,
                                            observed_feature=test_data,
                                            parameters=["Om","sigma8"],
                                            p_value=p_value,
                                            fill=False,
                                            edgecolor="blue")

        ax.add_artist(ellipse)

        #MCMC sampling of (OmegaM,sigma8)
        samples = emulator.sample_posterior(test_data,
                                            features_covariance=covariance,
                                            correct=1000,
                                            pslice={"w":-1},
                                            sample="emcee")[emulator.feature_names[0]]

        ax.scatter(samples["Om"],samples["sigma8"],marker=".",color="black",s=1)
        ax.set_xlim(0.2,0.5)
        ax.set_ylim(0.6,0.9)

        #Save the figure
        fig.tight_layout()
        fig.savefig("parameter_sampling.png")
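
A minimal driver for the function above could look like the following sketch. The pickle file names are hypothetical, and the ``read`` calls assume you saved a trained emulator, the observed feature and the feature covariance beforehand (in the same spirit as the ``SquareMatrix.read`` usage in the parameter prior example)::

    #Load previously saved objects (file names are placeholders)
    emulator = Emulator.read("emulator.pkl")
    test_data = Series.read("test_data.pkl")
    covariance = Ensemble.read("covariance.pkl")

    #Run the three sampling methods and save the comparison figure
    lt_sample(emulator,test_data,covariance,p_value=0.68)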

And this is the resulting figure:

.. figure:: ../figures/parameter_sampling.png
Binary file added docs/source/figures/parameter_sampling.png
4 changes: 3 additions & 1 deletion docs/source/gallery.rst
@@ -13,4 +13,6 @@ Gallery
examples/eb_decomposition
examples/design
examples/gadget_io
examples/confidence_contours
examples/confidence_contours
examples/sampling
examples/parameter_priors
5 changes: 3 additions & 2 deletions docs/source/index.rst
@@ -78,6 +78,7 @@ Dependencies

.. _numpy: http://www.numpy.org
.. _scipy: http://www.scipy.org
.. _sqlalchemy: http://www.sqlalchemy.org
.. _astropy: http://www.astropy.org
.. _pandas: http://pandas.pydata.org
.. _emcee: http://dan.iel.fm/emcee/current/
@@ -90,12 +91,12 @@ Dependencies
.. _NICAEA: http://www.cosmostat.org/software/nicaea/
.. _fftw3: http://www.fftw.org

The core features require the standard numpy_, scipy_ , and additionally astropy_ (mainly for the cosmology and measure units support) and emcee_ (from which LensTools borrows the MPI Pool utility), and the Test suite requires additionally the matplotlib_ package. matpoltlib should eventually be installed if you want to use the plotting engines of LensTools. If you want to run the calculations in parallel on a computer cluster you will need to install mpi4py_ (a python wrapper for the MPI library). Installation of all these packages is advised (if you run astrophysical data analyses you should use them anyway). One of the lenstools features, namely the :py:class:`~lenstools.simulations.Design` class, requires that you have a working version of GSL_ to link to; if you don't have one, just hit *enter* during the installation process and the package will work correctly without this additional feature. The installation if the NICAEA_ bindings additionally requires a working installation of the fftw3_ library.
The core features require the standard numpy_ and scipy_, plus astropy_ (mainly for the cosmology and measurement units support) and emcee_ (from which LensTools borrows the MPI Pool utility); the test suite additionally requires the matplotlib_ package. matplotlib should also be installed if you want to use the plotting engines of LensTools. If you want to use the SQL database querying shortcuts embedded in LensTools, you will need the sqlalchemy_ package too. If you want to run the calculations in parallel on a computer cluster you will need to install mpi4py_ (a python wrapper for the MPI library). Installation of all these packages is advised (if you run astrophysical data analyses you should use them anyway). One of the LensTools features, namely the :py:class:`~lenstools.simulations.Design` class, requires a working version of GSL_ to link to; if you don't have one, just hit *enter* during the installation process and the package will work correctly without this additional feature. The installation of the NICAEA_ bindings additionally requires a working installation of the fftw3_ library.
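
For reference, the continuous integration setup in ``.travis.yml`` above installs the python dependencies with a single conda command; adapted to a local conda environment (the Travis-specific python version pin dropped, and the package list trimmed or extended as needed), it reads::

    conda install atlas numpy scipy sqlalchemy matplotlib nose dateutil pandas statsmodels astropy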

Test
----

To check that everything works before installing you can run the pre implemented test suite that comes with the source code. First you will need to install pytest_, then you need to download some data files (mainly FITS images) that the test suite depends on. You need to set the environment variable LENSTOOLS_DATA to the path where you want your data to be downloaded (for a manual download the data file can be found here_, it is almost 250MB). After that, in a python shell, type
To check that everything works before installing, you can run the pre-implemented test suite that comes with the source code. First you will need to install pytest_; then you need to download some data files (mainly FITS images) that the test suite depends on. Set the environment variable LENSTOOLS_DATA to the path where you want the data to be downloaded (for a manual download, the data file can be found here_; it is roughly 300MB). After that, in a python shell, type

::

