CosMOPED: a compressed Planck likelihood

CosMOPED = Cosmological MOPED

To compute the likelihood for the ΛCDM model with CosMOPED you need only 6 compression vectors (one per parameter) and 6 numbers (obtained by compressing the Planck data with those vectors). With these, the likelihood of a theory power spectrum given the Planck data is simply the product of 6 one-dimensional Gaussians. Extended cosmological models just require computing extra compression vectors. For more details on how this works, see https://arxiv.org/abs/1909.05869.
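
In code, evaluating such a compressed likelihood amounts to a few dot products and a sum of squares. The sketch below is a simplified illustration of the idea, not the packaged implementation; `theory_spectrum`, `vectors` and `compressed_data` are placeholder names.

```python
import numpy as np

# Sketch of a MOPED-compressed likelihood. MOPED normalises the compression
# vectors so each compressed element has unit variance and the elements are
# uncorrelated, so the log-likelihood is a simple sum of squares.
def compressed_loglike(theory_spectrum, vectors, compressed_data):
    # theory_spectrum: binned theory power spectrum, shape (n_bins,)
    # vectors: one compression vector per parameter, shape (n_params, n_bins)
    # compressed_data: the n_params numbers from compressing the Planck data
    y_theory = vectors @ theory_spectrum  # one number per parameter
    return -0.5 * np.sum((compressed_data - y_theory) ** 2)
```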

We apply the Massively Optimized Parameter Estimation and Data compression technique (MOPED, see Heavens, Jimenez & Lahav, 2000) to the public Planck 2015 temperature likelihood, and the Planck 2018 temperature and polarization likelihoods, reducing the dimensions of the data space to one number per parameter of interest.

Required packages

To use the log likelihood function:

  • numpy
  • scipy

Additional requirement for creating compression vectors:

  • classy (the CLASS Python wrapper), used to compute the theory power spectra when building new compression vectors

Usage

For a quick start example see cosmoped_likelihood_example.py

Compression vectors

The ΛCDM compression vectors are pre-computed, so if that is the model you want, skip straight to the Likelihood section.

If you want to create compression vectors for a different cosmological model, you can do so by running

python compression_vectors.py inifiles/settings.ini

where the settings.ini inifile points to the appropriate compression_inifile, which specifies which parameters to calculate compression vectors for and what their fiducial values should be.

NB: the naming conventions for parameters in the compression inifile are the same as for the CLASS Python wrapper, so omega_b = Ω_b h² and omega_cdm = Ω_cdm h², etc.
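
For background, the compression vectors follow the MOPED construction of Heavens, Jimenez & Lahav (2000): one vector per parameter, built from the derivatives of the fiducial power spectrum with respect to that parameter. The sketch below illustrates that construction with placeholder names; it is a simplified outline of the published algorithm, not the internals of compression_vectors.py.

```python
import numpy as np

# dmu: derivatives of the binned fiducial spectrum w.r.t. each parameter,
#      shape (n_params, n_bins)
# cov: covariance matrix of the binned data, shape (n_bins, n_bins)
def moped_vectors(dmu, cov):
    cinv = np.linalg.inv(cov)
    vectors = []
    for dmu_p in dmu:
        b = cinv @ dmu_p
        norm2 = dmu_p @ cinv @ dmu_p
        # Gram-Schmidt-like step so the compressed numbers are uncorrelated
        for b_q in vectors:
            proj = dmu_p @ b_q
            b -= proj * b_q
            norm2 -= proj ** 2
        vectors.append(b / np.sqrt(norm2))  # unit variance per compressed element
    return np.array(vectors)
```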

Likelihood

  1. import the CosMOPED class
from cosmoped_likelihood import CosMOPED
  2. initialize a CosMOPED object, specifying
  • path: to CosMOPED compression vectors for the parameters you are interested in
  • year: 2015 or 2018 to use the Planck 2015 or 2018 data releases
  • spectra: 'TT' for just temperature, or 'TTTEEE' for TT, TE and EE spectra
  • use_low_ell_bins: True to use two low-l temperature bins, False to use only the l>=30 data
path='../compression_vectors/output/LambdaCDM/'
TT2018_LambdaCDM=CosMOPED(path, year=2018, spectra='TT', use_low_ell_bins=False)
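
Other configurations follow the same pattern; for example, to use the 2018 TT, TE and EE spectra together with the two low-l temperature bins:

```python
TTTEEE2018_LambdaCDM = CosMOPED(path, year=2018, spectra='TTTEEE', use_low_ell_bins=True)
```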

A note on compression vectors:

  • The CosMOPED compression vectors for the ΛCDM parameters (h, omega_b, omega_cdm, tau_reio, A_s, n_s) already exist in compression_vectors/output, so to get the log likelihood for these you don't need to make any new compression vectors.
  • note omega_b = Ω_b h² and omega_cdm = Ω_cdm h² (CLASS Python wrapper naming conventions)
  3. call the likelihood function with your theoretical TT, TE, and EE spectra (from e.g. CLASS or CAMB)
loglike=TT2018_LambdaCDM.loglike(Dltt, Dlte, Dlee, ellmin)

Note:

  • the log likelihood function expects the spectra in the form D_l = l(l+1)C_l / 2π
  • Dltt, Dlte and Dlee should all cover the same l range, usually starting from a minimum l of 0 or 2
  • ellmin=2 by default; if your spectra start at l=0, specify this with ellmin=0
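
Putting the pieces together, a complete call might look like the sketch below, which uses CAMB to compute the theory spectra (the cosmological parameter values are illustrative only; see cosmoped_likelihood_example.py for the packaged example).

```python
import camb
from cosmoped_likelihood import CosMOPED

# set up CosMOPED with the precomputed LambdaCDM compression vectors
path = '../compression_vectors/output/LambdaCDM/'
TT2018_LambdaCDM = CosMOPED(path, year=2018, spectra='TT', use_low_ell_bins=False)

# compute theory D_l = l(l+1)C_l/2pi in muK^2 with CAMB, out to l=2508
# to cover the Planck high-l bins
pars = camb.CAMBparams()
pars.set_cosmology(H0=67.4, ombh2=0.0224, omch2=0.120, tau=0.054)
pars.InitPower.set_params(As=2.1e-9, ns=0.965)
pars.set_for_lmax(2508, lens_potential_accuracy=1)
results = camb.get_results(pars)
powers = results.get_cmb_power_spectra(pars, lmax=2508, CMB_unit='muK')['total']
Dltt, Dlee, Dlte = powers[:, 0], powers[:, 1], powers[:, 3]

# CAMB arrays start at l=0, so pass ellmin=0
loglike = TT2018_LambdaCDM.loglike(Dltt, Dlte, Dlee, ellmin=0)
print(loglike)
```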

Please cite

The Planck 2018 likelihood paper or the Planck 2015 likelihood paper (arXiv version), depending on which data release you use, because we use data files from the public Planck plik-lite likelihood code.

Our paper: https://arxiv.org/abs/1909.05869
