
Theta_STAR #48

Closed
vivianmiranda opened this issue Oct 16, 2019 · 7 comments

Comments

@vivianmiranda commented Oct 16, 2019

Hi,

I am trying to do something that used to be quite simple in CosmoMC.

I am sampling on omega_b, omega_c and H_0, and I have to sample on those parameters for my test.

But I want to add a Gaussian likelihood on theta_star (or theta_MC), so I tried to add

def add_theory(self):
    self.theory.needs(**{
        "theta_star": None,
    })

but it failed. I tried searching for similar keywords but couldn't find anything. What am I missing?

PS: This would help, among other things, to implement the Planck 2015 compressed likelihood. But for DES purposes we sample on H0.

Best
Vivian

@Stefan-Heimersheim

Hi Vivian,

this sounds like you just want to add a likelihood manually, so you need to add another entry to the likelihood block of your cobaya dictionary:

sampler['likelihood'] = {
    'theta_star_likelihood': {
        'external': theta_star_gaussian,
    },
    # other likelihoods ...
}

with some Gaussian log-likelihood such as

import scipy.stats

def theta_star_gaussian(theta_star):
    return scipy.stats.norm.logpdf(...)

I think there is also an even shorter way that does not require defining an external function, but the key is adding the Gaussian to your likelihoods.

Cheers,
Stefan

PS: The names of the function and the likelihood are not important; only the argument of the likelihood function has to match the name of your parameter.
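
Put together, a minimal self-contained version of this suggestion could look like the sketch below. It is only a sketch: it assumes theta_star is visible to the likelihood under exactly that name, and the loc/scale numbers are placeholder values (the Planck-like ones quoted later in this thread).

import scipy.stats

# The full cobaya input dictionary of the run (called "sampler" here,
# following the snippet above).
sampler = {}

# Gaussian log-likelihood on theta_star; loc/scale are placeholder values.
def theta_star_gaussian(theta_star):
    return scipy.stats.norm.logpdf(theta_star, loc=1.04092, scale=0.0031)

# Added alongside the other likelihoods in the input dictionary.
sampler['likelihood'] = {
    'theta_star_likelihood': {'external': theta_star_gaussian},
    # other likelihoods ...
}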

@vivianmiranda (Author) commented Nov 13, 2019

Thank you.

Sorry for the delay. I was at the DES meeting in the UK (super busy).

Will this work even if I am not sampling on \theta? I am sampling on H0 instead, and I can't change that because my work is in the context of DES. That has been my problem: how to access the \theta variable when I am sampling H0.

@Stefan-Heimersheim commented Nov 13, 2019

Hmm, I see the problem and I can't find a solution in this case.

It should be possible to get theta (probably 100*theta_s or theta_s_1e2) out of classy and use it for the likelihood. The cobaya documentation here says any parameter that CLASS understands, but I don't know how to do that, especially because 100*theta_s is a terrible variable name ...

Edit: If the parameter is added to the params block, it should be available to the custom likelihood as well.

Edit 2: Oh, I was assuming you were using CLASS. The variable name for CAMB is different, as Antony pointed out below.

@cmbant (Collaborator) commented Nov 13, 2019

Does it not work by making a new likelihood as Jesus suggested in #18? "thetastar" is a standard CAMB derived parameter, so you should be able to get it from _theory (and then rescale?).

If it doesn't work, a simple toy yaml/.py example to reproduce the issue would be helpful; we are working on some significant internal changes at the moment that may make more general dependencies possible, and it would be useful to have test cases.

@vivianmiranda (Author) commented Nov 13, 2019

I see, thank you. @JesusTorrado @cmbant, any idea how I can do this?

@cmbant (Collaborator) commented Nov 13, 2019

I haven't tried it, but as above, with something like this (assuming CAMB)?

def theta_star_gaussian(_theory={'thetastar': None}):
    thetastar = _theory.get_param('thetastar')
    ...

(or maybe "cosmomc_theta")
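
Put together, a hedged sketch of this approach: it relies on the _theory interface shown in the snippets above, requests the CAMB derived parameter "thetastar", and reuses the loc/scale values from the working solution below.

import scipy.stats

# Request the derived parameter "thetastar" from the theory code (CAMB)
# and compare it to a Gaussian measurement.
def theta_star_gaussian(_theory={'thetastar': None}):
    thetastar = _theory.get_param('thetastar')
    return scipy.stats.norm.logpdf(thetastar, loc=1.04092, scale=0.0031)

# Registered as an external likelihood (the likelihood name is arbitrary):
# sampler['likelihood']['theta_star_like'] = {'external': theta_star_gaussian}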

@vivianmiranda (Author)

Thank you

like_theta: "lambda _theory={'thetastar': None}: stats.norm.logpdf(_theory.get_param('thetastar'),loc=1.04092,scale=0.0031)"

worked. I was misspelling the thetastar variable before (I was including an underscore in the middle). You are all the best!
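
For completeness, the same one-liner can be embedded in a Python input dictionary rather than a yaml file; the sketch below mirrors the earlier snippets (the dictionary name "info" and the blocks other than the likelihood line are illustrative, not from this thread):

info = {
    'likelihood': {
        'like_theta': "lambda _theory={'thetastar': None}: stats.norm.logpdf(_theory.get_param('thetastar'), loc=1.04092, scale=0.0031)",
        # ... plus the other likelihoods used in the run
    },
    'theory': {'camb': None},  # "thetastar" is a CAMB derived parameter
    # params and sampler blocks with the omega_b, omega_c and H0 setup go here
}

In the string form, stats refers to scipy.stats, which is what lets the lambda above work as-is.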

cmbant closed this as completed Nov 14, 2019