
show inference for pi in print.summary.varbvs #24

Open
timflutre opened this issue Jul 5, 2018 · 4 comments
@timflutre
Since you say in your paper that the inference for pi is quite accurate (compared to full MCMC), I think it would be of interest to show it in the output of print.summary.varbvs. If I'm not mistaken, one could add the following commands, for instance just before the line "Selected variables by probability cutoff":

(pi.hat <- 10^(out$logodds$x0) / (1 + 10^(out$logodds$x0)))
(pi.hat.low <- 10^(out$logodds$a) / (1 + 10^(out$logodds$a)))
(pi.hat.high <- 10^(out$logodds$b) / (1 + 10^(out$logodds$b)))

and then:

cat("pi: ")
cat(sprintf("%0.3f [%0.3f,%0.3f]\n", pi.hat, pi.hat.low, pi.hat.high))

What do you think?

@pcarbo (Owner)
pcarbo commented Jul 5, 2018

@timflutre It is accurate under the right circumstances; it is not always more accurate than MCMC.

Yes, I could add a line for pi, but I chose to show the prior log-odds instead, since that is the hyperparameter used in the model.

Do you think it would be confusing to show both logodds and pi?

@timflutre (Author)

I tend to think that some users prefer pi and others logodds, hence both could be reported. To avoid confusion, you could simply add a sentence in the manual saying that one can be obtained from the other.
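
To illustrate the conversion being suggested, here is a minimal R sketch. The helper names logodds2pi and pi2logodds are hypothetical, not part of varbvs; the only assumption taken from the thread is that the log-odds are base 10, matching the 10^(...) expressions above.

```r
# Convert between prior log-odds (assumed base 10, as in the
# expressions proposed above) and the prior inclusion probability pi.
# These helper functions are illustrative only, not varbvs API.
logodds2pi <- function(logodds) 10^logodds / (1 + 10^logodds)
pi2logodds <- function(p) log10(p / (1 - p))

logodds2pi(0)    # 0.5: even odds correspond to pi = 0.5
pi2logodds(0.5)  # 0:   and vice versa
```

Either quantity determines the other, so reporting both is purely a matter of presentation.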

@pcarbo (Owner)

pcarbo commented Jul 9, 2018

@timflutre I'm going to leave this open for now, but I have added a sentence about converting between the prior log-odds and the prior inclusion probability (see commit 022e0da). Part of the issue here is that I don't have a consistent notation for the prior inclusion probability (pi is awkward).

@timflutre (Author)

@pcarbo OK, no problem, it's only a (very) small issue.
