Extra -0.5 in log_posterior()? #58
Thanks for the reply. The package is really nice work! But I'm still confused about the -0.5 there. Let me try to explain my question step by step, in case I have misunderstood anything. With reference to the Bayes by Backprop paper (https://arxiv.org/pdf/1505.05424.pdf), Eq. (1) is the cost function, which can be approximated with Monte Carlo samples. This cost function is implemented in blitz/utils/variational_estimator.py, lines 63 to 68 (commit e150403). It calls the KL term implemented in blitz/losses/kl_divergence.py, lines 7 to 17 (commit e150403). In that function, the log posterior and log prior are taken from blitz/modules/linear_bayesian_layer.py, lines 92 to 93 (commit e150403). The variational posteriors of the weights and the bias are computed in blitz/modules/weight_sampler.py, lines 36 to 51 (commit e150403). Since the proposal distribution q is chosen to be Gaussian, the log posterior should be the Gaussian log-density, which contains a -0.5 only as the coefficient of log(2π) and of the squared deviation, not as a standalone constant at the end.
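For reference, the Gaussian log-density being discussed is log N(x | μ, σ²) = −log σ − 0.5·log(2π) − (x − μ)²/(2σ²). A minimal sketch (the function name `gaussian_log_pdf` is just for illustration, not the repository's API):

```python
import math

def gaussian_log_pdf(x, mu, sigma):
    # log N(x | mu, sigma^2) = -log(sigma) - 0.5*log(2*pi) - (x - mu)^2 / (2*sigma^2)
    # Note: the only -0.5 factors multiply log(2*pi) and the squared term;
    # there is no bare -0.5 constant at the end.
    return (-math.log(sigma)
            - 0.5 * math.log(2 * math.pi)
            - (x - mu) ** 2 / (2 * sigma ** 2))

# Sanity check at the mode of a standard normal:
# log N(0 | 0, 1) = -0.5*log(2*pi) ~ -0.9189
print(round(gaussian_log_pdf(0.0, 0.0, 1.0), 4))  # -0.9189
```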
I agree, and the extra -0.5 in log_posterior() should not be there.
I agree.
Yes, I think so as well. The -0.5 at the end is wrong.
@isaac-cfwong sorry for the ultra-late reply. I agree with you: it should be removed. Since it was your finding, I'll leave it to you to open the PR, or, if you'd rather not, I can do it myself. Thank you so much for spotting that mistake of mine and solving it.
Closing due to staleness.
In https://github.com/piEsposito/blitz-bayesian-deep-learning/blob/master/blitz/modules/weight_sampler.py#L50, why is there a -0.5 at the end of the line? The log-likelihood of a Gaussian does not have that -0.5.
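The claim in the question can be checked against Python's standard-library `statistics.NormalDist`: the hand-written Gaussian log-density without any trailing -0.5 matches the library's density exactly (up to floating-point error).

```python
import math
from statistics import NormalDist

# Arbitrary test point and Gaussian parameters.
x, mu, sigma = 0.7, 0.2, 1.3

# Hand-written Gaussian log-density, with no bare -0.5 at the end.
manual = (-math.log(sigma)
          - 0.5 * math.log(2 * math.pi)
          - (x - mu) ** 2 / (2 * sigma ** 2))

# Reference value from the standard library.
reference = math.log(NormalDist(mu, sigma).pdf(x))

print(abs(manual - reference) < 1e-12)  # True
```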