module.log_prior is -inf #98

Open
jordantkohn opened this issue Dec 5, 2021 · 3 comments
@jordantkohn

I'm using a BayesianLinear layer, instantiated like this:
self.dense = BayesianLinear(opt.hidden_dim, opt.polarities_dim, freeze = False)

The loss returned by model.sample_elbo() is "inf", and more specifically module.log_prior() is "-inf".
What could be causing this issue?

bayesian module: BayesianLinear(
(weight_sampler): TrainableRandomDistribution()
(bias_sampler): TrainableRandomDistribution()
(weight_prior_dist): PriorWeightDistribution()
(bias_prior_dist): PriorWeightDistribution()
)
log_vp: tensor(-1425.3866, grad_fn=)
log_prior: tensor(-inf, grad_fn=)

@piEsposito
Owner

piEsposito commented Dec 5, 2021 via email

@jordantkohn
Copy link
Author

It's not really a self-contained example, but I've traced the -inf value back to this specific call in the BayesianLinear module:

module.weight_prior_dist.log_prior(w)
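For context, a log-of-pdf computation underflows easily. The sketch below is a minimal stand-in, assuming a single Gaussian prior (blitz's actual PriorWeightDistribution is a scale mixture, and `gaussian_pdf` is a hypothetical helper, not the library's code): a sampled weight far in the tail of a narrow prior gets a density that underflows to exactly 0.0, and the log of that is -inf.

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=0.1):
    # Hypothetical stand-in for a prior density: N(mu, sigma^2) evaluated at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# A weight far out in the tail of a narrow prior: exp(-5000) underflows
# to exactly 0.0 in double precision (smallest positive double ~ 5e-324).
w = 10.0
p = gaussian_pdf(w)
print(p)  # 0.0

# torch.log(0) returns -inf; Python's math.log(0) raises instead,
# so we emulate the tensor behavior here.
log_p = math.log(p) if p > 0.0 else float("-inf")
print(log_p)  # -inf
```

The same underflow happens sooner in float32, which is what the weight tensors use, so even moderately out-of-range weights can zero the pdf.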

@jordantkohn
Author

There are negative elements in w, which leads to zero elements in prior_pdf; taking the log of the prior_pdf tensor then produces -inf elements.
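A common remedy for this class of problem, sketched here with a hypothetical helper (not a patch to blitz itself), is to compute the log-density directly in log space instead of taking log(pdf): the quadratic term stays finite even where the density itself underflows to zero.

```python
import math

def gaussian_log_pdf(x, mu=0.0, sigma=0.1):
    # Log-density of N(mu, sigma^2) computed directly in log space.
    # Unlike log(pdf(x)), this never underflows to -inf for finite x.
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

# The same far-tail weight that made log(pdf) equal -inf above
# now yields a large but finite log-prior contribution.
w = 10.0
print(gaussian_log_pdf(w))  # about -4998.6, finite
```

For a two-Gaussian scale-mixture prior like the one blitz uses, the analogous trick is a log-sum-exp over the two components' log-densities rather than summing raw pdfs before the log.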
