The number of parameters is doubled #89
My current solution is:

```python
count = 0
for name, param in net.named_parameters():
    # Skip the duplicate copies registered inside each layer's sampler submodule.
    if ("sampler" not in name) and param.requires_grad:
        count += param.numel()
print(count)
```
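If this halves the count, the duplication is consistent with each layer registering its mu/rho tensors twice: once directly (e.g. `fc1.weight_mu`) and once inside a `*_sampler` submodule (e.g. `fc1.weight_sampler.mu`), as the parameter listing further down shows. A quick sanity check, as a sketch assuming `net` is such a model:

```python
# Total count includes each mu/rho twice (direct + inside the sampler).
total = sum(p.numel() for p in net.parameters())
# Filtered count keeps only the direct copies.
unique = sum(p.numel() for n, p in net.named_parameters() if "sampler" not in n)
assert total == 2 * unique  # holds if every parameter is duplicated via a sampler
```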
I am also wondering what these parameters mean, respectively. Thank you!
In Bayesian neural networks, each parameter ("weight" and "bias") is a random variable drawn from a distribution, which here is a Gaussian.
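For concreteness, here is a minimal standalone sketch (not this repository's exact code) of how such a Gaussian weight is typically sampled with the reparameterization trick, where `rho` is mapped to a positive standard deviation through a softplus:

```python
import torch
import torch.nn.functional as F

# Illustrative tensors: mu and rho jointly parameterize one weight matrix.
mu = torch.zeros(3, 3, requires_grad=True)
rho = torch.full((3, 3), -3.0, requires_grad=True)

sigma = F.softplus(rho)      # sigma = log(1 + exp(rho)) > 0
eps = torch.randn_like(mu)   # standard normal noise
w = mu + sigma * eps         # reparameterized sample; differentiable w.r.t. mu and rho
```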
Thanks for the reply! I knew that, but what I mean is: from what I have seen with BNNs in general, the weight distribution is rarely a perfect normal distribution centered at mu with scale sigma (it is usually more of a Gaussian mixture), yet here every weight distribution I obtain looks exactly like that. Is it variational inference that always gives perfect normal distributions?
In the paper "Weight Uncertainty in Neural Networks", the authors use a Gaussian variational posterior and a scale mixture prior.
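So each individual weight's variational posterior is a single Gaussian by construction; the mixture appears only in the prior, not in the posterior you are inspecting. As a sketch, the scale mixture prior's log density from that paper looks like this (the values of `pi`, `sigma1`, `sigma2` here are illustrative, not the paper's exact settings):

```python
import torch
from torch.distributions import Normal

def log_scale_mixture_prior(w, pi=0.5, sigma1=1.0, sigma2=0.0025):
    # log p(w), where p(w) = pi * N(w; 0, sigma1^2) + (1 - pi) * N(w; 0, sigma2^2)
    p1 = Normal(0.0, sigma1).log_prob(w).exp()
    p2 = Normal(0.0, sigma2).log_prob(w).exp()
    return torch.log(pi * p1 + (1 - pi) * p2).sum()
```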
Here is the simplest example.
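A minimal sketch of such an example (the exact original code is an assumption, inferred from the parameter names below), using a single blitz-style `BayesianLinear` layer:

```python
import torch.nn as nn
from blitz.modules import BayesianLinear  # assumed to be the layer in use

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # One Bayesian layer registers weight_mu / weight_rho (and bias_mu /
        # bias_rho) both directly and inside its *_sampler submodule.
        self.fc1 = BayesianLinear(2, 1)

    def forward(self, x):
        return self.fc1(x)

net = Net()
for name, _ in net.named_parameters():
    print(name)
```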
The output is:
The parameters are `fc1.weight_mu`, `fc1.weight_rho`, `fc1.bias_mu`, `fc1.bias_rho`, `fc1.weight_sampler.mu`, `fc1.weight_sampler.rho`, `fc1.bias_sampler.mu`, `fc1.bias_sampler.rho`, respectively, which is double what is expected.