
Enable the Bayesian layer to freeze the parameters to their mean values #8

Closed
Nebularaid2000 opened this issue Nov 26, 2021 · 2 comments

Comments

@Nebularaid2000

I think it would be good to provide an option to freeze the weights and biases to their mean values during inference.
The forward function would look something like this (attribute names follow the existing reparameterization layers):

def forward(self, input, sample=True):
    if sample:
        # sample weight and bias from the variational posterior
        # and forward as in the current code
        sigma_weight = torch.log1p(torch.exp(self.rho_weight))
        sigma_bias = torch.log1p(torch.exp(self.rho_bias))
        weight = self.mu_weight + sigma_weight * torch.randn_like(sigma_weight)
        bias = self.mu_bias + sigma_bias * torch.randn_like(sigma_bias)
        kl = self.kl_loss()
    else:
        # freeze the parameters to their mean values
        weight = self.mu_weight
        bias = self.mu_bias
        kl = 0  # (optional) kl is useless in this case
    out = F.linear(input, weight, bias)  # F = torch.nn.functional
    return out, kl
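
With such a flag, a hypothetical layer could then be used like this (the layer, x, and the sample keyword are illustrative, not part of the current API):

bayes_linear = LinearReparameterization(in_features=128, out_features=10)
out, kl = bayes_linear(x, sample=True)   # stochastic pass, as in the current code
out, _ = bayes_linear(x, sample=False)   # deterministic pass through the posterior means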
@ranganathkrishnan
Contributor

@Nebularaid2000 Thank you for using Bayesian-Torch and for the suggestion. The benefit of Bayesian layers is marginalization over the weight posterior to quantify uncertainty in predictions; freezing the weights to their mean values might negate the advantage of using a Bayesian NN. If the requirement is to avoid multiple stochastic forward passes, then the number of Monte Carlo samples during inference can be set to 1.
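
For example, a minimal sketch of that setting (model and x are placeholders for a Bayesian-Torch model that returns (output, kl) and an input batch):

import torch

num_monte_carlo = 1  # single stochastic forward pass
model.eval()
with torch.no_grad():
    outputs = [model(x)[0] for _ in range(num_monte_carlo)]
    pred = torch.stack(outputs).mean(dim=0)  # mean over MC samples; trivial when num_monte_carlo == 1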

@Nebularaid2000
Author

Thank you for the reply! This makes sense.
