I think it would be good to provide an option to freeze the weights and biases to their mean values during inference.
The forward function would look something like this:
```python
def forward(self, input, sample=True):
    if sample:
        # do sampling and forward as in the current code
        ...
    else:
        # use the posterior means directly
        weight = self.mu_weight
        bias = self.mu_bias
        # (optional) set kl=0, since it is unused in this case
        kl = 0
    return out, kl
```
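To make the proposal concrete, here is a minimal, stdlib-only sketch of a reparameterized Bayesian linear layer with the requested `sample` flag. The class name, the `rho` parameterization (sigma = softplus(rho)), and the placeholder KL value are illustrative assumptions, not the actual bayesian-torch implementation; the real layers operate on torch tensors and compute the KL divergence properly.

```python
import math
import random

class BayesianLinear:
    """Toy Bayesian linear layer (illustrative sketch, not bayesian-torch)."""

    def __init__(self, in_features, out_features, seed=0):
        rng = random.Random(seed)
        # Variational parameters: mean (mu) and rho, with sigma = softplus(rho).
        self.mu_weight = [[rng.gauss(0.0, 0.1) for _ in range(in_features)]
                          for _ in range(out_features)]
        self.rho_weight = [[-3.0] * in_features for _ in range(out_features)]
        self.mu_bias = [0.0] * out_features
        self.rho_bias = [-3.0] * out_features
        self._rng = rng

    @staticmethod
    def _softplus(x):
        return math.log1p(math.exp(x))

    def forward(self, x, sample=True):
        if sample:
            # Reparameterization trick: w = mu + sigma * eps, eps ~ N(0, 1).
            weight = [[mu + self._softplus(rho) * self._rng.gauss(0.0, 1.0)
                       for mu, rho in zip(mrow, rrow)]
                      for mrow, rrow in zip(self.mu_weight, self.rho_weight)]
            bias = [mu + self._softplus(rho) * self._rng.gauss(0.0, 1.0)
                    for mu, rho in zip(self.mu_bias, self.rho_bias)]
            kl = 1.0  # placeholder; a real layer computes KL(q(w) || p(w))
        else:
            # Frozen mode requested above: use the posterior means directly.
            weight, bias = self.mu_weight, self.mu_bias
            kl = 0.0  # KL is unused when weights are frozen to the mean
        # Plain affine transform: out_j = sum_i w_ji * x_i + b_j
        out = [sum(w * xi for w, xi in zip(wrow, x)) + b
               for wrow, b in zip(weight, bias)]
        return out, kl
```

With `sample=False` two calls on the same input return identical outputs, which is the deterministic behavior the option is meant to provide.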
@Nebularaid2000 Thank you for using Bayesian-Torch and for the suggestion. The benefit of Bayesian layers is marginalization over the weight posterior to quantify uncertainty in predictions; freezing the weights to their mean values might not bring any advantage over using a Bayesian NN in the first place. If the requirement is to avoid multiple stochastic forward passes, the number of Monte Carlo samples during inference can be set to 1.
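The marginalization the reply refers to can be sketched as a Monte Carlo loop over stochastic forward passes; setting the sample count to 1 reduces it to a single pass. The `stochastic_forward` function below is a hypothetical stand-in for one pass through a Bayesian model, not a bayesian-torch API.

```python
import random
import statistics

def stochastic_forward(x, rng):
    # Hypothetical stand-in for one stochastic pass through a Bayesian model:
    # a fixed mean function plus weight-induced noise.
    return 2.0 * x + rng.gauss(0.0, 0.1)

rng = random.Random(0)
num_monte_carlo = 20  # set to 1 to get a single stochastic pass
preds = [stochastic_forward(1.5, rng) for _ in range(num_monte_carlo)]

mean_pred = statistics.mean(preds)   # predictive mean (marginalized estimate)
std_pred = statistics.stdev(preds)   # spread across samples = uncertainty
```

Averaging over samples approximates the posterior predictive mean, while the spread of the samples is exactly the uncertainty signal that a mean-frozen forward pass would discard.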