IntelLabs / bayesian-torch

A library for Bayesian neural network layers and uncertainty estimation in deep learning, extending the core of PyTorch

Enable the Bayesian layer to freeze the parameters to their mean values

Nebularaid2000 opened this issue

I think it would be good to provide an option to freeze the weights and biases to their mean values during inference.
The forward function would look something like this:

def forward(self, input, sample=True):
    if sample:
        # sample the weights/biases and forward as in the current code
        ...  # existing reparameterization path
    else:
        # freeze to the posterior means for a deterministic pass
        # (assumes F is torch.nn.functional, as used elsewhere in the layer)
        out = F.linear(input, self.mu_weight, self.mu_bias)
        kl = torch.zeros(1, device=input.device)  # optional: KL is unused in this case
    return out, kl
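
For reference, here is a minimal sketch of the same idea as a workaround under the current API. It assumes the layer exposes mu_weight and mu_bias parameters, as LinearReparameterization does; the deterministic call is my own suggestion, not part of the library:

import torch
import torch.nn.functional as F
from bayesian_torch.layers import LinearReparameterization

layer = LinearReparameterization(in_features=10, out_features=2)
x = torch.randn(4, 10)

# Deterministic pass using only the posterior means, bypassing sampling
with torch.no_grad():
    out_mean = F.linear(x, layer.mu_weight, layer.mu_bias)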

@Nebularaid2000 Thank you for using Bayesian-Torch and for the suggestion. The benefit of Bayesian layers is marginalization over the weight posterior to quantify uncertainty in predictions, so freezing the weights to their mean values might forfeit the advantage of using a Bayesian NN. If the requirement is to avoid multiple stochastic forward passes, the number of Monte Carlo samples during inference can be set to 1.
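
For example, a minimal sketch of Monte Carlo inference on the user side, assuming a LinearReparameterization layer whose forward returns (out, kl); setting num_monte_carlo to 1 gives a single stochastic forward pass:

import torch
from bayesian_torch.layers import LinearReparameterization

layer = LinearReparameterization(in_features=10, out_features=2)
x = torch.randn(4, 10)

num_monte_carlo = 1  # one stochastic pass; increase to quantify uncertainty
mc_outputs = []
for _ in range(num_monte_carlo):
    out, kl = layer(x)
    mc_outputs.append(out)
pred = torch.stack(mc_outputs).mean(dim=0)  # predictive mean over MC samples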

Thank you for the reply! This makes sense.