ExplainableML / BayesCap

(ECCV 2022) BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks

The "target" term in equation 10

huangchaoxing opened this issue · comments

Hi,
When reading the code of TempCombLoss(), I found that you simply pass the same target to both L1 and L2.
However, it seems the identity-mapping term and the negative log-likelihood term use different targets in Equation 10. If I understand correctly, L1 should use the tensor generated by the frozen model as its target, while L2 should use the reference from the dataset as its target. In the paper, two different y terms appear as targets in Equation 10 (ŷ and y).
Could you help explain why the implementation uses the same target for both? Correct me if my interpretation is wrong.

l1 = self.L_l1(mean, target)
l2 = self.L_GenGauss(mean, one_over_alpha, beta, target)
l = T1*l1 + T2*l2

Thank you very much.

Hey @huangchaoxing,

Thanks for pointing this out. I think this version of losses.py is still from an exploratory phase and is therefore not the correct one.
Indeed, the signature of the loss function should be like

TempCombLoss.forward(
mean: Tensor, one_over_alpha: Tensor, beta: Tensor, target_1: Tensor, target_2: Tensor, 
T1: float, T2: float
)

where target_1 is for self.L_l1 (the identity term) and target_2 is for self.L_GenGauss (the NLL term), as you pointed out.
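For concreteness, a minimal sketch of what the corrected loss could look like. The generalized-Gaussian NLL parameterization below (with scale passed as one_over_alpha and shape beta) is an assumption reconstructed from the standard form of the distribution, not necessarily the repository's exact formula; T1 and T2 are the temperature weights on the two terms.

```python
import torch
import torch.nn as nn


class TempCombLoss(nn.Module):
    """Sketch of the corrected combined loss with two separate targets.

    target_1 feeds the identity (L1) term, i.e. the frozen model's output;
    target_2 feeds the generalized-Gaussian NLL term, i.e. the dataset
    reference. The NLL form here is an assumed standard parameterization.
    """

    def __init__(self):
        super().__init__()
        self.L_l1 = nn.L1Loss()

    def L_GenGauss(self, mean, one_over_alpha, beta, target):
        # NLL of a generalized Gaussian (up to constants):
        # (|y - mu| / alpha)^beta - log(beta / alpha) + log Gamma(1 / beta)
        resi = (torch.abs(mean - target) * one_over_alpha) ** beta
        nll = (
            resi
            - torch.log(one_over_alpha)
            - torch.log(beta)
            + torch.lgamma(1.0 / beta)
        )
        return nll.mean()

    def forward(self, mean, one_over_alpha, beta, target_1, target_2, T1, T2):
        l1 = self.L_l1(mean, target_1)                                # identity term
        l2 = self.L_GenGauss(mean, one_over_alpha, beta, target_2)    # NLL term
        return T1 * l1 + T2 * l2
```

With beta = 2 and one_over_alpha = 1 the NLL term reduces to the familiar squared-residual Gaussian case up to constants, which is a quick sanity check on the parameterization.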

Would you like to make a pull request? :)
I will be happy to accept it.

(If not, I will update the file soon).

Thanks!