Loss Function Compute
zhangyanide opened this issue · comments
zhangyanide commented
In the definition of Gaussian2DLikelihood, you evaluate the probability density function and then compute `result = -torch.log(torch.clamp(result, min=epsilon))`. When the density value is greater than 1, the log is positive, so this term is negative and the loss can go below 0. I thought a probability value should lie between 0 and 1, so the cross entropy should be > 0. Is that right? Looking forward to your reply.
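For what it's worth, a probability *density* (unlike a probability) can exceed 1, which is exactly why the negative log-likelihood can be negative. A minimal sketch with the standard bivariate Gaussian PDF (not the repo's exact code; the function name and parameters here are just illustrative):

```python
import math

def gaussian2d_density(x, y, mux, muy, sx, sy, rho):
    # Standard bivariate Gaussian probability density function
    zx = (x - mux) / sx
    zy = (y - muy) / sy
    z = zx ** 2 + zy ** 2 - 2 * rho * zx * zy
    denom = 2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2)
    return math.exp(-z / (2 * (1 - rho ** 2))) / denom

# With small standard deviations, the density at the mean exceeds 1,
# so the negative log-likelihood at that point is negative.
d = gaussian2d_density(0.0, 0.0, 0.0, 0.0, 0.1, 0.1, 0.0)  # ~15.9
loss = -math.log(d)  # < 0
```

So a negative loss is not a bug by itself; it just means the model assigns a high density to the observed point.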
yuansheng commented
Computing a log-loss for a continuous variable is confusing to me as well, but that is indeed what the original paper does, so I don't fully understand it either... Besides, the value of a probability density function at a single point doesn't really have any meaning on its own.
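One way to see why the pointwise density value has no absolute meaning, yet negative log-likelihood still works as a training objective: rescaling the data only shifts the NLL by a constant, so gradients and comparisons between models are unaffected. A minimal 1D sketch (my own illustration, not code from this repo):

```python
import math

def nll_1d(x, mu, sigma):
    # Negative log-density of a 1D Gaussian at point x
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + (x - mu) ** 2 / (2 * sigma ** 2)

# The same observation expressed in two different units
# (e.g. metres vs millimetres, scale factor 1000):
a = nll_1d(1.0, 0.0, 0.5)
b = nll_1d(1000.0, 0.0, 500.0)

# The NLL shifts by exactly log(1000); relative differences are preserved.
shift = b - a
```

So the sign and magnitude of a single NLL value are unit-dependent, but minimizing it is still a well-defined objective.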