quancore / social-lstm

Social LSTM implementation in PyTorch

Loss Function Computation

zhangyanide opened this issue · comments

In the definition of `Gaussian2DLikelihood`, you evaluate the density function and then compute `result = -torch.log(torch.clamp(result, min=epsilon))`. When the density value is > 1, this term is < 0, so the loss becomes negative. I thought a probability should be between 0 and 1, so the cross entropy should be > 0. Is that right? Looking forward to your reply.
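For illustration, here is a minimal sketch of a bivariate Gaussian negative log-likelihood in the spirit of the formulation discussed above (the function name, signature, and parameter names are illustrative, not the repo's exact code). It shows why the loss can legitimately be negative: a probability *density* is not bounded by 1, so `-log(density)` can drop below zero when the predicted Gaussian is very narrow around the target.

```python
import math
import torch

def gaussian_2d_nll(targets, mux, muy, sx, sy, corr, epsilon=1e-20):
    # Bivariate Gaussian negative log-likelihood (sketch, not the repo's exact code).
    normx = targets[..., 0] - mux
    normy = targets[..., 1] - muy
    sxsy = sx * sy
    z = (normx / sx) ** 2 + (normy / sy) ** 2 - 2 * corr * normx * normy / sxsy
    neg_rho = 1 - corr ** 2
    # Density value: this can exceed 1 when sx and sy are small (a narrow Gaussian),
    # which is exactly why -log(density) can be negative.
    density = torch.exp(-z / (2 * neg_rho)) / (2 * math.pi * sxsy * torch.sqrt(neg_rho))
    return -torch.log(torch.clamp(density, min=epsilon)).mean()

# Example: a very narrow Gaussian centered on the target gives density > 1,
# so the NLL is negative -- valid for densities, unlike discrete probabilities.
t = torch.tensor([[0.0, 0.0]])
loss = gaussian_2d_nll(t, torch.tensor(0.0), torch.tensor(0.0),
                       torch.tensor(0.05), torch.tensor(0.05), torch.tensor(0.0))
print(loss)  # approximately -4.15
```

So a negative loss here does not indicate a bug by itself; minimizing the NLL of a continuous density is still a valid training objective even though its value is unbounded below as the predicted variance shrinks.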

Computing a log-loss on a continuous variable is indeed confusing, but that is also what the original paper does, so I don't fully understand it either... Besides, the value of a probability density function at a single point doesn't carry much meaning on its own.