bug?
interestingzhuo opened this issue
ZhuoL commented
loss = self.soft_plus(torch.logsumexp(logit_n, dim=0) + torch.logsumexp(logit_p, dim=0))
Shouldn't the '+' in this line be a '*'? It seems to differ from the description in the original paper.
TinyZeaMays commented
There is a log in torch.logsumexp.
log(a * b) = log(a) + log(b)
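The identity above is why the code can add the two log-sum-exp terms instead of multiplying the underlying sums. A minimal check of this, using a NumPy stand-in for torch.logsumexp (the array values here are made-up examples):

```python
import numpy as np

def logsumexp(x):
    # Numerically stable log(sum(exp(x))): subtract the max first.
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

a = np.array([0.5, -1.2, 2.0])  # stand-in for logit_n
b = np.array([1.1, 0.3])        # stand-in for logit_p

# logsumexp(a) + logsumexp(b) == log( sum(exp(a)) * sum(exp(b)) )
lhs = logsumexp(a) + logsumexp(b)
rhs = np.log(np.exp(a).sum() * np.exp(b).sum())
assert np.isclose(lhs, rhs)
```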
Mai Đức Hải commented
@TinyZeaMays, according to the paper, the Circle loss is "log(1 + exp(logit_p) * exp(logit_n))".
I can't see the 1 in your final formula.
ZhuoL commented
> @TinyZeaMays, according to the paper, the Circle loss is "log(1 + exp(logit_p) * exp(logit_n))".
> I can't see the 1 in your final formula.
self.soft_plus(x) is log(1 + exp(x)), so the 1 comes from the softplus.
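Putting the two replies together: softplus applied to the sum of the two log-sum-exp terms reproduces the paper's formula, including the 1. A small sketch with NumPy stand-ins for the torch functions (the logit values are made-up examples):

```python
import numpy as np

def softplus(x):
    # softplus(x) = log(1 + exp(x)); log1p is the stable form for small exp(x)
    return np.log1p(np.exp(x))

def logsumexp(x):
    # Numerically stable log(sum(exp(x)))
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

logit_n = np.array([0.4, -0.7, 1.3])  # example negative-pair logits
logit_p = np.array([-0.2, 0.9])       # example positive-pair logits

# The code's form: softplus(logsumexp(logit_n) + logsumexp(logit_p))
loss_code = softplus(logsumexp(logit_n) + logsumexp(logit_p))

# The paper's form: log(1 + sum(exp(logit_n)) * sum(exp(logit_p)))
loss_paper = np.log(1.0 + np.exp(logit_n).sum() * np.exp(logit_p).sum())

assert np.isclose(loss_code, loss_paper)
```

So the '+' inside softplus is the log-domain version of the '*' in the paper, and softplus itself supplies the '1 +'.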
Mai Đức Hải commented
@interestingzhuo many thanks 👍