TinyZeaMays / CircleLoss

PyTorch implementation of the paper "Circle Loss: A Unified Perspective of Pair Similarity Optimization"

bug?

interestingzhuo opened this issue · comments

commented

loss = self.soft_plus(torch.logsumexp(logit_n, dim=0) + torch.logsumexp(logit_p, dim=0))
Shouldn't the '+' in this line be '*'? It seems inconsistent with the description in the original paper.

There is already a log inside torch.logsumexp:
log(a * b) = log(a) + log(b)
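As a quick sanity check of that identity (a minimal sketch; the `logit_p` / `logit_n` values below are made up for illustration):

```python
import math

# Toy logits standing in for the tensors in the snippet above (made-up values).
logit_p = [0.5, -1.2, 2.0]
logit_n = [1.1, 0.3, -0.7]

def logsumexp(xs):
    # log(sum(exp(x_i))), computed stably by factoring out the max
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# log(a) + log(b) == log(a * b), with a = sum(exp(logit_p)), b = sum(exp(logit_n))
lhs = logsumexp(logit_p) + logsumexp(logit_n)
rhs = math.log(sum(math.exp(x) for x in logit_p) * sum(math.exp(x) for x in logit_n))
assert math.isclose(lhs, rhs)
```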

@TinyZeaMays , according to the paper, the Circle loss is "log(1 + exp(logit_p) * exp(logit_n))".
I can't see the 1 in your final formula.

commented

self.soft_plus(x) is log(1 + exp(x)), which is where the 1 comes from.
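Putting the two identities together: softplus applied to the sum of the two logsumexp terms recovers exactly log(1 + Σexp(logit_n) · Σexp(logit_p)). A minimal numeric check (the logit values are made up for illustration):

```python
import math

def softplus(x):
    # softplus(x) = log(1 + exp(x)); log1p for accuracy near 0
    return math.log1p(math.exp(x))

def logsumexp(xs):
    # log(sum(exp(x_i))), computed stably by factoring out the max
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Made-up logits standing in for logit_p / logit_n in the snippet above.
logit_p = [0.5, -1.2, 2.0]
logit_n = [1.1, 0.3, -0.7]

# The form used in the repo: softplus(logsumexp(logit_n) + logsumexp(logit_p))
loss = softplus(logsumexp(logit_n) + logsumexp(logit_p))

# The form in the paper: log(1 + sum(exp(logit_n)) * sum(exp(logit_p)))
direct = math.log(1 + sum(math.exp(x) for x in logit_n)
                    * sum(math.exp(x) for x in logit_p))
assert math.isclose(loss, direct)
```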