TengdaHan / DPC

Video Representation Learning by Dense Predictive Coding. Tengda Han, Weidi Xie, Andrew Zisserman.


About the Contrastive Loss

lixiangyu-1008 opened this issue

Hi @TengdaHan,
Thanks for the great work!

I haven't found the NCE loss in your source code. Could you please show me where the NCE loss is implemented? :)

commented

Check

global criterion; criterion = nn.CrossEntropyLoss()

DPC/dpc/main.py

Line 216 in ac25b1b

loss = criterion(score_flattened, target_flattened)

Thanks for your reply.
This is a CrossEntropy loss; where is the dot product implemented, as described in Eq. (5) of the paper?

commented

That's exactly the InfoNCE loss.

Some explanation:
score_flattened is a square matrix containing the dot products of all Pred-GT pairs, with one positive pair in each row;
target_flattened holds the indices of the positive pairs, which lie on the diagonal in this case.
CrossEntropyLoss computes the negative LogSoftmax of the target class, which is exactly the InfoNCE loss.

Simple and beautiful, isn't it?
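For anyone reading along, here is a minimal PyTorch sketch of the equivalence described above (not the repo's exact code; the shapes and variable names are illustrative):

```python
# Sketch: InfoNCE implemented via nn.CrossEntropyLoss over a score matrix.
import torch
import torch.nn as nn

B, D = 8, 128                  # hypothetical batch size and feature dim
pred = torch.randn(B, D)       # predicted features
gt = torch.randn(B, D)         # ground-truth features

# Dot products of all Pred-GT pairs: score[i, j] = pred[i] . gt[j]
score_flattened = pred @ gt.t()        # (B, B) square matrix
target_flattened = torch.arange(B)     # positive pairs lie on the diagonal

criterion = nn.CrossEntropyLoss()
loss = criterion(score_flattened, target_flattened)

# Equivalent InfoNCE form: negative log-softmax of the positive pair, per row
manual = -torch.log_softmax(score_flattened, dim=1).diag().mean()
assert torch.allclose(loss, manual, atol=1e-6)
```

Each row is a classification problem over B candidates where the correct "class" is the matching ground-truth feature, so CrossEntropyLoss on the diagonal targets reproduces the InfoNCE objective.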

@WeidiXie @TengdaHan
Thanks for your prompt reply!
I think I understand it now!