target_label is always zero
JiwenZ opened this issue · comments
```python
self.register_buffer(
    'target_label',
    torch.zeros(self.train_args.per_device_train_batch_size, dtype=torch.long)
)
```
I don't think `target_label` is assigned any value after initialization, so the labels are always zero. Please correct me if I'm wrong.
I got it: that is how the contrastive loss works. The score matrix is arranged so that each example's positive candidate sits at index 0, so a target label of zero is correct.
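For anyone else hitting this, here is a minimal sketch (not the repo's exact code) of why all-zero targets are correct. It assumes the common in-batch-negatives layout where scores are reshaped so that column 0 of each row is the query's positive passage; cross-entropy with a zero label then pushes that first column's score up relative to the negatives.

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes for illustration only.
batch_size, num_candidates = 4, 8

# Assumed layout: column 0 of each row is the positive passage's score,
# the remaining columns are in-batch negatives.
scores = torch.randn(batch_size, num_candidates)
scores[:, 0] += 5.0  # pretend the model scores positives highest

# The buffer from the issue: a fixed vector of zeros, one per example.
target_label = torch.zeros(batch_size, dtype=torch.long)

# Cross-entropy with label 0 maximizes the score in column 0,
# i.e. the positive, relative to all negatives in the row.
loss = F.cross_entropy(scores, target_label)
```

So the labels never need updating: "zero" simply names the position of the positive candidate, not a class that changes per example.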