Newbeeer / L_DMI

Code for the NeurIPS 2019 paper, "L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise"

Negative loss values

EricArazo opened this issue

Hello!

Is it normal for the L_DMI loss to take negative values? If so, do you have any intuition about how this affects backpropagation when the loss is around zero? I am seeing some instabilities in that regime (with 60% uniform random noise on CIFAR-10, where 60% means that 60% of the labels are actually incorrect, not just randomly reassigned).
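
For concreteness, this is roughly the noise model I mean (a minimal hypothetical sketch, not my exact code; the helper name `add_symmetric_noise` is made up): each corrupted label is flipped to a class *other than* the true one, so a noise rate of 0.6 really means 60% wrong labels.

```python
import numpy as np

def add_symmetric_noise(labels, noise_rate=0.6, num_classes=10, seed=0):
    """Flip `noise_rate` of the labels to a *different* class, chosen uniformly."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels).copy()
    n = len(labels)
    # Pick which samples to corrupt.
    idx = rng.choice(n, size=int(noise_rate * n), replace=False)
    for i in idx:
        # Shift by 1..num_classes-1 so the new label is guaranteed to be wrong.
        offset = rng.integers(1, num_classes)
        labels[i] = (labels[i] + offset) % num_classes
    return labels
```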

Thanks in advance.
Best,
Eric

Hi,

Very sorry for the long delay. Yes, it is normal to see a negative loss, since abs(det(mat)) > 1 with very high probability. What batch size did you use? Too small a batch size can cause the loss values to hover around zero.
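
For reference, here is a minimal sketch of an L_DMI-style loss in PyTorch (assuming 10 classes; the helper name `dmi_loss` and the `1e-3` stabilizer are illustrative, not necessarily the repo's exact implementation). Because `mat` is built from unnormalized counts over the batch, abs(det(mat)) often exceeds 1, so the negative log makes the loss negative:

```python
import torch
import torch.nn.functional as F

def dmi_loss(logits, target, num_classes=10):
    """L_DMI-style loss: -log|det(joint matrix of labels vs. predictions)|."""
    probs = F.softmax(logits, dim=1)                   # (batch, C)
    y_onehot = F.one_hot(target, num_classes).float()  # (batch, C)
    # Unnormalized empirical joint matrix between labels and predictions: (C, C).
    # Its entries are counts, so |det(mat)| is often > 1 and the loss goes negative.
    mat = y_onehot.t() @ probs
    # Small constant guards against log(0) when the matrix is near-singular,
    # e.g. when the batch is too small to cover every class.
    return -torch.log(torch.abs(torch.det(mat)) + 1e-3)

# Usage sketch:
logits = torch.randn(128, 10)
target = torch.randint(0, 10, (128,))
loss = dmi_loss(logits, target)  # can be < 0; that is expected
```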

We have experimental results for your setting (please refer to Appendix D in our paper): CIFAR-10 with class-independent noise at noise amount r = 0.6.

Best,
Yilun

Hello Yilun,

No worries and thank you for your answer! I will take a look at the appendix.

Best,
Eric

Hi Eric,

The experimental results in that setting are not so promising. I guess that, for some reason, neither the baselines nor our method improved on the plain cross-entropy loss ...

Yilun