ZJULearning / RMI

This is the code for the NeurIPS 2019 paper Region Mutual Information Loss for Semantic Segmentation.

The RMI loss produces negative loss values

koda12344505 opened this issue · comments

Hello,
First of all, thanks for this work.

I'm currently working with this RMI loss in my own segmentation toolbox, but I found that it produces negative loss values.

I just copied all the code from rmi.py and rmi_utils.py, then used this RMI loss in place of the cross entropy loss.

Is it normal for the RMI loss to produce negative values at the beginning of training?

Thank you

I have met the same problem.

This phenomenon is normal: -log det(A) can indeed take a negative value when A is a positive definite matrix. You can also check the value of det(A) by printing it.
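
For illustration only (not part of the repository's code; assumes a standard PyTorch install), a quick check of the sign of -log det(A) for two positive definite matrices:

```python
# Quick check: the sign of -log det(A) for a positive definite A
# depends on whether det(A) is above or below 1.
import torch

a_large = torch.diag(torch.tensor([2.0, 2.0, 2.0]))   # det = 8     > 1
a_small = torch.diag(torch.tensor([0.1, 0.1, 0.1]))   # det = 0.001 < 1

print(-torch.logdet(a_large))  # -log(8)     ≈ -2.08  (negative)
print(-torch.logdet(a_small))  # -log(0.001) ≈  6.91  (positive)
```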

The sign of the loss does not matter; the gradient matters.
If you add -100 to the normal cross entropy loss, your loss becomes a negative value, but this does not change the training process or the results.
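
As a minimal sketch of that point (not from the repo; the setup below is just a toy example), shifting a loss by a constant leaves its gradient unchanged:

```python
# Minimal sketch: a constant offset changes the loss value but not its gradient.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3, requires_grad=True)
target = torch.tensor([0, 2, 1, 0])

loss = F.cross_entropy(logits, target)
(grad,) = torch.autograd.grad(loss, logits)

logits2 = logits.detach().clone().requires_grad_(True)
shifted = F.cross_entropy(logits2, target) - 100.0   # loss value is now negative
(grad_shifted,) = torch.autograd.grad(shifted, logits2)

print(torch.allclose(grad, grad_shifted))  # True: identical gradients
```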

“-log det(A) can indeed take a negative value when A is a positive definite matrix.” I can't understand this sentence. For example, if A = Diag(0.1, 0.1, 0.1), then A is a positive definite matrix, but det(A) = 0.001 and -log(det(A)) is a positive value.