ZJULearning / RMI

This is the code for the NeurIPS 2019 paper Region Mutual Information Loss for Semantic Segmentation.


How to weight some examples if loss may be negative

shkarupa-alex opened this issue · comments

In most cases, when we have a loss >= 0 with shape [batch_size] and we want to weight up the importance of some examples, we multiply the loss by the weights. E.g. loss = [0.1, 0.3], weights = [2., 1.], weighted_loss = [0.2, 0.3]

But how should we do that for the RMI loss, which may be negative?
E.g. loss = [-0.1, -0.3], weights = [2., 1.], weighted_loss = [-0.2, -0.3]. Here the weighted loss for the up-weighted example becomes smaller instead of larger, as expected.

Should we multiply the loss by the weights, or divide?

Well, I guess we can find a lower bound of RMI, e.g., -1000, subtract that bound from the loss so it becomes non-negative, and then apply the re-weighting.
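A minimal sketch of that workaround, assuming a hand-picked lower bound of -1000 (the bound value and the function name `weighted_shifted_loss` are illustrative, not part of this repo):

```python
import numpy as np

# Assumed lower bound on the per-example RMI loss; any constant the
# loss never falls below works.
LOWER_BOUND = -1000.0

def weighted_shifted_loss(loss, weights):
    """Weight per-example losses that may be negative.

    loss, weights: 1-D arrays of shape [batch_size].
    Shifting by -LOWER_BOUND makes every entry >= 0, so a larger
    weight always makes that example's contribution larger.
    """
    shifted = loss - LOWER_BOUND   # now >= 0 for every example
    weighted = shifted * weights
    return weighted.mean()

loss = np.array([-0.1, -0.3])
weights = np.array([2.0, 1.0])
print(weighted_shifted_loss(loss, weights))  # larger than with uniform weights
```

Note that the added constant does not depend on the model parameters, so the gradient of the weighted loss is unchanged; only the reported loss value is shifted.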

Sorry for the late reply.
This repo is old and I am busy with other projects, so it will not be maintained in the future.