Lavender105 / DFF

Code for Dynamic Feature Fusion for Semantic Edge Detection https://arxiv.org/abs/1902.09104


Question about your loss function

lijing1996 opened this issue

Since the input to your loss function is the raw network output without a sigmoid applied, I find it difficult to understand your code. What are max_val and log_weight here? Is this different from the loss function in CASENet? Could you give me a brief explanation? Thanks a lot.

import torch
from torch.nn.modules.loss import _Loss

class WeightedCrossEntropyWithLogits(_Loss):
    def __init__(self, weight=None, size_average=None, reduce=None, reduction='elementwise_mean'):
        super(WeightedCrossEntropyWithLogits, self).__init__(size_average, reduce, reduction)

    def forward(self, inputs, targets):
        loss_total = 0
        for i in range(targets.size(0)):  # iterate over the batch
            pred = inputs[i]
            target = targets[i]
            pad_mask = target[0, :, :]    # channel 0 marks valid (non-padded) pixels
            target = target[1:, :, :]     # remaining channels are per-class edge labels
            target_nopad = torch.mul(target, pad_mask)  # zero out the padding area
            num_pos = torch.sum(target_nopad)  # number of positive (edge) pixels
            num_total = torch.sum(pad_mask)    # number of valid pixels
            num_neg = num_total - num_pos
            pos_weight = (num_neg / num_pos).clamp(min=1, max=num_total)  # compute a pos_weight for each image
            # numerically stable weighted BCE-with-logits (log-sum-exp trick)
            max_val = (-pred).clamp(min=0)
            log_weight = 1 + (pos_weight - 1) * target
            loss = pred - pred * target + log_weight * (max_val + ((-max_val).exp() + (-pred - max_val).exp()).log())
            loss = loss * pad_mask        # ignore padded pixels
            loss = loss.mean()
            loss_total = loss_total + loss
        loss_total = loss_total / targets.size(0)
        return loss_total

Hi, @lijing1996
Any idea about this?
I have the same question. The loss function is quite hard for me to understand...

Hi, @lijing1996
I guess this code was adapted from torch.nn.functional.binary_cross_entropy_with_logits, which uses the log-sum-exp trick for numerical stability.
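
To make that concrete, here is a small check (my own sketch, not from the repo): the manual formula in the snippet above should match torch.nn.functional.binary_cross_entropy_with_logits with a pos_weight argument, since both compute the same weighted loss via the log-sum-exp trick. The pos_weight value below is just an example; in the repo it is num_neg / num_pos computed per image.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
pred = torch.randn(4, 8, 8)                      # raw logits, no sigmoid applied
target = (torch.rand(4, 8, 8) > 0.9).float()     # sparse positives, like edge maps
pos_weight = torch.tensor(5.0)                   # example value only

# Manual formula from the snippet above (log-sum-exp trick for stability)
max_val = (-pred).clamp(min=0)
log_weight = 1 + (pos_weight - 1) * target
manual = pred - pred * target + log_weight * (max_val + ((-max_val).exp() + (-pred - max_val).exp()).log())

# PyTorch's built-in equivalent
builtin = F.binary_cross_entropy_with_logits(pred, target, pos_weight=pos_weight, reduction='none')

print(torch.allclose(manual, builtin, atol=1e-5))  # expect True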

Hi all, in this loss function, do you know what the pad_mask variable is? I thought the target would have one channel per class, whereas this function seems to expect an additional pad_mask channel.

Thank you!
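
From the slicing in forward() (pad_mask = target[0], class labels = target[1:]) and the "zero out the padding area" comment, pad_mask appears to be an extra leading channel that marks valid, non-padded pixels, so the loss is zeroed wherever the image was padded to a fixed size. A minimal sketch of how such a target might be assembled; the names valid_mask and edge_labels, the shapes, and the crop size are my own assumptions, not the repo's data pipeline:

import torch

num_classes = 19                      # assumed number of semantic classes
H, W = 472, 472                       # assumed fixed crop size after padding

# Per-class binary edge labels, shape (num_classes, H, W)
edge_labels = torch.zeros(num_classes, H, W)

# valid_mask is 1 where the image has real content and 0 in the padded border
valid_mask = torch.zeros(1, H, W)
valid_mask[:, :400, :350] = 1         # pretend the original image was 400x350

# Target passed to the loss: channel 0 = pad mask, channels 1..K = class edges
target = torch.cat([valid_mask, edge_labels], dim=0)  # shape (1 + num_classes, H, W)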