CoinCheung / pytorch-loss

label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful


How do I design my loss function like SmoothL1?

kelisiya opened this issue · comments

I want to design a loss function that uses L2 loss when the GT is 0 or 1, and L1 loss when the GT is in (0, 1).
How can I implement this with PyTorch?
Thanks

I have not seen a loss like that; maybe you can implement it directly with PyTorch operators.

Maybe you can try torch.where to switch between the two losses according to the input conditions.
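A minimal sketch of that torch.where approach, assuming the question means switching on whether the GT value is exactly 0 or 1 (the function name and that condition are illustrative, not from the repo):

```python
import torch

def mixed_l1_l2(pred, target):
    # Hypothetical sketch: squared (L2) error where the ground truth
    # is exactly 0 or 1, absolute (L1) error everywhere else.
    is_endpoint = (target == 0) | (target == 1)
    diff = pred - target
    # torch.where selects elementwise between the two loss branches.
    loss = torch.where(is_endpoint, diff ** 2, diff.abs())
    return loss.mean()
```

Since torch.where only routes values, autograd flows through whichever branch was selected for each element.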

what do you mean by "del 0 or 1" ?

As far as I know, we do not need to consider single points when we compute a loss in practice. You need a continuous loss function to compute gradients; a loss function with singular points where the gradient cannot be computed is not meaningful.
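For comparison, PyTorch's built-in SmoothL1 switches between L2 and L1 based on the error magnitude rather than on single GT values, so the two branches (and their gradients) meet continuously at the threshold:

```python
import torch
import torch.nn.functional as F

x = torch.tensor([0.5, 2.0], requires_grad=True)
target = torch.zeros(2)

# SmoothL1 (beta = 1): 0.5 * e**2 where |e| < 1, |e| - 0.5 otherwise.
# At |e| = 1 both branches equal 0.5 and both gradients equal 1,
# so autograd never hits a point where the gradient is undefined.
loss = F.smooth_l1_loss(x, target)
loss.backward()
```

Here loss = (0.5 * 0.5**2 + (2.0 - 0.5)) / 2 = 0.8125, and the gradients are 0.25 and 0.5 (the branch derivatives divided by the batch size).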