collin-burns / pytorch-loss

label-smooth, amsoftmax, focal-loss, triplet-loss. Maybe useful


pytorch-loss

My implementations of label-smooth, amsoftmax, focal-loss, dual-focal-loss, triplet-loss, giou-loss, affinity-loss, pc_softmax_cross_entropy, and dice-loss (both generalized soft dice loss and batch soft dice loss). Maybe these will be useful in my future work.
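
To illustrate the idea behind one of these losses, here is a minimal sketch of label-smoothed cross entropy, which mixes the one-hot target with a uniform distribution over classes. This is not the repo's actual API; the class name `LabelSmoothCE` and its `smooth` argument are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothCE(nn.Module):
    # Hypothetical sketch of label-smoothed cross entropy; the repo's real
    # class names and arguments may differ.
    def __init__(self, smooth: float = 0.1):
        super().__init__()
        self.smooth = smooth

    def forward(self, logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # logits: (N, C), labels: (N,) integer class indices
        num_classes = logits.size(1)
        log_probs = F.log_softmax(logits, dim=1)
        with torch.no_grad():
            # soft targets: smooth / C everywhere, plus (1 - smooth) on the true class
            targets = torch.full_like(log_probs, self.smooth / num_classes)
            targets.scatter_(1, labels.unsqueeze(1),
                             1.0 - self.smooth + self.smooth / num_classes)
        return -(targets * log_probs).sum(dim=1).mean()
```

Under those assumptions it would be used like an ordinary criterion, e.g. `loss = LabelSmoothCE(smooth=0.1)(logits, labels)`.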

I also tried to implement the swish and mish activation functions.
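
For reference, both activations have simple closed forms. The snippet below shows only the standard definitions, not necessarily the (possibly fused or CUDA-accelerated) versions in this repo:

```python
import torch
import torch.nn.functional as F

def swish(x: torch.Tensor) -> torch.Tensor:
    # swish(x) = x * sigmoid(x)
    return x * torch.sigmoid(x)

def mish(x: torch.Tensor) -> torch.Tensor:
    # mish(x) = x * tanh(softplus(x))
    return x * torch.tanh(F.softplus(x))
```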

Additionally, a one-hot function is included.
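
A minimal sketch of what such a helper typically does, assuming integer labels and an `ignore_index` convention; the function name and signature here are hypothetical, not necessarily the repo's:

```python
import torch

def one_hot(labels: torch.Tensor, num_classes: int, ignore_index: int = -100) -> torch.Tensor:
    # Hypothetical sketch: turn integer labels of shape (N, ...) into a float
    # one-hot tensor of shape (N, num_classes, ...), zeroing ignored positions.
    labels = labels.clone()
    ignore_mask = labels == ignore_index
    labels[ignore_mask] = 0
    out = torch.nn.functional.one_hot(labels, num_classes).float()
    out[ignore_mask] = 0.0
    # move the class dimension right after the batch dimension
    dims = [0, out.dim() - 1] + list(range(1, out.dim() - 1))
    return out.permute(*dims)
```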

An "Exponential Moving Average (EMA)" operator has also been newly added.
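
A minimal sketch of a parameter EMA, assuming it keeps a shadow copy of the trainable weights; the class name and methods here are hypothetical and may not match the repo's exact operator:

```python
import torch
import torch.nn as nn

class EMA:
    # Hypothetical sketch of an exponential moving average over model parameters.
    def __init__(self, model: nn.Module, alpha: float = 0.999):
        self.alpha = alpha
        self.shadow = {name: p.detach().clone()
                       for name, p in model.named_parameters() if p.requires_grad}

    @torch.no_grad()
    def update(self, model: nn.Module) -> None:
        # shadow = alpha * shadow + (1 - alpha) * current weights
        for name, p in model.named_parameters():
            if name in self.shadow:
                self.shadow[name].mul_(self.alpha).add_(p.detach(), alpha=1.0 - self.alpha)

    @torch.no_grad()
    def copy_to(self, model: nn.Module) -> None:
        # overwrite model weights with the averaged ones (e.g. before evaluation)
        for name, p in model.named_parameters():
            if name in self.shadow:
                p.copy_(self.shadow[name])
```

Under those assumptions, typical usage would be `ema = EMA(model)`, calling `ema.update(model)` after each optimizer step and `ema.copy_to(model)` before evaluation.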

For those who happen to find this repo: if you spot errors in my code, feel free to open an issue to correct me.

About


License: MIT License


Languages

Python 76.3%, Cuda 20.9%, C++ 2.8%