CoinCheung / pytorch-loss

label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful


AM-softmax implementation details

sweetTT opened this issue · comments

When using the AM-softmax loss, do we need to add another fully connected layer before the loss?

For example: Avg_pooling --> fc --> AM-softmax?

That depends on the dimension of your features. The AM-softmax loss is used to train embedding networks; if your desired embedding dimension differs from the output dimension of the average pooling, you should add another fc layer to adjust the embedding dimension.
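As a rough sketch of what this looks like in practice, the head below projects pooled backbone features through an fc layer to the embedding dimension and then applies AM-softmax-style margin logits. The class name, parameter names, and default margin/scale values are illustrative assumptions, not this repo's actual API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AMSoftmaxHead(nn.Module):
    """Hypothetical head: fc to adjust embedding dim, then AM-softmax loss.

    AM-softmax computes cosine similarity between the L2-normalized
    embedding and L2-normalized class weights, subtracts a margin m from
    the target-class cosine, scales by s, and applies cross-entropy.
    """

    def __init__(self, in_feats, embed_dim, n_classes, m=0.35, s=30.0):
        super().__init__()
        # The fc layer mentioned above: adapts pooled features to embed_dim.
        self.fc = nn.Linear(in_feats, embed_dim)
        self.weight = nn.Parameter(torch.randn(embed_dim, n_classes))
        self.m = m  # additive margin
        self.s = s  # scale factor

    def forward(self, x, labels):
        emb = F.normalize(self.fc(x), dim=1)       # (N, embed_dim), unit norm
        w = F.normalize(self.weight, dim=0)        # (embed_dim, C), unit norm
        cos = emb @ w                              # (N, C) cosine similarities
        margin = torch.zeros_like(cos)
        margin.scatter_(1, labels.view(-1, 1), self.m)
        logits = self.s * (cos - margin)           # margin only at target class
        return F.cross_entropy(logits, labels)
```

Usage would follow the pipeline in the question: flatten the average-pooling output, then pass it with the labels, e.g. `loss = head(pooled_feats, labels)`. If the pooled features already have the embedding dimension you want, the fc layer can be dropped.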

I am closing this. You can still leave a message if you have more to discuss.