CoinCheung / pytorch-loss

label-smooth, amsoftmax, partial-fc, focal-loss, triplet-loss, lovasz-softmax. Maybe useful

Regularizing Neural Networks by Penalizing Confident Output Distributions

eakirtas opened this issue · comments

Hello,

First of all, thank you for contributing such a nice repo, integrating so many useful loss functions in PyTorch.

According to this repo's listing, there is an implementation here of "Regularizing Neural Networks by Penalizing Confident Output Distributions". However, I can't find a citation of this paper anywhere in the repo. Is this loss function actually implemented here?

Thank you in advance!

Hi,

Would you please tell me which loss you are referring to? I don't remember reading this paper or implementing the loss it proposes.

I was just wondering whether you had implemented the proposed loss. I've scanned the repo and didn't find anything similar to the proposed method, so I'm wondering whether you are using a different name and I'm missing something, or whether it is indeed not implemented.

Would I run any risk of permission issues if I were to implement this (just a hypothetical)? I have never considered these things before.

I'm pretty sure there is no permission issue. On the contrary, I feel it is beneficial for the authors when their work is implemented, since they gain more visibility and potentially more citations. For ethical reasons, we should cite their paper to give them credit (as you already do for your other implementations).

Anyway, maybe there is a misunderstanding: I was just searching for an implementation to use in my own work. I am not asking you to add a reference for something you didn't implement (as I said, you already cite the papers you do implement).

So if you are interested in implementing it, that would be more than helpful for me, for a lot of other people, and for the authors! If I implement it myself, I will open a PR to include it in your repo (if you are interested, of course).
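For reference, the loss proposed in the paper (Pereyra et al., 2017) is just cross-entropy minus a scaled entropy bonus on the softmax outputs, which penalizes over-confident (low-entropy) predictions. A minimal PyTorch sketch could look like the following; this is my own illustration, not code from this repo, and the class name and the `beta` weight are assumptions:

```python
import torch
import torch.nn.functional as F


class ConfidencePenaltyLoss(torch.nn.Module):
    """Sketch of the confidence penalty of Pereyra et al. (2017):
    L = CE(logits, target) - beta * H(softmax(logits)).
    Illustrative only; not an implementation from this repo."""

    def __init__(self, beta: float = 0.1):
        super().__init__()
        self.beta = beta

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        ce = F.cross_entropy(logits, target)
        log_probs = F.log_softmax(logits, dim=1)
        probs = log_probs.exp()
        # mean entropy of the predicted distributions over the batch
        entropy = -(probs * log_probs).sum(dim=1).mean()
        # subtracting entropy penalizes confident (low-entropy) outputs
        return ce - self.beta * entropy


criterion = ConfidencePenaltyLoss(beta=0.1)
loss = criterion(torch.randn(4, 10), torch.randint(0, 10, (4,)))
```

With `beta=0` this reduces to plain cross-entropy; larger `beta` pushes the model toward higher-entropy (less confident) outputs.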

Thank you again for your awesome work