HanxunH / SCELoss-Reproduce

Reproduce Results for ICCV2019 "Symmetric Cross Entropy for Robust Learning with Noisy Labels" https://arxiv.org/abs/1908.06112

About backward

lk1048453160 opened this issue · comments

Hi, I want to know which file the backpropagation for SCELoss is implemented in. Thanks!

I'm not sure what you mean. The loss function is in:

class SCELoss(torch.nn.Module):

Hi, thanks very much! I'm a little confused: when I read code in the PyTorch ecosystem, every loss function has a backward function, but those are all official loss functions. SCELoss is custom, so I want to know where SCELoss.backward() is.
Thanks very much!

I found forward() but not backward(), just like this. Thanks for your reply!
(screenshot of the forward() method)

All the functions used in forward() already have backward() implemented by PyTorch (e.g. F.softmax(), torch.log(), torch.sum()), so there is no need to write a custom backward() in this case.

Does this help?
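To illustrate the point: here is a minimal sketch of an SCELoss-style module built only from differentiable PyTorch ops. The alpha/beta values, class count, and clamping constants are assumptions loosely following the paper, not the repo's exact code. Because every operation in forward() is differentiable, autograd derives the backward pass on its own.

```python
import torch
import torch.nn.functional as F

class SCELoss(torch.nn.Module):
    """Sketch of symmetric cross entropy: alpha * CE + beta * reverse-CE.
    Hyperparameters and clamp values here are illustrative assumptions."""

    def __init__(self, alpha=0.1, beta=1.0, num_classes=10):
        super().__init__()
        self.alpha = alpha
        self.beta = beta
        self.num_classes = num_classes

    def forward(self, logits, labels):
        # Standard cross entropy term
        ce = F.cross_entropy(logits, labels)
        # Reverse cross entropy: clamp to avoid log(0)
        pred = F.softmax(logits, dim=1).clamp(min=1e-7, max=1.0)
        one_hot = F.one_hot(labels, self.num_classes).float().clamp(min=1e-4, max=1.0)
        rce = (-pred * torch.log(one_hot)).sum(dim=1).mean()
        return self.alpha * ce + self.beta * rce

# No backward() was defined, yet gradients flow:
logits = torch.randn(4, 10, requires_grad=True)
labels = torch.randint(0, 10, (4,))
loss = SCELoss()(logits, labels)
loss.backward()  # backward pass generated by autograd
```

After loss.backward(), logits.grad is populated even though the class only defines forward().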

Yes, thanks very much! But I think this loss function, SCE, is user-defined, so there is no corresponding backpropagation built into PyTorch. So I think we should set up our own .backward():
loss = SCELoss(out, label)
loss.backward()
Just like this.
I'm a little confused. I hope I can get your help!
Thanks again!


You do not need to write .backward(): SCELoss inherits from torch.nn.Module, its forward() uses only differentiable PyTorch ops, and autograd generates the backward pass automatically when you call loss.backward().
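A quick self-contained check of that claim, using an ordinary cross entropy written by hand (shapes and values here are arbitrary assumptions, just for demonstration):

```python
import torch
import torch.nn.functional as F

# A "custom" loss composed only of differentiable torch ops.
logits = torch.randn(2, 5, requires_grad=True)
labels = torch.tensor([1, 3])

pred = F.softmax(logits, dim=1)                              # differentiable
loss = -(F.one_hot(labels, 5) * torch.log(pred)).sum() / 2   # CE by hand

loss.backward()  # no hand-written backward() anywhere, gradients still flow
print(logits.grad.shape)  # same shape as logits
```

This is exactly why custom nn.Module losses in PyTorch only need forward().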