About backward
lk1048453160 opened this issue · comments
Hi, I want to know which file the backpropagation for SCELoss is in. Thanks!
I'm not sure what you mean. The loss function is in
Line 5 in be22ef6
Hi, thanks very much! I'm a little confused: when I read code in the PyTorch ecosystem, every loss function has a backward function, but those are all official loss functions. SCELoss is custom, so I want to know where SCELoss.backward() is.
Thanks very much!
I found forward(), but not backward(), just like this. Thanks for your reply!
All the functions used in forward() already have backward() implemented by PyTorch, so there is no need to write a custom backward() in this case.
e.g. F.softmax(), torch.log(), torch.sum()
Does this help?
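To illustrate the point above, here is a minimal, hypothetical sketch of a symmetric cross-entropy style loss. The class name SCELoss matches the thread, but the exact formula, the alpha/beta weights, and the clamp values are assumptions for illustration; the key point is that forward() is built only from differentiable PyTorch primitives, so loss.backward() works with no hand-written backward().

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCELoss(nn.Module):
    """Illustrative sketch (not the repository's exact code).

    Every op in forward() (F.cross_entropy, F.softmax, torch.log,
    torch.sum, ...) already has a gradient defined in PyTorch, so
    autograd composes backward() automatically.
    """
    def __init__(self, alpha=0.1, beta=1.0, num_classes=10):
        super().__init__()
        self.alpha = alpha          # weight of the CE term (assumed value)
        self.beta = beta            # weight of the RCE term (assumed value)
        self.num_classes = num_classes

    def forward(self, logits, labels):
        # Standard cross-entropy term.
        ce = F.cross_entropy(logits, labels)
        # Reverse cross-entropy term, built from differentiable primitives.
        pred = torch.clamp(F.softmax(logits, dim=1), min=1e-7, max=1.0)
        one_hot = F.one_hot(labels, self.num_classes).float()
        one_hot = torch.clamp(one_hot, min=1e-4, max=1.0)
        rce = -torch.sum(pred * torch.log(one_hot), dim=1).mean()
        return self.alpha * ce + self.beta * rce

# Autograd tracks the graph through forward() and computes gradients:
logits = torch.randn(4, 10, requires_grad=True)
labels = torch.randint(0, 10, (4,))
loss = SCELoss()(logits, labels)
loss.backward()            # no custom backward() anywhere
print(logits.grad.shape)   # torch.Size([4, 10])
```

Calling loss.backward() here fills logits.grad exactly as it would for a built-in loss, because autograd differentiates through the composition of primitives, not through the class itself.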
Yes, thanks very much! But I think this loss function, SCELoss, is user-defined, so there should be no corresponding backpropagation for it in PyTorch. So I think we should set up our own .backward():
loss = SCELoss(out, label)
loss.backward()
Just like this.
I'm a little confused. I hope I can get your help!
Thanks again!
You do not need a custom .backward(): SCELoss inherits from torch.nn.Module, and autograd derives the backward pass automatically from the operations used in forward().
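For contrast, the one case where you do write backward() by hand is when an operation is not expressible through existing differentiable primitives: you subclass torch.autograd.Function and supply both passes yourself. The example below (a toy square function, chosen purely for illustration) shows what that looks like; an nn.Module loss like SCELoss never needs this.

```python
import torch

class MySquare(torch.autograd.Function):
    """Toy op with a hand-written gradient, for contrast only."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)   # stash input for the backward pass
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x   # d(x^2)/dx = 2x, chained with upstream grad

x = torch.tensor([3.0], requires_grad=True)
y = MySquare.apply(x)
y.backward()
print(x.grad)  # tensor([6.])
```

Since SCELoss's forward() only chains ops that already have gradients, autograd handles everything and torch.autograd.Function is unnecessary there.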