Printing Wrong Loss
myavkat opened this issue · comments
Myavvv Katrancı commented
When the train function prints the loss for an epoch, sum_loss is divided by N, the dataset size. However, a loss value is added to sum_loss only once per batch, i.e. N/args.batchsize times, so the divisor should be the number of batches (N/args.batchsize), not N. As written, the printed loss is too small by a factor of the batch size.
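A minimal sketch of the bug, assuming each accumulated loss is already a per-sample mean over its batch (the function and variable names below are illustrative, not taken from the repository):

```python
def epoch_loss_buggy(batch_losses, dataset_size):
    # Reported loss as described in the issue: the per-batch means are
    # summed, then divided by the dataset size N. Since only
    # N / batch_size values were summed, this under-reports the loss.
    return sum(batch_losses) / dataset_size

def epoch_loss_fixed(batch_losses, batch_size, dataset_size):
    # One possible fix: weight each per-batch mean by its batch size
    # before dividing by N, recovering the true per-sample mean.
    return sum(loss * batch_size for loss in batch_losses) / dataset_size

dataset_size = 8
batch_size = 2
# N / batch_size = 4 batches, each with a mean loss of 0.5.
batch_losses = [0.5, 0.5, 0.5, 0.5]

print(epoch_loss_buggy(batch_losses, dataset_size))              # 0.25 (wrong)
print(epoch_loss_fixed(batch_losses, batch_size, dataset_size))  # 0.5 (correct)
```

Equivalently, dividing the summed per-batch means by the number of batches (len(batch_losses)) gives the same corrected result.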