LOG issue
jzhanghzau opened this issue · comments
jzhanghzau commented
Bug description
As shown in the screenshot below, 710144 is my total number of samples, but 100 is shown as the batch count. Since my batch size is 64, I expect the total number of steps to be 710144/64 = 11096, so I think 11096 should appear in place of 710144. Can someone explain this? It confuses me a bit.
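The arithmetic behind the expected step count, as a quick check (the variable names are illustrative):

```python
total_samples = 710144
batch_size = 64

# With batch size 64, one epoch should consist of this many optimizer steps:
expected_steps = total_samples // batch_size
print(expected_steps)  # → 11096
```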
This is how I log in the training step:

self.log('train_loss', loss, on_step=True, rank_zero_only=True)
Thanks in advance!
JJ