When I train the ArbRCAN model, the training loss is very large. Is this normal?
RayTan183 opened this issue · comments
When I train the ArbRCAN model, the training loss is very large. Is this a normal situation? Sometimes the loss is so large that the batch is skipped. I also tried testing with intermediate checkpoints such as 70.pth, but the results were even worse than those of the original pre-trained model.