Is there any way to avoid overfitting while training over 200 epochs?
Chanuku opened this issue
mightylg9094 commented
Jun-Yan Zhu commented
It seems that your learning rate has turned negative. This may be what is causing the training to fail.
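For context, a minimal sketch of how this can happen, assuming a linear learning-rate decay rule of the kind commonly used in pix2pix/CycleGAN-style training scripts (the function and parameter names below are illustrative assumptions, not the repository's exact code): once training runs past the scheduled total of constant-LR epochs plus decay epochs, the decay factor keeps shrinking below zero.

```python
# Illustrative linear LR decay: hold the LR for n_epochs, then decay it
# linearly to zero over the next n_epochs_decay epochs. Names and defaults
# are assumptions for this example.
def linear_decay(epoch, base_lr=2e-4, n_epochs=100, n_epochs_decay=100):
    scale = 1.0 - max(0, epoch - n_epochs) / float(n_epochs_decay + 1)
    return base_lr * scale

print(linear_decay(200))  # ~2e-6, still positive at the scheduled end (100 + 100 epochs)
print(linear_decay(250))  # ~-9.7e-5, negative because training continued past the schedule
```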
mightylg9094 commented
Thanks for the reply.
Could you give me a clue about which parameter I should adjust to avoid this?
Jun-Yan Zhu commented
You need to update your learning rate decay policy.
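Two possible adjustments, sketched below under the same assumptions as the earlier example (parameter names are illustrative): either clamp the decay factor at zero so the learning rate can never go negative, or extend the decay window so it covers the full number of epochs you actually intend to train.

```python
# Option 1: clamp the decayed scale at zero so the LR bottoms out at 0
# instead of going negative when training runs past the schedule.
def linear_decay_clamped(epoch, base_lr=2e-4, n_epochs=100, n_epochs_decay=100):
    scale = 1.0 - max(0, epoch - n_epochs) / float(n_epochs_decay + 1)
    return base_lr * max(0.0, scale)

# Option 2: extend the decay window to match the planned training length,
# e.g. decay over 300 epochs if you train for n_epochs + 300 epochs total.
def linear_decay_extended(epoch, base_lr=2e-4, n_epochs=100, n_epochs_decay=300):
    scale = 1.0 - max(0, epoch - n_epochs) / float(n_epochs_decay + 1)
    return base_lr * scale
```

In practice, if your training script exposes the schedule through options (e.g. the number of constant-LR epochs and the number of decay epochs), making their sum equal to the total number of epochs you train is usually enough to keep the learning rate non-negative.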