min_learning_rate in ReduceLROnPlateau scheduler
Adamits opened this issue
Adam commented
My labmate was using the ReduceLROnPlateau scheduler and got an error related to the arg `min_learning_rate`. This is because we inherit from the PyTorch scheduler, which uses `min_lr`. It looks like we changed `min_lr` to `minimum_lr` a couple of months ago, here. Reverting that change should fix the bug.
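The failure mode can be sketched without PyTorch: when a subclass renames a keyword argument that the parent class (and existing callers) know by another name, calls using the original name raise a `TypeError`. The class names below are illustrative stand-ins, not the actual library code; the real parent here is `torch.optim.lr_scheduler.ReduceLROnPlateau`, whose parameter is `min_lr`.

```python
# Minimal stand-in for the PyTorch scheduler, which accepts `min_lr`.
class BaseReduceLROnPlateau:
    def __init__(self, min_lr=0.0):
        self.min_lr = min_lr

# Hypothetical subclass that renamed the kwarg to `minimum_lr`,
# mirroring the change described in the issue.
class RenamedScheduler(BaseReduceLROnPlateau):
    def __init__(self, minimum_lr=0.0):
        super().__init__(min_lr=minimum_lr)

# Callers written against the inherited name now fail:
try:
    RenamedScheduler(min_lr=1e-6)
    error = None
except TypeError as exc:
    error = str(exc)

# The fix suggested above: revert to the inherited kwarg name.
class FixedScheduler(BaseReduceLROnPlateau):
    def __init__(self, min_lr=0.0):
        super().__init__(min_lr=min_lr)

sched = FixedScheduler(min_lr=1e-6)
```

Keeping the subclass signature aligned with the parent's means existing call sites and documentation for the PyTorch API keep working unchanged.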