sooftware / pytorch-lr-scheduler

PyTorch implementation of some learning rate schedulers for deep learning researchers.

ReduceLROnPlateauScheduler bug on updating val loss

martinBmeza opened this issue · comments

In ReduceLROnPlateauScheduler, the code updates the val_loss attribute even when the current loss does not improve. This may be unintended behavior: if the val_loss oscillates, the stored reference value keeps being overwritten, which can prevent the scheduler from ever decreasing the learning rate to help the val_loss improve.

More precisely, line 59 in reduce_lr_on_plateau_lr_scheduler.py should be deleted; see the sketch below.
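For illustration, here is a minimal sketch of the kind of plateau logic the issue describes (class structure, attribute names, and default values are assumptions, not the repository's exact code). The commented-out line marks the unconditional update that the issue argues should be removed, so that val_loss only ever tracks the best loss seen so far:

```python
class ReduceLROnPlateauScheduler:
    # Hypothetical sketch, not the repository's actual source.
    def __init__(self, optimizer, lr, patience=1, factor=0.3):
        self.optimizer = optimizer
        self.lr = lr
        self.patience = patience
        self.factor = factor
        self.val_loss = float("inf")  # best validation loss seen so far
        self.count = 0                # epochs without improvement

    def set_lr(self, lr):
        for param_group in self.optimizer.param_groups:
            param_group["lr"] = lr

    def step(self, val_loss):
        if val_loss > self.val_loss:
            # No improvement: count toward patience.
            self.count += 1
            # self.val_loss = val_loss   # <- the kind of line the issue says to delete:
            #                            #    it overwrites the best loss with a worse one,
            #                            #    so an oscillating val_loss keeps resetting the baseline
        else:
            # Improvement: reset the counter and record the new best loss.
            self.count = 0
            self.val_loss = val_loss

        if self.count >= self.patience:
            self.count = 0
            self.lr *= self.factor
            self.set_lr(self.lr)

        return self.lr
```

With the unconditional update removed, the comparison baseline stays at the best loss observed, so repeated non-improving (oscillating) epochs accumulate toward patience and eventually trigger the learning rate reduction.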