davidtvs / pytorch-lr-finder

A learning rate range test implementation in PyTorch

Suggested LR not returned when min_grad_idx is 0 in plot()

manuel-munoz-aguirre opened this issue · comments

When using the plot function in a situation like the following:

[attached plot: starting_lr_fulltrain_DEBUG]

where the first value of lrs (i.e. min_grad_idx == 0) is the suggested learning rate, the suggested learning rate is printed but not returned.

Expected behavior: return ax, lrs[min_grad_idx]
Observed behavior: return ax

These seem to be the relevant lines: only ax is returned because min_grad_idx evaluates to False when it is 0:

if suggest_lr and min_grad_idx:
    return ax, lrs[min_grad_idx]
else:
    return ax

@manuel-munoz-aguirre You are right, the condition should be ... and min_grad_idx is not None:, which matches the existing check around line 510, since min_grad_idx is a numerical value:

"Failed to compute the gradients, there might not be enough points."
)
if min_grad_idx is not None:
print("Suggested LR: {:.2E}".format(lrs[min_grad_idx]))
ax.scatter(
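With the explicit None check, index 0 is treated as a valid result. A sketch of the corrected condition (again using a hypothetical `plot_fixed` helper, not the library's actual signature):

```python
def plot_fixed(suggest_lr, min_grad_idx, lrs, ax="ax"):
    # Fixed condition: only skip the suggestion when no index was computed.
    if suggest_lr and min_grad_idx is not None:
        return ax, lrs[min_grad_idx]
    return ax


lrs = [1e-5, 1e-4, 1e-3]

# Index 0 now returns the suggested LR alongside the axes.
print(plot_fixed(True, 0, lrs))  # prints: ('ax', 1e-05)

# None (gradient computation failed) still falls back to returning only ax.
print(plot_fixed(True, None, lrs))  # prints: ax
```

The `is not None` comparison is the idiomatic way to distinguish "no value" from a legitimate zero, which is exactly the distinction the original truthiness test conflated.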

Thanks for the feedback! I'll send a patch for it.

Thanks for raising the issue @manuel-munoz-aguirre and thanks to @NaleRaphael for fixing it in #66

The fix has been merged, closing this issue.