katsura-jp / pytorch-cosine-annealing-with-warmup

base_lr relies on the optimizer's lr

DietDietDiet opened this issue

Hi, I read through your blog and it is really nice work!
I noticed that the `base_lr` attribute in your class extends the one in the base scheduler, which is derived from the optimizer's `lr`. Could you clarify whether I should set `base_lr` through the optimizer and the max lr through the scheduler?
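For context, here is a minimal usage sketch of how I am currently wiring the two together, with the constructor arguments (`first_cycle_steps`, `cycle_mult`, `max_lr`, `min_lr`, `warmup_steps`, `gamma`) taken from the repo's README; the specific values are placeholders, and my assumption is that the `lr` passed to the optimizer is what the base scheduler picks up as `base_lr`:

```python
import torch
from cosine_annealing_warmup import CosineAnnealingWarmupRestarts

model = torch.nn.Linear(10, 2)

# Assumption: the lr set here is what the base _LRScheduler records as base_lr.
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

scheduler = CosineAnnealingWarmupRestarts(
    optimizer,
    first_cycle_steps=200,  # steps per cosine cycle
    cycle_mult=1.0,         # cycle-length multiplier after each restart
    max_lr=0.1,             # peak lr reached after warmup
    min_lr=0.001,           # floor lr of the cosine decay
    warmup_steps=50,        # linear warmup steps at the start of each cycle
    gamma=0.5,              # max_lr decay factor applied per cycle
)

for step in range(1000):
    # ... forward / backward / optimizer.step() ...
    scheduler.step()
```

Is this the intended setup, or does the scheduler overwrite the optimizer's `lr` internally so that only `max_lr` and `min_lr` matter?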