katsura-jp/pytorch-cosine-annealing-with-warmup
Stargazers: 425
Watchers: 2
Issues: 12
Forks: 52
katsura-jp/pytorch-cosine-annealing-with-warmup Issues
method name of get_lr() not consistent with torch.optim optimizers (Updated 2 months ago)
Learning rate goes lower than specified min_lr (Updated 8 months ago)
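For context on this issue, a standard cosine-annealing step with an explicit floor keeps the learning rate at or above min_lr by construction. A minimal sketch in plain Python (the function name and the values for max_lr, min_lr, and total_steps are illustrative, not the library's API):

```python
import math

def cosine_lr(step, total_steps, max_lr, min_lr):
    """Cosine-annealed learning rate decaying from max_lr to min_lr.

    The (1 + cos) / 2 factor runs from 1 at step 0 down to 0 at
    total_steps, so the result is bounded below by min_lr.
    """
    progress = min(step / total_steps, 1.0)  # clamp past the final step
    return min_lr + (max_lr - min_lr) * (1 + math.cos(math.pi * progress)) / 2

schedule = [cosine_lr(s, total_steps=100, max_lr=0.1, min_lr=0.001)
            for s in range(101)]
```

A schedule built this way starts at max_lr, ends at min_lr, and never dips below the floor; a dip below min_lr in practice indicates the floor term was dropped or applied to the wrong base value.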
ModuleNotFoundError: No module named 'cosine_annealing_warmup' (Updated 2 years ago, 2 comments)
Allow `max_lr` to be set per group (Updated 2 years ago)
Warmup steps only apply on the first cycle (Updated 2 years ago)
Additional Features (Closed 2 years ago, 1 comment)
License? (Closed 2 years ago, 1 comment)
Is there a good reason why T_mult should be an integer (Closed 3 years ago, 2 comments)
Weird gamma behavior (Closed 2 years ago, 1 comment)
Is there a possibility to add verbose=True, to see the increase/decrease in lr as training progresses? (Updated 3 years ago, 1 comment)
Citation to use this scheduler (Closed 3 years ago, 2 comments)
base_lr relies on the lr of optimizer (Updated 4 years ago)
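Several of the issues above (warmup applied only on the first cycle, weird gamma behavior, whether T_mult must be an integer) concern the same schedule shape. A minimal, self-contained sketch of cosine annealing with warmup restarts, assuming warmup is re-applied at the start of every cycle, the cycle length is multiplied by cycle_mult after each restart (no integer requirement), and gamma shrinks the peak toward min_lr; all names here are illustrative, not the library's API:

```python
import math

def warmup_cosine_schedule(total_steps, first_cycle_steps, cycle_mult,
                           max_lr, min_lr, warmup_steps, gamma=1.0):
    """Return one lr per step: linear warmup, then cosine decay, restarting.

    After each restart the cycle length is multiplied by cycle_mult and
    the peak lr is decayed toward min_lr by gamma; warmup repeats at the
    start of every cycle, not just the first.
    """
    lrs = []
    cycle_len = float(first_cycle_steps)  # float: cycle_mult need not be int
    peak = max_lr
    step_in_cycle = 0
    for _ in range(total_steps):
        if step_in_cycle < warmup_steps:
            # linear warmup from min_lr up to the current cycle's peak
            lr = min_lr + (peak - min_lr) * step_in_cycle / warmup_steps
        else:
            progress = (step_in_cycle - warmup_steps) / (cycle_len - warmup_steps)
            lr = min_lr + (peak - min_lr) * (1 + math.cos(math.pi * progress)) / 2
        lrs.append(lr)
        step_in_cycle += 1
        if step_in_cycle >= cycle_len:
            step_in_cycle = 0
            cycle_len *= cycle_mult               # float multipliers work here
            peak = min_lr + (peak - min_lr) * gamma  # decay peak toward min_lr
    return lrs
```

With this structure, each restart begins at min_lr and warms up again, the lr stays within [min_lr, max_lr], and a gamma below 1.0 lowers each successive cycle's peak.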