wangg12 / flat_anneal_scheduler.pytorch

(WarmUp) (Cyclic) Flat and Anneal LR Scheduler in PyTorch

warmup_method (see the sketch after this list):

  • linear
  • constant
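
A minimal sketch of how these two warmup modes could scale base_lr during the first warmup_iters iterations. The function and argument names here are illustrative, not necessarily the repository's actual API:

```python
# Illustrative only: multiplier applied to base_lr during warmup.
def warmup_lr_factor(method, it, warmup_iters, warmup_factor):
    if it >= warmup_iters:
        return 1.0  # warmup finished: run at base_lr
    if method == "linear":
        # ramp linearly from init_warmup_lr (= warmup_factor * base_lr) to base_lr
        alpha = it / warmup_iters
        return warmup_factor * (1 - alpha) + alpha
    if method == "constant":
        # hold init_warmup_lr until warmup ends
        return warmup_factor
    raise ValueError(f"unknown warmup_method: {method}")
```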

anneal_method (see the sketch after this list):

  • cosine
  • (multi-)step
  • poly
  • linear
  • exp
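
Similarly, a hedged sketch of the five anneal modes as a multiplier of base_lr, given the progress p in [0, 1] through the annealing span. The names poly_power, step_gamma, and milestones are assumptions for illustration; check the repository for the real signature:

```python
import math

# Illustrative only: anneal factor as a function of progress p in [0, 1].
def anneal_lr_factor(method, p, target_lr_factor=0.0, poly_power=1.0,
                     step_gamma=0.1, milestones=(0.5, 0.75)):
    if method == "cosine":
        # half-cosine from 1.0 down to target_lr_factor
        return target_lr_factor + (1 - target_lr_factor) * 0.5 * (1 + math.cos(math.pi * p))
    if method == "step":
        # multiply by step_gamma at each milestone (fractions of the span here)
        return step_gamma ** sum(p >= m for m in milestones)
    if method == "poly":
        return target_lr_factor + (1 - target_lr_factor) * (1 - p) ** poly_power
    if method == "linear":
        return target_lr_factor + (1 - target_lr_factor) * (1 - p)
    if method == "exp":
        # exponential decay toward target_lr_factor (requires target_lr_factor > 0)
        return target_lr_factor ** p
    raise ValueError(f"unknown anneal_method: {method}")
```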

Usage:

See test_flat_and_anneal().
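
Independently of the repository's exact factory signature, the flat-and-anneal shape can be reproduced end to end with a plain torch.optim.lr_scheduler.LambdaLR. This self-contained sketch uses linear warmup and cosine annealing; all hyperparameter values are chosen for illustration:

```python
import math

import torch
from torch.optim.lr_scheduler import LambdaLR

total_iters = 10_000      # schedule is defined per iteration, not per epoch
warmup_iters = 500
anneal_point = 0.72       # annealing starts at 72% of total_iters
warmup_factor = 0.001     # init_warmup_lr = warmup_factor * base_lr
target_lr_factor = 0.0    # target_lr = target_lr_factor * base_lr

def lr_lambda(it):
    if it < warmup_iters:
        # linear warmup from warmup_factor up to 1.0
        alpha = it / warmup_iters
        return warmup_factor * (1 - alpha) + alpha
    anneal_start = anneal_point * total_iters
    if it < anneal_start:
        return 1.0  # flat phase at base_lr
    # cosine anneal from base_lr down to target_lr
    p = (it - anneal_start) / (total_iters - anneal_start)
    return target_lr_factor + (1 - target_lr_factor) * 0.5 * (1 + math.cos(math.pi * p))

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = LambdaLR(optimizer, lr_lambda)

for it in range(total_iters):
    # ... forward / backward ...
    optimizer.step()
    scheduler.step()  # step once per iteration
```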

Convention

  • The scheduler should be applied by iteration (or by batch) instead of by epoch.
  • anneal_point and steps are given as percentages of the total number of iterations (a worked example follows this list).
  • init_warmup_lr = warmup_factor * base_lr
  • target_lr = target_lr_factor * base_lr
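
A small worked example of these conventions, with illustrative values:

```python
base_lr = 0.1
warmup_factor = 0.001
target_lr_factor = 0.01

init_warmup_lr = warmup_factor * base_lr   # 1e-4: LR at the first iteration
target_lr = target_lr_factor * base_lr     # 1e-3: LR at the end of annealing

total_iters = 10_000
anneal_point = 0.72
steps = (0.5, 0.75)                        # for (multi-)step annealing
anneal_start_iter = int(anneal_point * total_iters)   # 7200
step_iters = [int(s * total_iters) for s in steps]    # [5000, 7500]
```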

About

A (warmup) (cyclic) flat and anneal learning rate scheduler in PyTorch.

License: MIT License


Languages

Language: Python 100.0%