kozistr / pytorch_optimizer

optimizer & lr scheduler & loss function collections in PyTorch

Home Page: https://pytorch-optimizers.readthedocs.io/en/latest/


Ranger21 does not work

BaconGabe opened this issue · comments

commented

Below is the traceback I get when I try to use Ranger21; other optimizers work as they should.

```
c:\users\g\appdata\local\programs\python\python38\lib\site-packages\pytorch_optimizer\ranger21.py in __init__(self, params, lr, beta0, betas, use_softplus, beta_softplus, num_iterations, num_warm_up_iterations, num_warm_down_iterations, warm_down_min_lr, agc_clipping_value, agc_eps, centralize_gradients, normalize_gradients, lookahead_merge_time, lookahead_blending_alpha, weight_decay, norm_loss_factor, eps)
    114         # warmup iterations
    115         self.num_warm_up_iterations: int = (
--> 116             self.build_warm_up_iterations(num_iterations, betas[1])
    117             if num_warm_up_iterations is None
    118             else num_warm_up_iterations

c:\users\g\appdata\local\programs\python\python38\lib\site-packages\pytorch_optimizer\ranger21.py in build_warm_up_iterations(total_iterations, beta2, warm_up_pct)
    150     def build_warm_up_iterations(total_iterations: int, beta2: float, warm_up_pct: float = 0.22) -> int:
    151         warm_up_iterations: int = math.ceil(2.0 / (1.0 - beta2))  # default un-tuned linear warmup
--> 152         beta_pct: float = warm_up_iterations / total_iterations
    153         if beta_pct > 0.45:
    154             return int(warm_up_pct * total_iterations)

TypeError: unsupported operand type(s) for /: 'int' and 'NoneType'
```

Ranger21 needs the `num_iterations` parameter! (Ranger21 Code)

I made a mistake by giving the `num_iterations` parameter a default value of `None`, when it needs a valid integer value. I'll fix it so that `num_iterations` no longer has a default value.

Thanks for your report :)

I just deployed v1.1.2 with the fixed version of Ranger21!

commented

It works now.