pursueorigin / Optimizer-PyTorch


Optimizer-PyTorch

A package of optimizers implemented with PyTorch.

Optimizer List

All of the optimizers listed below are implemented with PyTorch; a minimal usage sketch follows the list.

SGD: Stochastic Gradient Descent

Adam: A Method for Stochastic Optimization

AdaBound: Adaptive Gradient Methods with Dynamic Bound of Learning Rate

RAdam: On the Variance of the Adaptive Learning Rate and Beyond

Lookahead: Lookahead Optimizer: k steps forward, 1 step back

Optimistic

OptimAdam

OMD

ExtraGradient

STORM: STOchastic Recursive Momentum

Others
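The sketch below shows how an optimizer from this package would typically be plugged into a PyTorch training loop. The import path `optimizer_pytorch` and the class names `RAdam` and `Lookahead` (with parameters `k` and `alpha`) are assumptions, not the package's documented API; the loop itself relies only on the standard `torch.optim.Optimizer` interface, which is how PyTorch optimizers are normally consumed.

```python
# Minimal usage sketch. The commented-out import and constructor calls are
# assumptions about this package's API; the training loop is the standard
# torch.optim.Optimizer pattern.
import torch
import torch.nn as nn

# from optimizer_pytorch import RAdam, Lookahead  # hypothetical import path

model = nn.Linear(10, 1)
criterion = nn.MSELoss()

# Shown with a built-in optimizer; a custom optimizer from this package
# would be constructed the same way, for example:
#   base = RAdam(model.parameters(), lr=1e-3)
#   optimizer = Lookahead(base, k=5, alpha=0.5)  # "k steps forward, 1 step back"
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 10)
y = torch.randn(32, 1)

for step in range(100):
    optimizer.zero_grad()          # clear accumulated gradients
    loss = criterion(model(x), y)  # forward pass
    loss.backward()                # compute gradients
    optimizer.step()               # apply the optimizer's update rule
```

Because every entry in the list above is expected to expose the same `step()` / `zero_grad()` interface, swapping one optimizer for another should only require changing the constructor line.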

About

A package of optimizers implemented with PyTorch.

License: Apache License 2.0


Languages

Language: Python 100.0%