ifeherva / optimizer-benchmark

Benchmark Suite for Stochastic Gradient Descent Optimization Algorithms in PyTorch

This repository contains code to benchmark novel stochastic gradient descent algorithms on the CIFAR-10 dataset.
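
For context, here is a minimal sketch of the kind of training run being benchmarked, assuming torchvision's CIFAR-10 loader and an off-the-shelf ResNet-18; the repository's actual model, data pipeline, and hyperparameters may differ:

    import torch
    import torch.nn as nn
    import torchvision
    import torchvision.transforms as transforms
    from tqdm import tqdm

    # Illustrative model; the repository's actual architecture may differ.
    model = torchvision.models.resnet18(num_classes=10)
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)

    # Any of the benchmarked optimizers can be dropped in here.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    train_set = torchvision.datasets.CIFAR10(
        root="./data", train=True, download=True,
        transform=transforms.ToTensor())
    train_loader = torch.utils.data.DataLoader(
        train_set, batch_size=128, shuffle=True)

    # One training epoch; the benchmark compares how quickly and how well
    # each optimizer drives down the loss over many such epochs.
    model.train()
    for images, labels in tqdm(train_loader):
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()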

If you want your algorithm to be included, open an issue here.

Requirements: Python 3.6+, PyTorch 1.3+, tqdm

Supported optimizers (a construction sketch for the torch.optim variants follows the list):

  1. Stochastic Gradient Descent with Momentum (SGDM)
  2. Stochastic Gradient Descent with Aggregated Momentum (SGD_aggmo) [arXiv]
  3. Stochastic Gradient Descent with Momentum and Learning Rate Dropout (SGD_LRD) [arXiv]
  4. Adam: A method for stochastic optimization (ADAM) [arXiv]
  5. Adam with Learning Rate Dropout (ADAM_LRD) [arXiv]
  6. RMSProp [Lecture Notes]
  7. RMSProp with Learning Rate Dropout [arXiv]
  8. RAdam: On the Variance of the Adaptive Learning Rate and Beyond [arXiv]
  9. RAdam with Learning Rate Dropout [arXiv]
  10. AdaBound: Adaptive Gradient Methods with Dynamic Bound of Learning Rate [ICLR2019]
  11. AdamW: Decoupled Weight Decay Regularization [arXiv]
  12. Coolmomentum: Stochastic Optimization by Langevin Dynamics with Simulated Annealing [Nature Scientific Reports]
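
For the optimizers that ship with torch.optim, construction looks roughly like this. This is a sketch with illustrative hyperparameters, not the benchmark's settings; the LRD, AggMo, RAdam, AdaBound and Coolmomentum variants come from their respective reference implementations rather than torch.optim:

    import torch
    import torch.nn as nn

    # A tiny stand-in model so the snippet is self-contained.
    model = nn.Linear(10, 2)
    params = list(model.parameters())

    # Built-in torch.optim variants; hyperparameters here are illustrative only.
    sgdm = torch.optim.SGD(params, lr=0.1, momentum=0.9)              # SGDM
    adam = torch.optim.Adam(params, lr=1e-3, betas=(0.9, 0.999))      # ADAM
    adamw = torch.optim.AdamW(params, lr=1e-3, weight_decay=1e-2)     # AdamW
    rmsprop = torch.optim.RMSprop(params, lr=1e-3, alpha=0.99)        # RMSProp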

Results:

More details of all runs can be found here.

About

Benchmark Suite for Stochastic Gradient Descent Optimization Algorithms in PyTorch

License: Apache License 2.0


Languages

Python 94.5%, Jupyter Notebook 5.5%