chihhao428 / a-tour-of-optimizers

A tour of different optimization algorithms in PyTorch.

A Tour of Optimizers in PyTorch

In this repo we'll be walking through different optimization algorithms by describing how they work and then implementing them in PyTorch.

We'll cover:

  • SGD
  • SGD with momentum
  • Adagrad
  • Adadelta
  • RMSprop
  • Adam

More may be added in the future!
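As a taste of what the notebook covers, the vanilla SGD update `p ← p − lr · ∇p` can be sketched by hand before reaching for `torch.optim`. The helper below is a hypothetical illustration, not code from the notebook:

```python
import torch

def sgd_step(params, lr=0.1):
    """One vanilla SGD update: p <- p - lr * p.grad (no momentum)."""
    with torch.no_grad():
        for p in params:
            if p.grad is not None:
                p -= lr * p.grad

# Toy example: minimize f(w) = (w - 3)^2 by gradient descent.
w = torch.tensor([0.0], requires_grad=True)
for _ in range(100):
    loss = (w - 3) ** 2
    loss.backward()
    sgd_step([w], lr=0.1)
    w.grad.zero_()  # clear accumulated gradients before the next step
# w converges toward 3.0
```

The other optimizers in the list extend exactly this loop, replacing the plain `lr * grad` step with momentum buffers or per-parameter adaptive learning rates.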

The notebook is best rendered in Jupyter's NBViewer via this link. GitHub does a pretty poor job of rendering equations in notebooks.

If you find any mistakes or have any feedback, please submit an issue and I'll try to respond ASAP.

License: MIT License
