sverdoot / optimizer-SUG-torch

Adaptive stochastic gradient method based on the universal gradient method. The universal method adjusts the Lipschitz constant of the gradient at each step so that the loss function is majorized by a quadratic upper bound.
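To make the adjustment concrete, below is a minimal PyTorch-style sketch of backtracking on the Lipschitz estimate L: a candidate step y = x - grad / L is accepted once the quadratic model f(x) + <grad, y - x> + (L/2) * ||y - x||^2 majorizes f at y, and L is doubled otherwise. This is an illustration under stated assumptions, not the repository's SUG optimizer; the names `universal_gradient_step`, `closure`, and `lipschitz` are hypothetical.

```python
import torch

def universal_gradient_step(params, closure, lipschitz=1.0, max_backtracks=20):
    """One gradient step with backtracking on the Lipschitz estimate L.

    `closure` is assumed to re-evaluate and return the loss at the current
    parameter values (the usual PyTorch closure convention).
    """
    # Forward/backward pass at the current point x to get f(x) and its gradient.
    for p in params:
        p.grad = None
    loss = closure()
    loss.backward()

    grads = [p.grad.detach().clone() for p in params]
    x0 = [p.detach().clone() for p in params]
    grad_sq = sum(g.pow(2).sum() for g in grads)

    L = lipschitz / 2  # start optimistically with a halved estimate
    with torch.no_grad():
        for _ in range(max_backtracks):
            # Candidate point y = x - grad / L.
            for p, x, g in zip(params, x0, grads):
                p.copy_(x - g / L)
            new_loss = closure()
            # With y = x - grad / L, the majorization condition
            # f(y) <= f(x) + <grad, y - x> + (L/2) ||y - x||^2
            # simplifies to f(y) <= f(x) - ||grad||^2 / (2 L).
            if new_loss.item() <= loss.item() - grad_sq.item() / (2 * L):
                break
            L *= 2  # raise the Lipschitz estimate, i.e. shrink the step
    return new_loss, L
```

The accepted L can be carried over to the next iteration (again halved first), so the method adapts to the local smoothness of the loss instead of using one fixed learning rate.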


Please use https://nbviewer.jupyter.org/github/sverdoot if you have problems rendering the .ipynb files.



Languages

Jupyter Notebook: 84.6%
TeX: 13.9%
Python: 1.4%
PostScript: 0.1%