gradient-descent-optimizers-linear-regression

Gradient descent optimizers for linear regression.

The following algorithms are implemented (a minimal sketch of the basic update loop follows the list):

  • Vanilla gradient descent
  • Momentum and batch
  • Adagrad
  • RMSProp
  • Adam
  • Adamax
  • Nesterov Accelerated Gradient
  • Nadam
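
For reference, here is a minimal sketch of the vanilla update on a toy 1-D problem. The data, variable names, and hyperparameters are illustrative and are not taken from the notebook:

```python
import numpy as np

# Toy data: y = 3x + 2 plus noise (illustrative, not from the notebook).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 2.0 + 0.1 * rng.standard_normal(100)

# Vanilla gradient descent on the mean-squared error of y ≈ w*x + b.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    err = w * x + b - y
    dw = 2.0 * np.mean(err * x)   # d(MSE)/dw
    db = 2.0 * np.mean(err)       # d(MSE)/db
    w -= lr * dw
    b -= lr * db

print(f"w ≈ {w:.2f}, b ≈ {b:.2f}")  # should approach w = 3, b = 2
```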

The notebook compares the fitted regression lines and how the error evolves under each optimizer.
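
A sketch of that comparison, tracking the MSE per iteration for two of the optimizers (vanilla and momentum); again illustrative rather than the notebook's exact code:

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy data, as above (illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 2.0 + 0.1 * rng.standard_normal(100)

def mse_and_grad(theta):
    """MSE of y ≈ theta[0]*x + theta[1] and its gradient."""
    err = theta[0] * x + theta[1] - y
    return np.mean(err**2), np.array([2 * np.mean(err * x), 2 * np.mean(err)])

histories = {}

# Vanilla gradient descent.
theta, lr = np.zeros(2), 0.1
histories["vanilla"] = []
for _ in range(200):
    loss, g = mse_and_grad(theta)
    histories["vanilla"].append(loss)
    theta -= lr * g

# Heavy-ball momentum.
theta, v, lr, beta = np.zeros(2), np.zeros(2), 0.1, 0.9
histories["momentum"] = []
for _ in range(200):
    loss, g = mse_and_grad(theta)
    histories["momentum"].append(loss)
    v = beta * v - lr * g   # accumulate a velocity
    theta += v

# Plot the error evolution of both runs on a log scale.
for name, hist in histories.items():
    plt.plot(hist, label=name)
plt.xlabel("iteration")
plt.ylabel("MSE")
plt.yscale("log")
plt.legend()
plt.show()
```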

TODO:
  • Fix the Adadelta optimizer (a reference sketch of the standard update follows)
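
For whoever picks this up: the standard Adadelta update (Zeiler, 2012) keeps decaying averages of squared gradients and squared updates and needs no learning rate. A minimal sketch on the same kind of toy problem, with illustrative names and hyperparameters:

```python
import numpy as np

# Toy data (illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 3.0 * x + 2.0 + 0.1 * rng.standard_normal(100)

theta = np.zeros(2)   # parameters [w, b]
Eg2 = np.zeros(2)     # decaying average of squared gradients, E[g^2]
Edx2 = np.zeros(2)    # decaying average of squared updates, E[dx^2]
rho, eps = 0.95, 1e-6

# Adadelta warms up slowly from a zero state, hence the larger iteration count.
for _ in range(5000):
    err = theta[0] * x + theta[1] - y
    g = np.array([2 * np.mean(err * x), 2 * np.mean(err)])  # MSE gradient
    Eg2 = rho * Eg2 + (1 - rho) * g**2                      # accumulate squared gradients
    dx = -np.sqrt(Edx2 + eps) / np.sqrt(Eg2 + eps) * g      # rescaled step, no learning rate
    Edx2 = rho * Edx2 + (1 - rho) * dx**2                   # accumulate squared updates
    theta += dx

print(theta)  # should move toward [3, 2]
```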

Languages

  • Jupyter Notebook: 81.6%
  • Python: 18.4%