robotoil / optimizer-visualization

Visualize TensorFlow's optimizers.


optimizer-visualization

Visualize gradient descent optimization algorithms in TensorFlow.

All methods start at the same location, specified by two variables. Both the x and y variables are updated by the following optimizers (a minimal setup sketch follows the list):

Adadelta documentation

Adagrad documentation

Adam documentation

Ftrl documentation

GD documentation

Momentum documentation

RMSProp documentation
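The repository's own code isn't reproduced here, but a minimal sketch of the setup described above, assuming TensorFlow 1.x-style `tf.train` optimizers, might look like the following. The starting point, loss surface, and learning rates are illustrative placeholders, not the values used in the repository's figures.

```python
import tensorflow as tf

def build_minimizer(optimizer, start=(0.75, 1.0)):
    # Each optimizer gets its own (x, y) pair, all starting at the same point.
    x = tf.Variable(start[0], dtype=tf.float32)
    y = tf.Variable(start[1], dtype=tf.float32)
    # Illustrative loss surface; the repository defines its own test function.
    loss = tf.square(x) + 10.0 * tf.square(y)
    return x, y, optimizer.minimize(loss)

# Learning rates below are illustrative, not the values shown in the figures.
optimizers = {
    'Adadelta': tf.train.AdadeltaOptimizer(50.0),
    'Adagrad':  tf.train.AdagradOptimizer(0.10),
    'Adam':     tf.train.AdamOptimizer(0.05),
    'Ftrl':     tf.train.FtrlOptimizer(0.10),
    'GD':       tf.train.GradientDescentOptimizer(0.05),
    'Momentum': tf.train.MomentumOptimizer(0.01, momentum=0.9),
    'RMSProp':  tf.train.RMSPropOptimizer(0.02),
}

graphs = {name: build_minimizer(opt) for name, opt in optimizers.items()}
paths = {name: [] for name in optimizers}

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        for name, (x, y, train_op) in graphs.items():
            _, xv, yv = sess.run([train_op, x, y])
            paths[name].append((xv, yv))  # trajectory for later plotting
```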

For an overview of these gradient descent optimization algorithms, visit this helpful resource.

Numbers in the figure legend indicate the learning rate, which is tuned separately for each optimizer.
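The repository's plotting code isn't shown here; as a hedged sketch, such a legend could be produced with Matplotlib by reusing the `paths` recorded in the sketch above, with the same illustrative learning rates:

```python
# Sketch: plot each optimizer's recorded path over the loss contours and put
# the (illustrative) learning rate next to its name in the legend.
import numpy as np
import matplotlib.pyplot as plt

X, Y = np.meshgrid(np.linspace(-1.5, 1.5, 200), np.linspace(-1.5, 1.5, 200))
plt.contour(X, Y, X**2 + 10.0 * Y**2, levels=20)

# Hypothetical per-optimizer learning rates, matching the earlier sketch.
learning_rates = {'Adadelta': 50.0, 'Adagrad': 0.10, 'Adam': 0.05,
                  'Ftrl': 0.10, 'GD': 0.05, 'Momentum': 0.01, 'RMSProp': 0.02}

for name, path in paths.items():  # `paths` comes from the setup sketch above
    px, py = zip(*path)
    plt.plot(px, py, label=f'{name} ({learning_rates[name]})')

plt.legend()
plt.show()
```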

Note the optimizers' behavior when the gradient is steep.

Note the optimizers' behavior when the initial gradient is minuscule.

Inspired by the following GIFs:

From here

About

Visualize TensorFlow's optimizers.

License: MIT License


Languages

Python 100.0%