amaneth / Deep-learning-optimizers


Optimizers implementation from scratch

We implement several deep-learning optimizers from scratch: Batch Gradient Descent, Stochastic Gradient Descent, Mini-batch Gradient Descent, Momentum, Adadelta, Adagrad, RMSprop, and Adam.
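As a flavor of what "from scratch" means here, the update rule of one of the listed optimizers, Adam, can be sketched in plain NumPy. This is an illustrative sketch, not code from the repository; the function name `adam_step` and the toy objective are assumptions for the example.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update (illustrative, not the repo's implementation)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)              # bias correction for the second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)
```

The other optimizers in the list differ mainly in which of these pieces they keep: plain SGD drops both moment estimates, Momentum keeps only `m`, and RMSprop/Adagrad/Adadelta keep only a `v`-like accumulator.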


Languages

Language: Python 100.0%