Lamyaa-Zayed / Numerical-Optimization-for-Machine-Learning

Implementations of several optimization algorithms: Batch Gradient Descent, Mini-Batch Gradient Descent, Stochastic Gradient Descent, Momentum Gradient Descent, Nesterov Accelerated Gradient (NAG), Adagrad, RMSProp, and Adam.
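For orientation, here is a minimal sketch of the update rules in the gradient-descent family (batch, mini-batch/stochastic, momentum, NAG) on a small least-squares problem. This is illustrative only: the variable names, learning rate, and other hyperparameters are assumptions, not code taken from the repository's notebooks.

```python
# Sketch of the gradient-descent family on least squares.
# Illustrative only; names and hyperparameters are assumptions,
# not taken from the repository's notebooks.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

def grad(w, Xb, yb):
    # Gradient of the mean squared error for linear regression.
    return Xb.T @ (Xb @ w - yb) / len(yb)

lr, epochs, batch_size = 0.1, 50, 32

# Batch gradient descent: one update per pass over the full dataset.
w = np.zeros(3)
for _ in range(epochs):
    w -= lr * grad(w, X, y)

# Mini-batch gradient descent: update per sampled batch
# (batch_size = 1 gives plain stochastic gradient descent).
w = np.zeros(3)
for _ in range(epochs):
    idx = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        b = idx[start:start + batch_size]
        w -= lr * grad(w, X[b], y[b])

# Momentum: accumulate an exponentially decaying velocity.
w, v, beta = np.zeros(3), np.zeros(3), 0.9
for _ in range(epochs):
    v = beta * v - lr * grad(w, X, y)
    w += v

# NAG: evaluate the gradient at the look-ahead point w + beta * v
# before applying the velocity update.
w, v = np.zeros(3), np.zeros(3)
for _ in range(epochs):
    v = beta * v - lr * grad(w + beta * v, X, y)
    w += v
```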


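The adaptive methods scale each coordinate's step by running statistics of past gradients. Below is a hedged sketch of the Adagrad, RMSProp, and Adam updates on the same least-squares setup; epsilon, decay rates, and learning rates are the commonly used defaults, assumed here rather than read from the notebooks.

```python
# Sketch of the adaptive optimizers (Adagrad, RMSProp, Adam).
# Hyperparameters are common defaults, assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

def grad(w, Xb, yb):
    # Gradient of the mean squared error for linear regression.
    return Xb.T @ (Xb @ w - yb) / len(yb)

eps, lr, epochs = 1e-8, 0.1, 50

# Adagrad: per-coordinate step shrinks with the accumulated
# sum of squared gradients.
w, G = np.zeros(3), np.zeros(3)
for _ in range(epochs):
    g = grad(w, X, y)
    G += g ** 2
    w -= lr * g / (np.sqrt(G) + eps)

# RMSProp: replace the sum by an exponential moving average,
# so the effective step size does not decay to zero.
w, s, rho = np.zeros(3), np.zeros(3), 0.9
for _ in range(epochs):
    g = grad(w, X, y)
    s = rho * s + (1 - rho) * g ** 2
    w -= lr * g / (np.sqrt(s) + eps)

# Adam: moving averages of both the gradient (m) and its square (v),
# with bias correction for the zero initialization.
w, m, v = np.zeros(3), np.zeros(3), np.zeros(3)
beta1, beta2, alpha = 0.9, 0.999, 0.01
for t in range(1, epochs + 1):
    g = grad(w, X, y)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    w -= alpha * m_hat / (np.sqrt(v_hat) + eps)
```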


Languages

Jupyter Notebook: 99.4%
Python: 0.6%