Russolves / Coding-step-size-optimizers

Hand-coding step-size optimizers (momentum and Adam) for deep-learning neural networks

Coding-step-size-optimizers

Hand-coded step-size optimizers (momentum and Adam) for deep-learning neural networks, implemented by extending the update logic of plain stochastic gradient descent.

Project Description

The project contains a ComputationalGraphPrimer file with all the classes required to train a model with plain stochastic gradient descent (SGD). The two other files, for the single-neuron classifier and the multi-neuron classifier, override the relevant methods of those classes to implement SGD with momentum and Adam, as sketched below.
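As a rough illustration of the update rules that replace the plain SGD step, the sketch below shows a momentum step and an Adam step applied to NumPy arrays. This is not the repository's actual code; the function names, signatures, and hyperparameter defaults are assumptions chosen for clarity.

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=1e-3, mu=0.9):
    """One SGD-with-momentum update: the velocity keeps a decaying running
    sum of past gradients, and the parameter moves along that velocity."""
    velocity = mu * velocity - lr * grad
    return param + velocity, velocity

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: bias-corrected first and second moment estimates of
    the gradient scale the step size separately for each parameter."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)                # bias correction for the second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

if __name__ == "__main__":
    # Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
    w_mom, vel = np.ones(3), np.zeros(3)
    w_adam, m, v = np.ones(3), np.zeros(3), np.zeros(3)
    for t in range(1, 201):
        w_mom, vel = sgd_momentum_step(w_mom, 2 * w_mom, vel, lr=0.05)
        w_adam, m, v = adam_step(w_adam, 2 * w_adam, m, v, t, lr=0.05)
    print("momentum:", w_mom)
    print("adam:    ", w_adam)
```

In the repository, updates of this form presumably replace the `param -= lr * grad` step of plain SGD inside the overridden training methods.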

About

Hand-coding step-size optimizers (momentum and Adam) for deep-learning neural networks


Languages

Language: Python 100.0%