Gradient descent from scratch

A two-part tutorial series implementing the gradient descent algorithm without the use of any machine learning libraries.

Background

Gradient descent is one of the most widely used optimization methods in modern machine learning. This two-part series is intended to help readers gain a better understanding of how it works by implementing it without the use of any machine learning libraries.
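As a rough illustration of the idea (not code taken from the notebooks), the sketch below applies the basic update rule x ← x − η·f′(x) to a simple one-dimensional function; the function, learning rate and starting point are arbitrary choices made for this example.

```python
# Minimal gradient descent sketch: minimise f(x) = (x - 3)^2.
# The gradient is f'(x) = 2 * (x - 3), and the minimum sits at x = 3.

def gradient(x):
    return 2.0 * (x - 3.0)

x = 0.0              # arbitrary starting point
learning_rate = 0.1  # step size, often written as eta

for step in range(100):
    x = x - learning_rate * gradient(x)  # step against the gradient

print(x)  # converges towards 3.0
```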

A basic understanding of calculus, linear algebra and Python programming is required to get the most out of these tutorials.

Format

Jupyter notebooks that contain explanations of underlying concepts followed by code that can be run from within the notebook.

Part 1 - Introduction to gradient descent on a simple linear regression problem (a rough sketch of this kind of setup appears below)

Part 2 - Training a neural network to classify handwritten digits

The code is written for readability and is heavily commented to aid beginners; it is not intended as an exemplar of production code.
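To give a flavour of the Part 1 material, here is a hedged sketch of gradient descent fitting a simple linear regression model with NumPy only; the synthetic data, learning rate and iteration count are illustrative assumptions rather than values taken from the notebook.

```python
import numpy as np

# Synthetic data roughly following y = 2x + 1 (illustrative only).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + 1.0 + rng.normal(0, 1, size=100)

w, b = 0.0, 0.0        # slope and intercept, initialised to zero
learning_rate = 0.01
n = len(x)

for _ in range(5000):
    y_pred = w * x + b            # current predictions
    error = y_pred - y
    # Gradients of the mean squared error with respect to w and b.
    grad_w = (2.0 / n) * np.dot(error, x)
    grad_b = (2.0 / n) * np.sum(error)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(w, b)  # should end up close to 2 and 1
```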

Prerequisites

Jupyter Notebook
Python 3.x
NumPy, Matplotlib

Usage
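
Assuming the repository's standard GitHub clone URL (https://github.com/CamNZ/gradient-descent-from-scratch) and a working Python 3 environment, a typical workflow is to clone the repository, install NumPy, Matplotlib and Jupyter Notebook with pip, launch Jupyter Notebook from the repository root, and then work through the Part 1 and Part 2 notebooks cell by cell in order.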

Feedback

Any constructive feedback on how this could be improved is welcome.
