graydonhope / Digit-Recognition

Neural Network with Back Propagation to Recognize Handwritten Digits


Digit-Recognition

Training Accuracy: 96.7%

This project implements a neural network and applies it to handwritten digit recognition. The program successfully recognizes digits and classifies them accordingly.

Some theory about the project...

Regularized Cost Function

The regularized cost function for a neural network with 3 layers is given by:

J(\Theta) = \frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{K} \Big[ -y_k^{(i)} \log\big( (h_\Theta(x^{(i)}))_k \big) - \big(1 - y_k^{(i)}\big) \log\big( 1 - (h_\Theta(x^{(i)}))_k \big) \Big] + \frac{\lambda}{2m} \Big[ \sum_{j,k} \big(\Theta_{j,k}^{(1)}\big)^2 + \sum_{j,k} \big(\Theta_{j,k}^{(2)}\big)^2 \Big]

where the regularization sums run over all non-bias weights.
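A minimal MATLAB sketch of this computation, assuming `Theta1` and `Theta2` are the two weight matrices, `h` is the m-by-K matrix of network outputs, and `Y` is the m-by-K matrix of one-hot encoded labels (variable names are illustrative, not necessarily those used in this repository):

```matlab
% Regularized cost, assuming h (m x K outputs) and Y (m x K one-hot labels)
m = size(Y, 1);

% Cross-entropy cost summed over all examples and output classes
J = (1 / m) * sum(sum(-Y .* log(h) - (1 - Y) .* log(1 - h)));

% Regularization term: the bias column of each weight matrix is excluded
reg = (lambda / (2 * m)) * (sum(sum(Theta1(:, 2:end) .^ 2)) + ...
                            sum(sum(Theta2(:, 2:end) .^ 2)));

J = J + reg;
```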

Sigmoid Gradient

The backpropagation algorithm requires the gradient of the sigmoid function to be computed. Sigmoid function:

g(z) = \frac{1}{1 + e^{-z}}

Sigmoid gradient:

g'(z) = \frac{d}{dz}\, g(z) = g(z)\big(1 - g(z)\big)
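A short MATLAB sketch of this function (the function name follows the course convention; the repository's own file may differ):

```matlab
function g = sigmoidGradient(z)
% SIGMOIDGRADIENT Derivative of the sigmoid, evaluated element-wise at z.
  s = 1.0 ./ (1.0 + exp(-z));   % g(z)
  g = s .* (1 - s);             % g'(z) = g(z) * (1 - g(z))
end
```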

Backpropagation Implementation

Gradients with regularization:

\frac{\partial J}{\partial \Theta_{ij}^{(l)}} = \frac{1}{m} \Delta_{ij}^{(l)} \qquad \text{for } j = 0

\frac{\partial J}{\partial \Theta_{ij}^{(l)}} = \frac{1}{m} \Delta_{ij}^{(l)} + \frac{\lambda}{m} \Theta_{ij}^{(l)} \qquad \text{for } j \ge 1

where the error terms are

\delta^{(3)} = a^{(3)} - y \qquad \text{and} \qquad \delta^{(2)} = \big(\Theta^{(2)}\big)^{T} \delta^{(3)} \odot g'\big(z^{(2)}\big)

These are used to accumulate

\Delta^{(l)} = \Delta^{(l)} + \delta^{(l+1)} \big(a^{(l)}\big)^{T}
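A vectorized MATLAB sketch of the backpropagation pass for a 3-layer network, assuming `a1`, `a2`, `a3` are the layer activations (with bias units prepended), `z2` is the hidden-layer pre-activation, and `Y` is the m-by-K one-hot label matrix; all names are illustrative:

```matlab
% Error terms
delta3 = a3 - Y;                                              % output layer
delta2 = (delta3 * Theta2(:, 2:end)) .* sigmoidGradient(z2);  % hidden layer

% Accumulate gradients over all training examples
Delta1 = delta2' * a1;
Delta2 = delta3' * a2;

% Regularized gradients; the bias column (j = 0) is not regularized
Theta1_grad = Delta1 / m;
Theta1_grad(:, 2:end) = Theta1_grad(:, 2:end) + (lambda / m) * Theta1(:, 2:end);

Theta2_grad = Delta2 / m;
Theta2_grad(:, 2:end) = Theta2_grad(:, 2:end) + (lambda / m) * Theta2(:, 2:end);
```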

Gradient Checking

An important part of implementing a neural network is numerically validating the gradients produced by backpropagation. The numerical check uses an alternative, finite-difference approach: each parameter \theta_i is perturbed in turn,

\theta^{(i+)} = \theta + \varepsilon\, e_i \qquad \theta^{(i-)} = \theta - \varepsilon\, e_i

where e_i is the i-th unit vector and \varepsilon is a small constant (e.g. 10^{-4}).

This gives the approximation

\frac{\partial}{\partial \theta_i} J(\theta) \approx \frac{J\big(\theta^{(i+)}\big) - J\big(\theta^{(i-)}\big)}{2\varepsilon}

The gradient check is run on a small set of test values to confirm that the backpropagation implementation is correct, and is then disabled because it is computationally expensive.
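A sketch of the two-sided finite-difference check in MATLAB (the course materials provide an equivalent helper; `J` here is a function handle returning the cost for an unrolled parameter vector):

```matlab
function numgrad = computeNumericalGradient(J, theta)
% COMPUTENUMERICALGRADIENT Finite-difference approximation of the gradient
% of the cost function handle J at the unrolled parameter vector theta.
  numgrad = zeros(size(theta));
  perturb = zeros(size(theta));
  e = 1e-4;                                % epsilon
  for i = 1:numel(theta)
    perturb(i) = e;
    loss1 = J(theta - perturb);            % J(theta^(i-))
    loss2 = J(theta + perturb);            % J(theta^(i+))
    numgrad(i) = (loss2 - loss1) / (2 * e);
    perturb(i) = 0;
  end
end
```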

This implementation then uses fmincg to learn the parameter values.
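A sketch of how the training call typically looks, assuming a cost function `nnCostFunction` that returns the cost and unrolled gradient (layer sizes, iteration count, and lambda are illustrative):

```matlab
options = optimset('MaxIter', 50);
lambda = 1;

% Handle that evaluates cost and gradient for the unrolled parameters
costFunction = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                                   num_labels, X, y, lambda);

% fmincg minimizes the cost starting from randomly initialized parameters
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);

% Reshape the learned parameters back into the two weight matrices
Theta1 = reshape(nn_params(1:hidden_layer_size * (input_layer_size + 1)), ...
                 hidden_layer_size, input_layer_size + 1);
Theta2 = reshape(nn_params((1 + hidden_layer_size * (input_layer_size + 1)):end), ...
                 num_labels, hidden_layer_size + 1);
```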

Some Dataset Examples

(Figure: sample handwritten digit images from the training set.)

Based on Stanford's Machine Learning course, taught by Professor Andrew Ng.



Languages

Language: MATLAB 100.0%