tutor-grad-mlp

An explicit backpropagation example in numpy for an MLP on MNIST. The network is trained with gradient descent plus momentum on vectorized (flattened) MNIST images and reaches acceptable accuracy. Check out the notebook for an example of how to create the network, train it, and evaluate it.
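
As a rough illustration of the update rule mentioned above, here is a minimal gradient-descent-with-momentum sketch in numpy. The names (`momentum_step`, `velocity`, `lr`, `beta`) are placeholders for illustration and are not the repo's actual API:

```python
import numpy as np

def momentum_step(w, grad_w, velocity, lr=0.01, beta=0.9):
    """One gradient-descent-with-momentum update (hypothetical names, not the repo's API)."""
    velocity = beta * velocity - lr * grad_w   # exponentially decaying running velocity
    return w + velocity, velocity              # move the weights along the velocity

# toy usage: a single weight matrix and a dummy gradient
w = np.random.randn(784, 128) * 0.01
v = np.zeros_like(w)
grad = np.random.randn(*w.shape)
w, v = momentum_step(w, grad, v)
```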



Currently working activation functions (a numpy sketch of their forward and backward forms follows this list):

  • softmax (last layer only)
  • sigmoid
  • relu
  • tanh
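
The sketch below shows generic numpy implementations of these activations and the derivatives backpropagation needs; it illustrates the math rather than the repo's exact code:

```python
import numpy as np

# Forward passes for the listed activations (generic sketch, not the repo's exact implementation).
def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))
def relu(x):    return np.maximum(0.0, x)
def tanh(x):    return np.tanh(x)

def softmax(x):
    # subtract the row-wise max for numerical stability; used on the last layer only
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Derivatives w.r.t. the pre-activation, written in terms of the forward output `a`.
def d_sigmoid(a): return a * (1.0 - a)
def d_relu(a):    return (a > 0).astype(a.dtype)
def d_tanh(a):    return 1.0 - a ** 2
# softmax is typically paired with cross-entropy, so the combined gradient
# reduces to (probabilities - one_hot_labels)
```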

Currently working layer functions (a forward/backward sketch follows this list):

  • fully connected with bias
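
For reference, a generic forward/backward pass for a fully connected layer with bias looks like the following; function names and argument order are illustrative and may differ from the repo's actual code:

```python
import numpy as np

def fc_forward(x, W, b):
    # x: (batch, in_dim), W: (in_dim, out_dim), b: (out_dim,)
    return x @ W + b

def fc_backward(x, W, grad_out):
    # grad_out: gradient of the loss w.r.t. the layer output, shape (batch, out_dim)
    grad_W = x.T @ grad_out          # gradient for the weights, (in_dim, out_dim)
    grad_b = grad_out.sum(axis=0)    # gradient for the bias, (out_dim,)
    grad_x = grad_out @ W.T          # gradient passed to the previous layer, (batch, in_dim)
    return grad_x, grad_W, grad_b
```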


License: MIT License


Languages

  • Jupyter Notebook: 90.2%
  • Python: 9.8%