aysebilgegunduz / FeedFwBackProp

Simple multi-layer perceptron application using the feed-forward backpropagation algorithm

Feed Forward Back Propagation MLP Application

Simple multi-layer perceptron application using the feed-forward backpropagation algorithm.

Parametric variables (a configuration sketch follows the list):

  • Number of hidden layers
  • Neuron count per layer, for the input, hidden, and output layers
  • Activation function: sigmoid (1), tanh (2), or ReLU (3)
  • Learning rate (default: 0.5)
  • Weight-update procedure: delta bar (1), adaptive learning (2), or momentum (3)
  • Epoch count
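
The snippet below is a minimal sketch, in plain NumPy, of how these parameters could map onto a feed-forward/backpropagation MLP. It is not code from this repository: the class name `SimpleMLP`, its constructor arguments, and the plain gradient-descent update are illustrative assumptions, and the delta bar / adaptive learning / momentum weight-update variants are only noted in a comment.

```python
# Minimal MLP sketch in NumPy. All names here (SimpleMLP, ACTIVATIONS, the
# constructor arguments) are illustrative assumptions, not the repo's API.
import numpy as np

# Activation functions keyed by the numeric codes used in the parameter list:
# 1 = sigmoid, 2 = tanh, 3 = ReLU. Each entry is (function, derivative
# expressed in terms of the function's output), which is what plain
# backpropagation needs here.
ACTIVATIONS = {
    1: (lambda x: 1.0 / (1.0 + np.exp(-x)), lambda y: y * (1.0 - y)),
    2: (np.tanh,                            lambda y: 1.0 - y ** 2),
    3: (lambda x: np.maximum(0.0, x),       lambda y: (y > 0).astype(float)),
}

class SimpleMLP:
    def __init__(self, layer_sizes, activation=1, learning_rate=0.5, seed=0):
        # layer_sizes = [inputs, hidden_1, ..., hidden_k, outputs]
        rng = np.random.default_rng(seed)
        self.weights = [rng.standard_normal((m, n)) * 0.1
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(n) for n in layer_sizes[1:]]
        self.act, self.act_deriv = ACTIVATIONS[activation]
        self.lr = learning_rate

    def forward(self, x):
        # Feed forward, keeping every layer's output for the backward pass.
        outputs = [x]
        for W, b in zip(self.weights, self.biases):
            outputs.append(self.act(outputs[-1] @ W + b))
        return outputs

    def train(self, X, Y, epochs=100):
        # Plain per-sample gradient descent. The delta bar / adaptive /
        # momentum procedures from the list would change how `grad` is applied.
        for _ in range(epochs):
            for x, y in zip(X, Y):
                outs = self.forward(x)
                delta = (outs[-1] - y) * self.act_deriv(outs[-1])
                for i in reversed(range(len(self.weights))):
                    grad = np.outer(outs[i], delta)
                    prev_delta = (delta @ self.weights[i].T) * self.act_deriv(outs[i])
                    self.weights[i] -= self.lr * grad
                    self.biases[i] -= self.lr * delta
                    delta = prev_delta

# Usage: learn XOR with one hidden layer of 4 neurons, sigmoid, learning rate 0.5.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
net = SimpleMLP([2, 4, 1], activation=1, learning_rate=0.5)
net.train(X, Y, epochs=5000)
print([float(np.round(net.forward(x)[-1][0], 2)) for x in X])
```

Choosing tanh or ReLU only swaps the activation/derivative pair; the layer sizes, learning rate, and epoch count map directly onto the constructor and `train` arguments in this sketch.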

About

License: GNU General Public License v3.0


Languages

Language: Python 100.0%