This is a from-scratch implementation of a feed-forward artificial neural network (FFANN) that can be easily customized for different datasets. The network is currently tuned with 16 hidden layers and performs accurately on the MNIST dataset. The weight vectors are randomly initialized and updated dynamically as the signal propagates through the perceptrons. Sigmoid is used as the activation function, which works well for this dataset.
h_nodes = number of perceptrons per hidden layer
h_layers = number of hidden layers
learningRate = step size for the weight updates (self-explanatory)
samples //
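The structure described above can be sketched roughly as follows. This is a minimal, hypothetical illustration rather than the actual implementation: the class name `FFANN`, the `forward`/`train_sample` methods, and the input/output sizes are assumptions, while `h_nodes`, `h_layers`, and `learningRate` follow the parameter names listed above.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation, as used by the network described above
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, needed for backpropagation
    s = sigmoid(z)
    return s * (1.0 - s)

class FFANN:
    # Hypothetical sketch; the real implementation may differ.
    def __init__(self, inputs, h_nodes, h_layers, outputs, learningRate):
        self.learningRate = learningRate
        # Layer sizes: input, h_layers hidden layers of h_nodes each, output
        sizes = [inputs] + [h_nodes] * h_layers + [outputs]
        # Randomly initialized weights, one matrix per layer transition
        self.weights = [np.random.randn(m, n) * 0.1
                        for n, m in zip(sizes[:-1], sizes[1:])]
        self.biases = [np.random.randn(m, 1) * 0.1 for m in sizes[1:]]

    def forward(self, x):
        # Propagate a column vector x through every layer
        a = x
        for W, b in zip(self.weights, self.biases):
            a = sigmoid(W @ a + b)
        return a

    def train_sample(self, x, y):
        # Forward pass, storing pre-activations (zs) and activations
        a, activations, zs = x, [x], []
        for W, b in zip(self.weights, self.biases):
            z = W @ a + b
            zs.append(z)
            a = sigmoid(z)
            activations.append(a)
        # Backward pass (squared-error loss): weights are updated
        # dynamically, layer by layer, as described above
        delta = (activations[-1] - y) * sigmoid_prime(zs[-1])
        for l in range(len(self.weights) - 1, -1, -1):
            gW = delta @ activations[l].T
            gb = delta
            if l > 0:
                # Propagate the error through the (pre-update) weights
                delta = (self.weights[l].T @ delta) * sigmoid_prime(zs[l - 1])
            self.weights[l] -= self.learningRate * gW
            self.biases[l] -= self.learningRate * gb
```

For MNIST this would be instantiated with 784 inputs and 10 outputs, e.g. `FFANN(784, h_nodes, 16, 10, learningRate)`; note that very deep sigmoid stacks can suffer from vanishing gradients, so a shallower configuration may train faster in practice.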