This is an implementation of a two-layer neural network using NumPy. The training method is stochastic (online) gradient descent with momentum, and the network learns the XOR function. Each layer has its own activation function: tanh for the hidden layer and sigmoid for the output layer. Cross-entropy is used as the loss function.
- NumPy
In your terminal:
- clone the repository: `git clone https://github.com/harshitcodes/my_first_neural_network`
- `cd` into the directory
- run `python two_layer_neural_net.py`
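The architecture described above (tanh hidden layer, sigmoid output, cross-entropy loss, online gradient descent with momentum) can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code; the hidden-layer size, learning rate, momentum coefficient, and epoch count are assumed values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers: 2 -> n_hidden (tanh), n_hidden -> 1 (sigmoid)
n_hidden = 4                                   # assumed size
W1 = rng.normal(0, 1, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

lr, mu = 0.1, 0.9                              # assumed learning rate / momentum
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(3000):
    for i in rng.permutation(len(X)):          # stochastic: one sample at a time
        x, t = X[i:i + 1], y[i:i + 1]
        # forward pass
        h = np.tanh(x @ W1 + b1)               # hidden layer, tanh activation
        p = sigmoid(h @ W2 + b2)               # output layer, sigmoid activation
        # backward pass: with sigmoid output + cross-entropy loss,
        # the output-layer error simplifies to (p - t)
        d2 = p - t
        d1 = (d2 @ W2.T) * (1.0 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
        # momentum updates: velocity accumulates past gradients
        vW2 = mu * vW2 - lr * (h.T @ d2); W2 += vW2
        vb2 = mu * vb2 - lr * d2.sum(0);  b2 += vb2
        vW1 = mu * vW1 - lr * (x.T @ d1); W1 += vW1
        vb1 = mu * vb1 - lr * d1.sum(0);  b1 += vb1

# Final forward pass over all four inputs
pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
print(pred.ravel())
```

With these settings the rounded outputs typically approach `[0, 1, 1, 0]`, though convergence depends on the random initialization and hyperparameters.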