fredericoschardong / ffnet-single-hidden-layer-hyper-parameterization

Python implementation that explores how different hyperparameters impact a feed-forward neural network with a single hidden layer, trained with gradient descent


Feed-Forward Neural Network Single Hidden Layer Hyper-Parameterization

Python implementation that explores how different hyperparameters impact a feed-forward neural network with a single hidden layer.

A brief analysis of the results is provided in Portuguese. It was submitted as an assignment for the graduate course Connectionist Artificial Intelligence at UFSC, Brazil.

In short, sine and cosine values are fed to the feed-forward network, which tries to learn and predict them. Different numbers of hidden neurons, training instances, learning rates, epochs, and activation functions are tested separately; all experiments use gradient descent.
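
As a rough illustration of this setup (not the repository's actual code), here is a minimal NumPy sketch of a single-hidden-layer network trained by full-batch gradient descent to regress sine and cosine. The function name, the tanh activation, and the weight initialization are all assumptions:

```python
import numpy as np

def train_ffnet(X, Y, n_hidden, lr, epochs, rng):
    """Single hidden layer, tanh activation, linear output, MSE loss (assumed setup)."""
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))   # input -> hidden weights
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))  # hidden -> output weights
    b2 = np.zeros(n_out)
    for _ in range(epochs):
        # forward pass
        H = np.tanh(X @ W1 + b1)
        out = H @ W2 + b2
        # backward pass: gradient of the mean squared error
        d_out = 2.0 * (out - Y) / len(X)
        d_H = (d_out @ W2.T) * (1.0 - H**2)       # tanh derivative
        # full-batch gradient descent update
        W2 -= lr * (H.T @ d_out)
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_H)
        b1 -= lr * d_H.sum(axis=0)
    return lambda Xn: np.tanh(Xn @ W1 + b1) @ W2 + b2
```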

The base case uses 10 neurons in the hidden layer, 200 training instances, 20,000 epochs, a 0.005 learning rate, and some noise added to the data (see the base-case plot in the repository).
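
For concreteness, the base case could be reproduced with the sketch above roughly as follows; the input range, random seed, and noise scale are assumptions (the write-up says only "some noise"):

```python
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2 * np.pi, (200, 1))  # 200 training instances
Y = np.hstack([np.sin(X), np.cos(X)])      # targets: sine and cosine
Y += rng.normal(0.0, 0.1, Y.shape)         # "some noise" (scale assumed)
predict = train_ffnet(X, Y, n_hidden=10, lr=0.005, epochs=20000, rng=rng)
```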

The result folder holds the results of the other scenarios, where different numbers of neurons, training instances, epochs, learning rates, and noise levels are tested.


License: MIT

