HugoRodriguesQW / neuron

A feedforward neural network that can learn simple tasks, packaged to be easy to use and to facilitate evolving successive generations.

Activation Functions

HugoRodriguesQW opened this issue

The activation function is applied to each neuron in the neural network and defines how the neuron processes its input and generates its output. It introduces non-linearity into the network, allowing it to learn complex relationships and patterns in the data. Without a non-linear activation function, the neural network would be equivalent to a single linear layer, making it incapable of solving non-linear problems.

Activation functions transform the weighted sum of a neuron's input values (also called the raw activation) into a non-linear output. Some common activation functions are sigmoid, tanh, ReLU, and Leaky ReLU, among others. Each activation function has its own characteristics and affects how neurons activate or inhibit their connections in response to different input patterns.
- by ChatGPT
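As a concrete illustration (a generic sketch, not code from this repository), here is how a single neuron's raw activation, the weighted sum of its inputs plus a bias, can be passed through some of these activation functions:

```typescript
// Common activation functions applied to a neuron's raw activation.
const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
const tanh = (x: number): number => Math.tanh(x);
const relu = (x: number): number => Math.max(0, x);
const leakyRelu = (x: number, alpha = 0.01): number => (x >= 0 ? x : alpha * x);

// Raw activation: weighted sum of the inputs, starting from the bias.
function rawActivation(inputs: number[], weights: number[], bias: number): number {
  return inputs.reduce((sum, input, i) => sum + input * weights[i], bias);
}

// Example: one neuron with two inputs.
const z = rawActivation([0.5, -1.2], [0.8, 0.3], 0.1); // z = 0.14
console.log(sigmoid(z), tanh(z), relu(z), leakyRelu(z));
```

Swapping the activation changes how the same raw activation `z` maps to the neuron's output, which is exactly the non-linearity described above.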

ReLU activation was built in, enabling deeper learning. With only 1600 iterations, the network was able to discern characters drawn in 5x5 px frames. Of course this may be a poor result compared to other AIs, but I can't help saying:

"this is awesome!"