HugoRodriguesQW / neuron

A feedforward neural network that can learn simple tasks, packaged to be easy to use and to facilitate the evolution of generations.


Gradient

HugoRodriguesQW opened this issue

Gradients

The gradient, specifically in gradient descent, drives an optimization algorithm used to adjust neural network weights during training. The objective of training is to minimize a loss function, which measures the difference between the outputs predicted by the network and the actual outputs in the training set. The gradient is the derivative of the loss function with respect to the network weights, and it points in the direction in which the loss function increases most quickly; gradient descent therefore moves the weights in the opposite direction.

In the training process, the gradient is computed with respect to the network weights and used to update those weights so as to minimize the loss function. The gradient determines the magnitude and direction of the weight update in each training iteration. This allows the network to fit the training data and learn to perform the specific task for which it is being trained.
- by ChatGPT
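As a rough illustration of the update rule described above, here is a minimal sketch of a single gradient-descent step. The function name and signature are hypothetical, not part of the neuron library; it assumes the gradients have already been computed, e.g. by backpropagation.

```typescript
// A single gradient-descent step. Hypothetical names; gradients are
// assumed to come from backpropagation.
function gradientDescentStep(
  weights: number[],
  gradients: number[], // dLoss/dW for each weight
  learningRate: number
): number[] {
  // Step against the gradient: the gradient points toward the steepest
  // increase of the loss, so subtracting it moves the loss downhill.
  return weights.map((w, i) => w - learningRate * gradients[i]);
}
```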

Gradient Descent (GD) and Stochastic Gradient Descent (SGD)

[image: opt_comparison]
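The difference between the two schedules: full-batch GD computes the gradient over the whole training set before each update, while SGD updates after every single example. A minimal sketch, assuming a hypothetical `grad` helper that returns dLoss/dW for one example:

```typescript
type Example = { input: number[]; target: number[] };
// Assumed helper type: returns dLoss/dW for a single training example.
type GradFn = (weights: number[], ex: Example) => number[];

// Full-batch GD: one update per epoch, from the gradient averaged
// over the entire training set.
function gdEpoch(w: number[], data: Example[], lr: number, grad: GradFn): number[] {
  const avg = new Array(w.length).fill(0);
  for (const ex of data) {
    grad(w, ex).forEach((g, i) => { avg[i] += g / data.length; });
  }
  return w.map((wi, i) => wi - lr * avg[i]);
}

// SGD: one noisier, cheaper update per example.
function sgdEpoch(w: number[], data: Example[], lr: number, grad: GradFn): number[] {
  let weights = w.slice();
  for (const ex of data) {
    const g = grad(weights, ex);
    weights = weights.map((wi, i) => wi - lr * g[i]);
  }
  return weights;
}
```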

Mini-batch

[image: opt_part]
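Mini-batch sits between the two: the gradient is averaged over a small batch (for example 32 examples) before each update. A sketch under the same assumptions as above, with the types redeclared so the snippet stands alone:

```typescript
type Example = { input: number[]; target: number[] };
type GradFn = (weights: number[], ex: Example) => number[];

// Mini-batch: average the gradient over a small batch before each update,
// a middle ground between full-batch GD and per-example SGD.
function miniBatchEpoch(
  w: number[],
  data: Example[],
  lr: number,
  grad: GradFn,
  batchSize = 32
): number[] {
  let weights = w.slice();
  for (let start = 0; start < data.length; start += batchSize) {
    const batch = data.slice(start, start + batchSize);
    const avg = new Array(weights.length).fill(0);
    for (const ex of batch) {
      grad(weights, ex).forEach((g, i) => { avg[i] += g / batch.length; });
    }
    weights = weights.map((wi, i) => wi - lr * avg[i]);
  }
  return weights;
}
```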

Momentum

[image: opt_momentum]
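Momentum accumulates past gradients into a velocity term, which damps oscillations and speeds progress along consistent directions. A minimal sketch; the formulation (v = βv + g) and the default β = 0.9 are common choices, not taken from this repository:

```typescript
// One momentum step. The velocity vector persists between steps and
// starts at all zeros. Names are illustrative.
function momentumStep(
  weights: number[],
  gradients: number[],
  velocity: number[],
  lr: number,
  beta = 0.9
): { weights: number[]; velocity: number[] } {
  // Decay the old velocity and add the current gradient.
  const v = velocity.map((vi, i) => beta * vi + gradients[i]);
  const w = weights.map((wi, i) => wi - lr * v[i]);
  return { weights: w, velocity: v };
}
```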

Adam

[image: opt_momentum]
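Adam combines a momentum-style first-moment estimate with an RMSprop-style second-moment estimate, both bias-corrected. A sketch following the defaults from Kingma & Ba (2015); names are illustrative, not the repository's implementation:

```typescript
// One Adam step. m and v persist between steps and start at zeros;
// t is the step counter, starting at 1.
function adamStep(
  weights: number[],
  gradients: number[],
  m: number[],
  v: number[],
  t: number,
  lr = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-8
): { weights: number[]; m: number[]; v: number[] } {
  const mNew = m.map((mi, i) => beta1 * mi + (1 - beta1) * gradients[i]);
  const vNew = v.map((vi, i) => beta2 * vi + (1 - beta2) * gradients[i] ** 2);
  const w = weights.map((wi, i) => {
    const mHat = mNew[i] / (1 - beta1 ** t); // bias correction
    const vHat = vNew[i] / (1 - beta2 ** t);
    return wi - (lr * mHat) / (Math.sqrt(vHat) + eps);
  });
  return { weights: w, m: mNew, v: vNew };
}
```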

RMSprop

[image: opt_momentum]
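RMSprop keeps a decaying average of squared gradients and scales each weight's update by its inverse square root, giving every weight its own effective learning rate. A sketch with commonly used defaults; again an assumption, not this repository's code:

```typescript
// One RMSprop step. sqAvg is the running average of squared gradients,
// persisted between steps and starting at zeros.
function rmspropStep(
  weights: number[],
  gradients: number[],
  sqAvg: number[],
  lr = 0.001, rho = 0.9, eps = 1e-8
): { weights: number[]; sqAvg: number[] } {
  const s = sqAvg.map((si, i) => rho * si + (1 - rho) * gradients[i] ** 2);
  const w = weights.map(
    (wi, i) => wi - (lr * gradients[i]) / (Math.sqrt(s[i]) + eps)
  );
  return { weights: w, sqAvg: s };
}
```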