# optimizer-using-numpy
## Presentation
Here is the code associated with my blog post: https://aidri.github.io/emping/blog/Overview-optimization-algorithm.
I implemented the four algorithms studied in the post:
- SGD
- SGD + momentum
- SGD + Nesterov momentum
- Adam
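As a rough sketch (not necessarily the exact code in this repository), the four update rules can be written with NumPy as follows; the parameter names (`lr`, `beta`, `beta1`, `beta2`, `eps`) are illustrative defaults, not the repository's variable names:

```python
import numpy as np

def sgd_step(theta, grad, lr=0.01):
    # Plain SGD: step against the gradient
    return theta - lr * grad

def momentum_step(theta, v, grad, lr=0.01, beta=0.9):
    # SGD + momentum: accumulate an exponentially decaying velocity
    v = beta * v + grad
    return theta - lr * v, v

def nesterov_step(theta, v, grad_fn, lr=0.01, beta=0.9):
    # SGD + Nesterov momentum: the gradient is evaluated at a look-ahead point
    grad = grad_fn(theta - lr * beta * v)
    v = beta * v + grad
    return theta - lr * v, v

def adam_step(theta, m, v, grad, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: bias-corrected estimates of the first and second gradient moments
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```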
Each optimizer tries to reach the global minimum while avoiding getting stuck in a local minimum.
The function studied is the Beale function.
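For reference, the Beale function and its analytical gradient can be written as follows (the global minimum is at (3, 0.5), where f = 0):

```python
import numpy as np

def beale(x, y):
    # Beale function; global minimum f(3, 0.5) = 0
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(x, y):
    # Analytical partial derivatives of the Beale function
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * 2 * x * y + 2 * t3 * 3 * x * y ** 2
    return np.array([dx, dy])
```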
(Depending on their starting position, the optimizers may converge to different points, because the global minimum of this function is not very pronounced.)
You can adapt these optimizers to your needs by replacing the objective function and its derivatives.
You can also tune the hyperparameters: the learning rate η (eta), epsilon, the beta coefficients, etc.
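Putting the pieces together, a minimal illustrative Adam run on the Beale function could look like the sketch below; the starting point and hyperparameter values are arbitrary choices, not those used in the blog post:

```python
import numpy as np

def beale(x, y):
    # Beale function; global minimum f(3, 0.5) = 0
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(x, y):
    # Analytical partial derivatives of the Beale function
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * 2 * x * y + 2 * t3 * 3 * x * y ** 2
    return np.array([dx, dy])

# Adam loop; hyperparameter values are illustrative
theta = np.array([1.0, 1.0])          # arbitrary starting point
m, v = np.zeros(2), np.zeros(2)
lr, beta1, beta2, eps = 0.01, 0.9, 0.999, 1e-8
for t in range(1, 5001):
    g = beale_grad(*theta)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
```

Swapping in your own problem only requires replacing `beale` and `beale_grad` and, if needed, adjusting the hyperparameters.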