CodeBreaker444 / optimistic-amsgrad-for-optmization-implementation-deeplearning



Optimistic Adaptive Acceleration for Optimization on the CIFAR-10 dataset 🐶🐈🚘✈️ for image classification - Computer Vision

Predicting the next gradient before it is revealed can substantially reduce the number of epochs needed for training. OPTIMISTIC-AMSGRAD combines optimistic online learning with the adaptivity and momentum of AMSGRAD. The basis of the algorithm is optimistic online learning: the learner forms a good guess of the upcoming loss function before choosing an action, and then exploits that guess when choosing the action.
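Below is a minimal NumPy sketch of one OPTIMISTIC-AMSGRAD update, assuming the simplest possible gradient predictor (reusing the last observed gradient); the underlying paper derives the guess by extrapolating from past gradients, and the function and variable names here are illustrative rather than taken from this repository.

```python
import numpy as np

def optimistic_amsgrad_step(w_half, m, v, v_hat, grad, grad_guess,
                            lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One sketched OPTIMISTIC-AMSGRAD update.

    w_half     : intermediate iterate w_{t-1/2}
    grad       : gradient observed at the current (optimistic) iterate
    grad_guess : prediction of the *next* gradient (the "optimistic" part)
    """
    # Standard AMSGrad first/second moment estimates.
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    # AMSGrad keeps the running maximum of v so the effective step never grows.
    v_hat = np.maximum(v_hat, v)
    denom = np.sqrt(v_hat) + eps
    # Usual adaptive step to the intermediate iterate w_{t+1/2} ...
    w_half = w_half - lr * m / denom
    # ... plus an extra "optimistic" step along the predicted next gradient.
    w_next = w_half - lr * grad_guess / denom
    return w_next, w_half, m, v, v_hat
```

A toy usage on the quadratic f(w) = ||w||^2 / 2, where the gradient at w is simply w and the guess is the previous gradient:

```python
w_next = w_half = np.array([5.0, -3.0])
m = v = v_hat = np.zeros(2)
prev_grad = np.zeros(2)
for t in range(500):
    grad = w_next                     # gradient of the toy quadratic
    w_next, w_half, m, v, v_hat = optimistic_amsgrad_step(
        w_half, m, v, v_hat, grad, grad_guess=prev_grad, lr=0.1)
    prev_grad = grad
print(w_next)                         # approaches the minimizer at the origin
```

When the guess is accurate, the optimistic step moves the iterate where the ordinary AMSGRAD step would have gone next, which is what yields the faster convergence described above.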

About

The implementation shows that OPTIMISTIC-AMSGRAD improves on AMSGRAD across several measures: training loss, testing loss, and classification accuracy on training/testing data over epochs.


Languages

Language: Python 100.0%