Repositories under the adaptive-optimizer topic:
Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterate averaging, and decoupled weight decay
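Of the techniques listed above, Lookahead is the simplest to illustrate in isolation: a "slow" copy of the weights is pulled toward the "fast" weights of an inner optimizer every k steps. The sketch below wraps plain gradient descent; the hyperparameter values are illustrative defaults, not the repository's settings.

```python
import numpy as np

def lookahead_sgd(grad, x0, lr=0.1, k=5, alpha=0.5, steps=100):
    """Sketch of the Lookahead wrapper around plain gradient descent.

    Every k inner steps, the slow weights move a fraction alpha toward
    the fast weights, and the fast weights are reset to the slow ones.
    """
    slow = x0.copy()
    fast = x0.copy()
    for t in range(1, steps + 1):
        fast = fast - lr * grad(fast)          # inner ("fast") optimizer step
        if t % k == 0:                          # synchronize slow weights
            slow = slow + alpha * (fast - slow)
            fast = slow.copy()
    return slow
```

In the listed repository, the inner optimizer would be the quasi-hyperbolic rectified Adam variant rather than plain gradient descent; the wrapping logic is the same.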
This repository contains the code and models for our paper "Investigating and Mitigating Failure Modes in Physics-Informed Neural Networks (PINNs)".
The AFOF was developed to help MATLAB users obtain the optimal adaptive filters and their parameters for a specific application. Running this function requires the Signal Processing and DSP System Toolboxes. See the AFOF_user_guide PDF for instructions.
A novel optimizer that leverages the trend observed in the gradients (https://arxiv.org/pdf/2109.03820.pdf).
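One generic way to exploit a gradient "trend" is to track an exponential moving average of successive gradient differences and fold it into the update. The sketch below illustrates that idea only; it is not the algorithm from the cited paper, and the class name and hyperparameters (`beta`, `trend_weight`) are hypothetical.

```python
import numpy as np

class TrendSGD:
    """Hypothetical sketch: SGD augmented with an EMA of gradient
    differences as a crude trend signal. Not the paper's method."""

    def __init__(self, lr=0.01, beta=0.9, trend_weight=0.1):
        self.lr, self.beta, self.tw = lr, beta, trend_weight
        self.prev_g = None   # previous gradient
        self.trend = None    # EMA of gradient differences

    def step(self, x, g):
        if self.prev_g is None:
            self.trend = np.zeros_like(g)
        else:
            self.trend = self.beta * self.trend + (1 - self.beta) * (g - self.prev_g)
        self.prev_g = g.copy()
        # Nudge the update in the direction the gradient is drifting.
        return x - self.lr * (g + self.tw * self.trend)
```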
We introduce the new concept of (α,L,δ)-relative smoothness (see https://arxiv.org/pdf/2107.05765.pdf), which covers both relative smoothness and relative Lipschitz continuity. For the corresponding class of problems, we propose adaptive and universal methods with optimal convergence-rate estimates.
Adaptive stochastic gradient method based on the universal gradient method. The universal method adjusts the Lipschitz constant of the gradient at each step so that the loss function is majorized by a quadratic function.
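The core of a universal gradient step is a backtracking search on the Lipschitz estimate L: double L until the quadratic upper bound (majorant) holds at the candidate point, then let L shrink again for the next step. A minimal deterministic sketch, assuming full gradients and the standard doubling/halving rule (the repository's stochastic variant will differ in detail):

```python
import numpy as np

def universal_gradient_step(f, grad, x, L):
    """One universal-gradient step: backtrack on L until the
    quadratic majorant holds, then return the new point and a
    halved L so the estimate can adapt downward next step."""
    g = grad(x)
    while True:
        x_new = x - g / L
        d = x_new - x
        # Majorant check: f(x_new) <= f(x) + <g, d> + (L/2)||d||^2
        if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d):
            return x_new, L / 2.0
        L *= 2.0  # bound violated: the current L is too small
```

Iterating this step on a smooth convex function decreases the objective monotonically while L tracks the local curvature.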