There are 12 repositories under the adamw topic.
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
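The core idea all these repos share is decoupled weight decay from the AdamW paper (https://arxiv.org/abs/1711.05101): decay is applied directly to the weights rather than folded into the gradient as L2 regularization. A minimal NumPy sketch of one update step; the function name and hyperparameters are illustrative, not this repo's API:

```python
# Minimal sketch of AdamW's decoupled weight decay (Loshchilov & Hutter).
import numpy as np

def adamw_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update; weight decay acts on w directly, not via g."""
    m = beta1 * m + (1 - beta1) * g       # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)          # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v
```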
Implements the AdamW optimizer (https://arxiv.org/abs/1711.05101), a cosine learning rate scheduler, and "Cyclical Learning Rates for Training Neural Networks" (https://arxiv.org/abs/1506.01186) for the PyTorch framework
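A rough equivalent of this combination can be built from PyTorch's own optimizers and schedulers; the repo above ships its own implementations of the same ideas:

```python
import torch

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

# Cyclical learning rates (https://arxiv.org/abs/1506.01186); for the
# cosine schedule of the AdamW paper, swap in CosineAnnealingLR instead.
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-4, max_lr=1e-2, step_size_up=2000,
    cycle_momentum=False)  # AdamW has no classic momentum buffer to cycle

for _ in range(10):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()  # stand-in loss
    loss.backward()
    optimizer.step()
    scheduler.step()                        # update the LR every batch
```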
Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging, and decoupled weight decay
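One ingredient from this stack that fits in a few lines is DEMON's decaying-momentum rule (https://arxiv.org/abs/1910.04952), which anneals beta1 toward zero over training. A standalone sketch, with illustrative names, not this repo's code:

```python
def demon_beta(t, total_steps, beta_init=0.9):
    """Momentum coefficient at step t under the DEMON schedule."""
    frac = 1.0 - t / total_steps             # remaining fraction of training
    return beta_init * frac / (1.0 - beta_init + beta_init * frac)
```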
PyTorch implementation of the NovoGrad optimizer
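NovoGrad (Ginsburg et al., https://arxiv.org/abs/1905.11286) keeps a single scalar second moment per layer and normalizes the gradient before applying momentum. A NumPy sketch of the per-layer update, illustrative rather than this repo's code:

```python
import numpy as np

def novograd_step(w, g, m, v, lr=1e-2, beta1=0.95, beta2=0.98,
                  eps=1e-8, weight_decay=1e-3):
    g_norm_sq = np.sum(g * g)                 # one scalar per layer
    v = g_norm_sq if v is None else beta2 * v + (1 - beta2) * g_norm_sq
    g_hat = g / (np.sqrt(v) + eps) + weight_decay * w  # decoupled decay
    m = g_hat if m is None else beta1 * m + g_hat      # unscaled momentum
    w = w - lr * m
    return w, m, v
```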
PyTorch implementation of the Lookahead optimizer (https://arxiv.org/pdf/1907.08610.pdf) and RAdam (https://arxiv.org/pdf/1908.03265.pdf)
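Lookahead wraps any inner optimizer (here, RAdam): every k inner steps, slow weights move a fraction alpha toward the fast weights, and the fast weights are reset to the slow ones. A minimal wrapper sketch; the class is illustrative, not this repo's exact API:

```python
import torch

class Lookahead:
    def __init__(self, optimizer, k=5, alpha=0.5):
        self.optimizer, self.k, self.alpha = optimizer, k, alpha
        self.step_count = 0
        # Slow weights start as a copy of the fast (model) weights.
        self.slow = [[p.detach().clone() for p in group["params"]]
                     for group in optimizer.param_groups]

    def zero_grad(self):
        self.optimizer.zero_grad()

    def step(self):
        self.optimizer.step()            # fast update (e.g. one RAdam step)
        self.step_count += 1
        if self.step_count % self.k == 0:
            for group, slow_group in zip(self.optimizer.param_groups, self.slow):
                for p, q in zip(group["params"], slow_group):
                    q += self.alpha * (p.detach() - q)  # slow toward fast
                    p.data.copy_(q)                     # reset fast to slow
```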
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
Implementation of the AdamW and AdamWR algorithms in Caffe.
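AdamWR is AdamW plus warm restarts (SGDR, https://arxiv.org/abs/1608.03983); the restart schedule itself is just a cosine within each cycle. Sketched here in Python for readability even though the repo above targets Caffe:

```python
import math

def sgdr_lr(t_cur, t_i, eta_min=0.0, eta_max=1e-3):
    """Cosine-annealed LR at step t_cur within a restart cycle of length t_i."""
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))
```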
Literature survey of convex optimizers and optimization methods for deep learning; made especially for optimization researchers with ❤️
Kaggle's plant disease image classification competition. Fine-tuning pre-trained CNN models and experimenting with loss functions and optimizers to achieve better results.
Survey comparing the performance of AdaHessian against well-known first-order optimizers on the MNIST and CIFAR-10 datasets
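The trick that makes AdaHessian (https://arxiv.org/abs/2006.00719) tractable is Hutchinson's estimator, which recovers the Hessian diagonal from one extra backward pass. A standalone PyTorch sketch on a toy objective, not the survey's code:

```python
import torch

w = torch.randn(5, requires_grad=True)
loss = (w ** 4).sum()                             # toy objective
(g,) = torch.autograd.grad(loss, w, create_graph=True)
z = torch.randint(0, 2, w.shape).float() * 2 - 1  # Rademacher +/-1 vector
(hz,) = torch.autograd.grad(g, w, grad_outputs=z) # Hessian-vector product
diag_h = z * hz   # E[z * (Hz)] equals the Hessian diagonal
```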
Super-Convergence on CIFAR-10
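Super-convergence (Smith & Topin, https://arxiv.org/abs/1708.07120) trains with one cycle of a very large learning rate, and is commonly reproduced with PyTorch's built-in one-cycle schedule. A minimal sketch assuming the repo follows the same policy; the model and data are placeholders:

```python
import torch

model = torch.nn.Linear(3 * 32 * 32, 10)      # placeholder CIFAR-10 model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1.0, total_steps=5000)  # one cycle up to a large peak LR

for step in range(5000):
    optimizer.zero_grad()
    x = torch.randn(64, 3 * 32 * 32)          # stand-in for a CIFAR-10 batch
    y = torch.randint(0, 10, (64,))
    loss = torch.nn.functional.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step()                          # LR (and momentum) updated per batch
```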