Repositories under the learning-rate topic:
A learning rate range test implementation in PyTorch
Play deep learning with CIFAR datasets
Visualize TensorFlow's optimizers.
An easy neural network for Java!
Videos of deep learning optimizers moving on 3D problem-landscapes
Improving MMD-GAN training with repulsive loss function
PyTorch implementation of some learning rate schedulers for deep learning researchers.
Cyclic learning rate TensorFlow implementation.
One cycle policy learning rate scheduler in PyTorch
Stochastic Weight Averaging - TensorFlow implementation
Meta Transfer Learning for Few Shot Semantic Segmentation using U-Net
How optimizer and learning rate choice affects training performance
Benchmarking various Computer Vision models on TinyImageNet Dataset
sharpDARTS: Faster and More Accurate Differentiable Architecture Search
Improved Hypergradient optimizers, providing better generalization and faster convergence.
OneCycle LearningRateScheduler & Learning Rate Finder for TensorFlow 2.
Implementation of learning rate finder in TensorFlow
PyTorch implementation of arbitrary learning rate and momentum schedules, including the One Cycle Policy
Residual Network Experiments with CIFAR Datasets.
A package containing popular learning rate schedulers, implemented in Keras/TensorFlow
Q-Learning algorithm that solves simple mazes.
Detect happy dogs in real-time
A Jupyter notebook exploring sophisticated learning rate strategies for training deep neural networks
Tensorflow-Keras callback implementing arXiv 1712.07628
The learning rate is one of the most important hyper-parameters to tune when training convolutional neural networks. This project implements cyclical learning rates (CLR), a technique for selecting a range of learning rates for a neural network, with two different skewness degrees. The learning rate is cycled between a lower and an upper bound during training. CLR policies are computationally simple and avoid the expense of fine-tuning a fixed learning rate. The results show that varying the learning rate during training gives markedly better results than a fixed value, with a similar or even smaller number of epochs.
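The cycling described above can be sketched as a triangular schedule, where the learning rate rises linearly from the lower bound to the upper bound over `step_size` steps and then falls back. A minimal sketch (the bound values and step size here are illustrative, not taken from the project):

```python
def triangular_clr(step, base_lr=0.001, max_lr=0.006, step_size=2000):
    """Triangular cyclical learning rate: one full cycle spans 2 * step_size steps."""
    cycle = step // (2 * step_size)
    # Position within the current cycle, mapped to [0, 1]:
    # 0 at the peak (max_lr), 1 at either end of the cycle (base_lr).
    x = abs(step / step_size - 2 * cycle - 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)
```

For example, `triangular_clr(0)` returns the lower bound, `triangular_clr(2000)` returns the upper bound, and the rate returns to the lower bound at step 4000 before the next cycle begins.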
Add-on functionality for the R implementation of Keras
All about machine learning
Reproduction of the "Don't Decay the Learning Rate, Increase the Batch Size" conference paper.
The main aim of this project is to build a predictive model using G Store data to predict the total revenue per customer, helping to make better use of the marketing budget.