gunchagarg / differential-learning-rate-keras

Implementation of Differential Learning Rate in Keras


differential-learning-rate-keras

The phrase 'Differential Learning Rates' refers to using different learning rates for different parts of the network.
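To make the idea concrete, here is a minimal sketch in plain Python (no framework) of one SGD step where each parameter group gets its own learning rate. The group names and values are illustrative, not taken from this repository's code:

```python
def sgd_step_differential(params, grads, lr_per_group):
    """Update each parameter group with its own learning rate."""
    updated = {}
    for group, values in params.items():
        lr = lr_per_group[group]
        updated[group] = [w - lr * g for w, g in zip(values, grads[group])]
    return updated

# Early layers (already good at generic features) get a small learning rate;
# later, task-specific layers get a larger one.
params = {"early": [1.0, 2.0], "late": [1.0, 2.0]}
grads = {"early": [0.5, 0.5], "late": [0.5, 0.5]}
lrs = {"early": 1e-4, "late": 1e-2}

new_params = sgd_step_differential(params, grads, lrs)
```

With these values, the "early" weights move only by 0.00005 per step while the "late" weights move by 0.005, one hundred times more.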


Transfer Learning is a proven method for generating much better results in computer vision tasks. Most pretrained architectures (ResNet, VGG, Inception, etc.) are trained on ImageNet, and depending on how similar your data is to the ImageNet images, their weights will need to be altered to a greater or lesser degree. When it comes to modifying these weights, the last layers of the model will often need the most change, while deeper levels that are already well trained to detect basic features (such as edges and outlines) will need less.

dlr_implementation.py: Modified source code of the Adam optimizer to implement differential learning rates.

dlr_test.py: Trains a ResNet50 model on the CIFAR-10 dataset using differential learning rates.
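The core of the optimizer modification can be sketched as follows: scale the base learning rate by a per-layer multiplier before applying the usual Adam update. This is a hedged illustration in plain Python, with illustrative names; see dlr_implementation.py for the repository's actual Keras implementation:

```python
import math

def adam_step_with_multiplier(w, g, m, v, t, base_lr, multiplier,
                              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single weight, with a layer-specific LR multiplier.

    w: weight, g: gradient, m/v: running moment estimates, t: step count (>= 1).
    """
    lr = base_lr * multiplier            # the differential-learning-rate part
    m = beta1 * m + (1 - beta1) * g      # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * g * g  # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)         # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# A frozen-ish early layer might use multiplier=0.1, the new head multiplier=1.0.
w_new, m_new, v_new = adam_step_with_multiplier(
    w=1.0, g=0.5, m=0.0, v=0.0, t=1, base_lr=0.001, multiplier=1.0)
```

In a full Keras optimizer, the same multiplier would be looked up per variable (e.g. by layer name) inside get_updates, so a single optimizer instance applies different effective learning rates across the network.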


