titu1994 / keras_novograd

Keras implementation of NovoGrad

NovoGrad for Keras

Keras port of NovoGrad, from the paper Stochastic Gradient Methods with Layer-wise Adaptive Moments for Training of Deep Networks.

NovoGrad

The above image is from the paper. NovoGrad makes the optimizer more resilient to the choice of initial learning rate: it behaves similarly to SGD, but normalizes gradients per layer. It extends ND-Adam and decouples weight decay from regularization. It also needs roughly half the memory of Adam, putting its memory requirements close to those of SGD with momentum, which allows larger models to be trained without compromising training efficiency.
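For intuition, here is a minimal NumPy sketch of the per-layer update described in the paper. The function name, variable names, and defaults are illustrative; the actual novograd.py implementation may differ in details such as the very first step and how weight decay is applied.

import numpy as np

def novograd_step(w, g, m, v, lr=1e-2, beta_1=0.95, beta_2=0.98,
                  eps=1e-8, weight_decay=0.0):
    # One NovoGrad update for a single layer (illustrative sketch).
    # w, g, m are the parameter, gradient, and first-moment arrays for this layer;
    # v is a single scalar second moment per layer, which is why the extra memory
    # is roughly half of Adam's and close to SGD with momentum.
    v = beta_2 * v + (1.0 - beta_2) * np.sum(g ** 2)    # per-layer gradient-norm statistic
    g_hat = g / (np.sqrt(v) + eps) + weight_decay * w   # layer-normalized gradient + decoupled weight decay
    m = beta_1 * m + g_hat                               # SGD-style momentum on the normalized gradient
    w = w - lr * m                                       # parameter step
    return w, m, v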

Usage

Add the novograd.py script to your project and import it. It can be used as a drop-in replacement for the Adam optimizer.

Note that NovoGrad also supports "AMSGrad"-like behaviour with the amsgrad=True flag.

from novograd import NovoGrad

optm = NovoGrad(lr=1e-2)
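A fuller, illustrative sketch of how it slots into a standard Keras workflow; the model below is only an example, and amsgrad=True is the flag mentioned above:

from keras.models import Sequential
from keras.layers import Dense
from novograd import NovoGrad

# Any Keras model works; this two-layer classifier is only for illustration.
model = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    Dense(10, activation='softmax'),
])

# Pass NovoGrad where you would normally pass Adam; amsgrad=True enables the
# AMSGrad-like behaviour noted above.
optm = NovoGrad(lr=1e-2, amsgrad=True)
model.compile(optimizer=optm, loss='categorical_crossentropy', metrics=['accuracy'])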

Requirements

  • Keras 2.2.4+ and TensorFlow 1.14+ (only the TensorFlow backend is supported for now).
  • NumPy

About

License: MIT License

