Gradient Centralization

A New Optimization Technique for Deep Neural Networks

Gradient Centralization (GC) is a simple and effective optimization technique for Deep Neural Networks (DNNs), which operates directly on gradients by centralizing the gradient vectors to have zero mean. It can both speed up the training process and improve the final generalization performance of DNNs. GC is very simple to implement and can be embedded into existing gradient-based DNN optimizers with only a few lines of code. It can also be used directly to fine-tune pre-trained DNNs.
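As a rough sketch of the core operation (assuming PyTorch; centralize_gradient is an illustrative helper, not part of this repo's API), GC subtracts from each weight gradient its mean over all dimensions except the output dimension:

import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    # Illustrative sketch, not the repo's API.
    # Apply GC only to weight tensors with more than one dimension
    # (Conv and FC layers); bias vectors are left unchanged.
    if grad.dim() > 1:
        # Subtract the mean over all dimensions except the first
        # (output) dimension, so each gradient slice has zero mean.
        grad = grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad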

Illustration of the GC operation on the gradient matrix/tensor of weights in the fully-connected layer (left) and the convolutional layer (right).

GC can be viewed as a projected gradient descent method with a constrained loss function. The constrained loss function and its gradient have better Lipschitz properties, so the training process becomes more efficient and stable. Our experiments on various applications, including general image classification, fine-grained image classification, object detection and segmentation, and person ReID, demonstrate that GC can consistently improve the performance of DNN learning.
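In the paper's formulation (sketched here; w_i denotes one weight vector with M entries and L the loss), GC is a fixed linear projection of the gradient:

\Phi_{GC}(\nabla_{w_i} L) = P \, \nabla_{w_i} L, \qquad P = I - e e^{\top}, \qquad e = \tfrac{1}{\sqrt{M}} \mathbf{1}

Since e e^{\top} \nabla_{w_i} L replaces every entry of the gradient by its mean, P simply removes that mean, matching the zero-mean centralization above: the gradient is projected onto the hyperplane whose normal vector is \mathbf{1}.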

The optimizers are provided in the files SGD.py, Adam.py and Adagrad.py, and include SGD_GC, SGD_GCC, SGDW_GCC, Adam_GC, Adam_GCC, AdamW_GCC and Adagrad_GCC. Optimizers with the "_GC" suffix apply GC to both Conv layers and FC layers, while optimizers with the "_GCC" suffix apply GC only to Conv layers. For example, SGD_GC can be imported as follows:

from SGD import SGD_GC 
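It can then be used as a drop-in replacement for the standard PyTorch optimizer. A minimal usage sketch, assuming model, criterion and train_loader are defined as in a usual training loop (the hyperparameters below are illustrative, not recommendations from the repo):

optimizer = SGD_GC(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)

for inputs, targets in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()  # gradient centralization is applied inside step()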

Update

  • 2020/04/07: Released a PyTorch implementation of optimizers with GC, and provided some examples on classification tasks, including general image classification (Mini-ImageNet, CIFAR100 and ImageNet) and fine-grained image classification (FGVC Aircraft, Stanford Cars, Stanford Dogs and CUB-200-2011).

Citation

@article{GradientCentra,
  title={Gradient-Centralization: A New Optimization Technique for Deep Neural Networks},
  author={Hongwei Yong and Jianqiang Huang and Xiansheng Hua and Lei Zhang},
  journal={arXiv preprint arXiv:2004.01461},
  year={2020}
}

Experiments

General Image Classification

Mini-ImageNet

The code is in GC_code/Mini_ImageNet. The split dataset can be downloaded from here.

CIFAR100

The code is in GC_code/CIFAR100.

ImageNet

The code is in GC_code/ImageNet.

Fine-grained Image Classification

The code is in GC_code/Fine-grained_classification. The preprocessed dataset can be downloaded from here.

Object Detection and Segmentation

The code is in GC_code/MMdetection.

Person ReID

The code is in GC_code/PersonReId.
