Rigging the Lottery: Making All Tickets Winners

[Figure: 80% Sparse ResNet-50]

Paper: https://arxiv.org/abs/1911.11134

Sparse Training Algorithms

In this repository we implement the following dynamic sparsity strategies:

  1. SET: Implements Sparse Evolutionary Training (SET), which replaces low-magnitude connections with new random ones.

  2. SNFS: Implements momentum-based training (Sparse Networks from Scratch), here without sparsity redistribution.

  3. RigL: Our method. RigL removes a fraction of connections based on weight magnitudes and activates new ones using instantaneous gradient information (see the sketch after this list).
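
To make the drop/grow step concrete, here is a minimal numpy sketch of a single RigL connectivity update. The function name rigl_update, its signature, and the drop_fraction default are illustrative assumptions; the repo's real implementation is the TensorFlow optimizer code, not this helper.

import numpy as np

def rigl_update(weights, grads, mask, drop_fraction=0.3):
    # One RigL mask update (illustrative numpy sketch, not the repo's API):
    # drop the lowest-magnitude active connections, then grow the same
    # number of currently inactive connections with the largest gradients.
    n_update = int(drop_fraction * mask.sum())
    if n_update == 0:
        return weights.copy(), mask.copy()

    # Drop: among active connections, remove the n_update smallest |weight|.
    drop_scores = np.where(mask == 1, np.abs(weights), np.inf)
    drop_idx = np.argsort(drop_scores, axis=None)[:n_update]
    new_mask = mask.copy()
    new_mask.flat[drop_idx] = 0

    # Grow: among connections inactive before the drop, activate the
    # n_update with the largest |gradient|; new connections start at zero.
    grow_scores = np.where(mask == 0, np.abs(grads), -np.inf)
    grow_idx = np.argsort(grow_scores, axis=None)[-n_update:]
    new_mask.flat[grow_idx] = 1

    new_weights = weights * new_mask
    new_weights.flat[grow_idx] = 0.0
    return new_weights, new_mask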

And the following one-shot pruning algorithm:

  1. SNIP: Single-shot Network Pruning (SNIP) prunes the least salient connections before training, where saliency is based on connection sensitivity (see the sketch below).
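
For contrast with the dynamic methods above, here is the SNIP saliency criterion as a small numpy sketch; snip_mask is a hypothetical helper (the gradients would come from a single mini-batch evaluated before training), and sparsity is the target fraction of pruned connections.

import numpy as np

def snip_mask(weights, grads, sparsity):
    # One-shot SNIP pruning (illustrative sketch, not the repo's API).
    # Connection saliency is |weight * gradient| on one mini-batch;
    # the least salient connections are pruned before training starts.
    saliency = np.abs(weights * grads)
    n_keep = int(round((1.0 - sparsity) * saliency.size))
    if n_keep == 0:
        return np.zeros_like(weights)
    threshold = np.sort(saliency, axis=None)[-n_keep]
    return (saliency >= threshold).astype(weights.dtype)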

We have code for the following settings:

  • ImageNet-2012: TPU-compatible code with ResNet-50 and MobileNet-v1/v2.
  • CIFAR-10 with WideResNets.
  • MNIST with a 2-layer fully-connected network.

Setup

First, clone this repo.

git clone https://github.com/google-research/rigl.git
cd rigl

We use the NeurIPS 2019 MicroNet Challenge code for counting the operations and size of our networks. Let's clone the google_research repo and add the current folder to the Python path.

git clone https://github.com/google-research/google-research.git
mv google-research/ google_research/
export PYTHONPATH=$PYTHONPATH:$PWD
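
If the path setup worked, the MicroNet counting code should now be importable. The one-liner below is only a sanity check and assumes the counting module still lives at micronet_challenge/ inside the renamed google_research folder:

python -c "from google_research.micronet_challenge import counting"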

Now we can run some tests. The following script creates a virtual environment, installs the necessary libraries, and finally runs a few tests.

bash run.sh

We need to activate the virtual environment before running an experiment. With that done, we are ready to run a trivial MNIST experiment.

source env/bin/activate

python rigl/mnist/mnist_train_eval.py
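
The training script configures its experiment through command-line flags. Assuming it is a standard absl app (as most google-research code is), you can list every available flag with:

python rigl/mnist/mnist_train_eval.py --helpfull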

Disclaimer

This is not an official Google product.

About

End-to-end training of sparse deep neural networks with little-to-no performance loss.

License: Apache License 2.0

