Paper: https://arxiv.org/abs/1911.11134
In this repository we implement the following dynamic sparsity strategies:

- SET: Implements Sparse Evolutionary Training (SET), which replaces low-magnitude connections randomly with new ones.
- SNFS: Implements momentum-based training with sparsity re-distribution.
- RigL: Our method, RigL, removes a fraction of connections based on weight magnitudes and activates new ones using instantaneous gradient information.
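To make the drop/grow step concrete, here is a minimal NumPy sketch of one RigL-style connectivity update for a single dense layer. The function name, the per-layer framing, and the drop fraction are illustrative assumptions, not this repo's actual API.

```python
import numpy as np

def rigl_update(weights, grads, drop_fraction=0.3):
    """Illustrative RigL-style update (not the repo's API).

    Drops the `drop_fraction` lowest-magnitude active weights and grows the
    same number of currently inactive connections, chosen by instantaneous
    gradient magnitude. Grown connections are initialized to zero.
    """
    mask = weights != 0
    n_update = int(drop_fraction * mask.sum())

    # Drop: among active weights, find the n_update smallest magnitudes.
    magnitudes = np.where(mask, np.abs(weights), np.inf)
    drop_idx = np.argsort(magnitudes, axis=None)[:n_update]

    # Grow: among inactive positions, find the largest gradient magnitudes.
    grad_scores = np.where(mask, -np.inf, np.abs(grads))
    grow_idx = np.argsort(grad_scores, axis=None)[-n_update:]

    new_mask = mask.flatten()
    new_mask[drop_idx] = False
    new_mask[grow_idx] = True
    # Dropped weights are zeroed; grown weights were already zero.
    new_weights = weights.flatten() * new_mask
    return new_weights.reshape(weights.shape), new_mask.reshape(mask.shape)
```

Because drops come from active positions and grows from inactive ones, the total number of connections (and hence the sparsity) stays constant across updates.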
And the following one-shot pruning algorithm:
- SNIP: Single-shot Network Pruning based on connection sensitivity prunes the least salient connections before training.
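As a rough illustration of connection sensitivity, here is a hedged NumPy sketch of a SNIP-style saliency mask for one layer. The function name and the single-layer framing are assumptions; the actual method ranks saliencies across the whole network.

```python
import numpy as np

def snip_mask(weights, grads, sparsity):
    """Illustrative SNIP-style one-shot pruning mask (not the repo's API).

    Keeps the (1 - sparsity) fraction of connections with the largest
    saliency |w * dL/dw|, computed on a mini-batch before training.
    """
    saliency = np.abs(weights * grads)
    k = int(round((1.0 - sparsity) * weights.size))  # connections to keep
    threshold = np.sort(saliency, axis=None)[-k]
    return saliency >= threshold
```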
We have code for the following settings:

- ImageNet-2012: TPU-compatible code with ResNet-50 and MobileNet-v1/v2.
- CIFAR-10 with WideResNets.
- MNIST with a 2-layer fully connected network.
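For reference, the MNIST model is just a small dense network; a minimal NumPy sketch follows, where the layer sizes (784 → 300 → 10) are assumptions and may differ from the repo's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 784 inputs, 300 hidden units, 10 classes.
w1 = rng.normal(scale=0.1, size=(784, 300))
b1 = np.zeros(300)
w2 = rng.normal(scale=0.1, size=(300, 10))
b2 = np.zeros(10)

def mlp_forward(x):
    """Forward pass of a 2-layer fully connected MNIST classifier."""
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h @ w2 + b2                # class logits
```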
First, clone this repo:

```bash
git clone https://github.com/google-research/rigl.git
cd rigl
```
We use the NeurIPS 2019 MicroNet Challenge code for counting the operations and size of our networks. Clone the google_research repo and add the current folder to the Python path:

```bash
git clone https://github.com/google-research/google-research.git
mv google-research/ google_research/
export PYTHONPATH=$PYTHONPATH:$PWD
```
Now we can run some tests. The following script creates a virtual environment, installs the necessary libraries, and runs a few tests:

```bash
bash run.sh
```
We need to activate the virtual environment before running experiments. With that, we are ready to run some trivial MNIST experiments:

```bash
source env/bin/activate
python rigl/mnist/mnist_train_eval.py
```
This is not an official Google product.