APIB

Automatic Network Pruning via Information Bottleneck Minimization.
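For context, the information bottleneck (IB) objective in its standard form (Tishby et al.) trades off compression of the input representation against retention of label information. How this repo adapts it to channel pruning is not spelled out in this README, so the formula below is only the generic IB objective, not necessarily the exact loss in main.py:

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
```

Here Z is the compressed representation of input X, Y is the label, and the multiplier (beta) controls the compression/accuracy trade-off.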

Model Pruning

1. VGG-16

pruning ratio (FLOPs): 60%

python main.py \
--model vgg16 \
--dataset cifar10 \
--target 126000000 \
--ckpt [pre-trained model dir] \
--data_path [dataset path] \
--omega 1 \
--tolerance 0.01 \
--alpha 5e-5
2. ResNet56

pruning ratio (FLOPs): 55%

python main.py \
--model resnet56 \
--dataset cifar10 \
--target 57000000 \
--ckpt [pre-trained model dir] \
--data_path [dataset path] \
--omega 1 \
--tolerance 0.01 \
--alpha 8e-4
3. ResNet110

pruning ratio (FLOPs): 63%

python main.py \
--model resnet110 \
--dataset cifar10 \
--target 94000000 \
--ckpt [pre-trained model dir] \
--data_path [dataset path] \
--omega 1 \
--tolerance 0.01 \
--alpha 8e-9
4. GoogLeNet

pruning ratio (FLOPs): 63%

python main.py \
--model googlenet \
--dataset cifar10 \
--target 568000000 \
--ckpt [pre-trained model dir] \
--data_path [dataset path] \
--omega 1 \
--tolerance 0.01 \
--alpha 4e-8
5. ResNet50

pruning ratio (FLOPs): 62%

python main.py \
--model resnet50 \
--dataset imagenet \
--target 1550000000 \
--ckpt [pre-trained model dir] \
--data_path [dataset path] \
--omega 1 \
--tolerance 0.01 \
--alpha 7e-5
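The --target flag is an absolute FLOPs budget rather than a ratio. As a sanity check, each budget above can be recovered from the stated pruning ratio and commonly cited baseline FLOPs for these models. The baseline numbers in the sketch below are approximate values from the pruning literature (e.g. as reported by HRank), not figures taken from this repo:

```python
# Sanity-check the --target FLOPs budgets against the stated pruning ratios.
# Baseline FLOPs are approximate literature values (assumptions, not from
# this repo): CIFAR-10 models unless noted otherwise.
baselines = {
    "vgg16": 314e6,       # VGG-16 on CIFAR-10
    "resnet56": 126e6,    # ResNet-56 on CIFAR-10
    "resnet110": 253e6,   # ResNet-110 on CIFAR-10
    "googlenet": 1.53e9,  # GoogLeNet on CIFAR-10
    "resnet50": 4.09e9,   # ResNet-50 on ImageNet
}
# Pruning ratios stated in this README (fraction of FLOPs removed).
ratios = {"vgg16": 0.60, "resnet56": 0.55, "resnet110": 0.63,
          "googlenet": 0.63, "resnet50": 0.62}

for model, base in baselines.items():
    target = base * (1 - ratios[model])  # FLOPs kept after pruning
    print(f"{model}: --target ~ {target:.0f}")
```

Running this reproduces the budgets above to within about 1% (e.g. 314M x 0.40 for VGG-16 gives roughly 126M), so to aim for a different pruning ratio you would set --target to baseline_flops x (1 - ratio).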

Model Training

1. VGG-16
python train.py \
--model vgg16 \
--dataset cifar10 \
--lr 0.1 \
--batch_size 128 \
--ckpt_path [pruned model dir] \
--data_path [dataset path]
2. ResNet-50
python train.py \
--model resnet50 \
--dataset imagenet \
--lr 0.025 \
--batch_size 128 \
--ckpt_path [pruned model dir] \
--data_path [dataset path]

Pre-trained Models

Additionally, we provide the pre-trained models used in our experiments.

CIFAR-10:

VGG-16 | ResNet56 | ResNet110 | GoogLeNet

CIFAR-100:

VGG-16 | ResNet56

ImageNet:

ResNet50

Acknowledgments

Our implementation partially reuses code from Lasso, HRank, and ITPruner.
