Pruning ratio (FLOPs): 60%
python main.py \
  --model vgg16 \
  --dataset cifar10 \
  --target 126000000 \
  --ckpt [pre-trained model dir] \
  --data_path [dataset path] \
  --omega 1 \
  --tolerance 0.01 \
  --alpha 5e-5
Pruning ratio (FLOPs): 55%
python main.py \
  --model resnet56 \
  --dataset cifar10 \
  --target 57000000 \
  --ckpt [pre-trained model dir] \
  --data_path [dataset path] \
  --omega 1 \
  --tolerance 0.01 \
  --alpha 8e-4
Pruning ratio (FLOPs): 63%
python main.py \
  --model resnet110 \
  --dataset cifar10 \
  --target 94000000 \
  --ckpt [pre-trained model dir] \
  --data_path [dataset path] \
  --omega 1 \
  --tolerance 0.01 \
  --alpha 8e-9
Pruning ratio (FLOPs): 63%
python main.py \
  --model googlenet \
  --dataset cifar10 \
  --target 568000000 \
  --ckpt [pre-trained model dir] \
  --data_path [dataset path] \
  --omega 1 \
  --tolerance 0.01 \
  --alpha 4e-8
Pruning ratio (FLOPs): 62%
python main.py \
  --model resnet50 \
  --dataset imagenet \
  --target 1550000000 \
  --ckpt [pre-trained model dir] \
  --data_path [dataset path] \
  --omega 1 \
  --tolerance 0.01 \
  --alpha 7e-5
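The --target flag in the commands above is the FLOPs budget that should remain after pruning, i.e. roughly baseline FLOPs × (1 − pruning ratio). A minimal sketch of that relation, using approximate baseline FLOPs counts that are our own estimates (assumptions, not values shipped with this repository):

```python
# Sketch: derive a --target FLOPs budget from a desired pruning ratio.
# The baseline FLOPs figures below are approximate estimates (assumptions).
BASELINE_FLOPS = {
    "vgg16/cifar10":     313e6,   # ~313M FLOPs
    "resnet56/cifar10":  125e6,   # ~125M FLOPs
    "resnet50/imagenet": 4.09e9,  # ~4.1G FLOPs
}

def flops_target(model_key: str, pruning_ratio: float) -> int:
    """FLOPs left after removing `pruning_ratio` of the baseline FLOPs."""
    return round(BASELINE_FLOPS[model_key] * (1.0 - pruning_ratio))

# A 60% FLOPs reduction of VGG-16 on CIFAR-10 leaves roughly 125M FLOPs,
# consistent with the --target 126000000 used above.
print(flops_target("vgg16/cifar10", 0.60))
```

The same arithmetic recovers the other targets to within a couple of percent (e.g. ResNet-56 at 55% reduction gives ~56M, matching --target 57000000).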
Fine-tuning the pruned model:
python train.py \
  --model vgg16 \
  --dataset cifar10 \
  --lr 0.1 \
  --batch_size 128 \
  --ckpt_path [pruned model dir] \
  --data_path [dataset path]
python train.py \
  --model resnet50 \
  --dataset imagenet \
  --lr 0.025 \
  --batch_size 128 \
  --ckpt_path [pruned model dir] \
  --data_path [dataset path]
Additionally, we provide the pre-trained models used in our experiments:
VGG-16 | ResNet-56 | ResNet-110 | GoogLeNet
Our implementation partially reuses code from Lasso, HRank, and ITPruner.