YongHuaZhang-BUAA / awesome-pruning-acceleration

Incredible acceleration with pruning and other compression techniques


awesome-pruning-acceleration

Hello, visitors! I am interested in efficient deep neural network design, including pruning, AutoML, and quantization, with a particular focus on knowledge distillation for better network generalization. This page collects papers on pruning.
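For visitors new to the topic, the core idea behind the classic magnitude-based pruning papers listed below (e.g. "Learning both Weights and Connections for Efficient Neural Networks", NeurIPS 2015) can be sketched in a few lines of NumPy. This is an illustrative toy, not any paper's implementation: weights whose absolute value falls in the smallest fraction are simply zeroed out.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction `sparsity` of weights.

    A toy sketch of unstructured magnitude pruning; real pipelines
    typically prune iteratively and fine-tune between steps.
    """
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))
pruned = magnitude_prune(w, 0.5)
print(f"sparsity: {np.mean(pruned == 0):.2f}")  # ~0.50 of entries zeroed
```

Structured variants (filter/channel pruning, which most CVPR/ICCV entries below address) apply the same ranking idea to whole filters or channels instead of individual weights, so the pruned network stays dense and hardware-friendly.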

History

2021

| Title | Issue | Release |
| :--- | :---: | :---: |
| Dynamic Slimmable Network | CVPR | GitHub |
| Robust Pruning at Initialization | ICLR | - |

2020

| Title | Issue | Release |
| :--- | :---: | :---: |
| What is the State of Neural Network Pruning? | MLSys | GitHub |
| The Generalization-Stability Tradeoff In Neural Network Pruning | NeurIPS | GitHub |
| Position-based Scaled Gradient for Model Quantization and Pruning | NeurIPS | GitHub |
| Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot | NeurIPS | GitHub |
| Movement Pruning: Adaptive Sparsity by Fine-Tuning | NeurIPS | GitHub |
| HYDRA: Pruning Adversarially Robust Neural Networks | NeurIPS | GitHub |
| Pruning Filter in Filter | NeurIPS | GitHub |
| Storage Efficient and Dynamic Flexible Runtime Channel Pruning via Deep Reinforcement Learning | NeurIPS | GitHub |
| Directional Pruning of Deep Neural Networks | NeurIPS | GitHub |
| SCOP: Scientific Control for Reliable Neural Network Pruning | NeurIPS | - |
| Neuron-level Structured Pruning using Polarization Regularizer | NeurIPS | GitHub |
| Pruning neural networks without any data by iteratively conserving synaptic flow | NeurIPS | GitHub |
| Neuron Merging: Compensating for Pruned Neurons | NeurIPS | GitHub |
| Filter Pruning and Re-Initialization via Latent Space Clustering | IEEE Access | - |
| TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search | ECCV | GitHub |
| Differentiable Joint Pruning and Quantization for Hardware Efficiency | ECCV | - |
| DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search | ECCV | - |
| Accelerating CNN Training by Pruning Activation Gradients | ECCV | - |
| DHP: Differentiable Meta Pruning via HyperNetworks | ECCV | GitHub |
| DSA: More Efficient Budgeted Pruning via Differentiable Sparsity Allocation | ECCV | GitHub |
| EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning | ECCV | GitHub |
| PCONV: The Missing but Desirable Sparsity in DNN Weight Pruning for Real-time Execution on Mobile Devices | AAAI | - |
| Dynamic Network Pruning with Interpretable Layerwise Channel Selection | AAAI | - |
| Reborn Filters: Pruning Convolutional Neural Networks with Limited Data | AAAI | - |
| Channel Pruning Guided by Classification Loss and Feature Importance | AAAI | - |
| Pruning from Scratch | AAAI | - |
| DropNet: Reducing Neural Network Complexity via Iterative Pruning | ICML | GitHub |
| Operation-Aware Soft Channel Pruning using Differentiable Masks | ICML | - |
| Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression | CVPR | GitHub |
| APQ: Joint Search for Network Architecture, Pruning and Quantization Policy | CVPR | - |
| Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration | CVPR | - |
| Structured Compression by Weight Encryption for Unstructured Pruning and Quantization | CVPR | - |
| Multi-Dimensional Pruning: A Unified Framework for Model Compression | CVPR | - |
| DMCP: Differentiable Markov Channel Pruning for Neural Networks | CVPR | - |
| HRank: Filter Pruning using High-Rank Feature Map | CVPR | GitHub |
| Neural Network Pruning with Residual-Connections and Limited-Data | CVPR | - |
| Picking Winning Tickets Before Training by Preserving Gradient Flow | ICLR | GitHub |
| Provable Filter Pruning for Efficient Neural Networks | ICLR | GitHub |
| Data-Independent Neural Pruning via Coresets | ICLR | - |
| Lookahead: A Far-sighted Alternative of Magnitude-based Pruning | ICLR | GitHub |
| Dynamic Model Pruning with Feedback | ICLR | - |
| One-shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation | ICLR | - |
| A Signal Propagation Perspective for Pruning Neural Networks at Initialization | ICLR | GitHub |

2019

| Title | Issue | Release |
| :--- | :---: | :---: |
| MetaPruning: Meta Learning for Automatic Neural Network Channel Pruning | ICCV | GitHub |
| Variational Convolutional Neural Network Pruning | CVPR | - |
| Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration | CVPR | GitHub |
| Towards Optimal Structured CNN Pruning via Generative Adversarial Learning (GAL) | CVPR | GitHub |
| Network Pruning via Transformable Architecture Search | NeurIPS | GitHub |
| Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks | NeurIPS | GitHub |
| Global Sparse Momentum SGD for Pruning Very Deep Neural Networks | NeurIPS | GitHub |
| The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks | ICLR | - |
| Integral Pruning on Activations and Weights for Efficient Neural Networks | ICLR | - |
| SNIP: Single-Shot Network Pruning Based on Connection Sensitivity | ICLR | GitHub |

2018

| Title | Issue | Release |
| :--- | :---: | :---: |
| Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers | ICLR | GitHub |
| Clustering Convolutional Kernels to Compress Deep Neural Networks | ECCV | GitHub |
| NISP: Pruning Networks using Neuron Importance Score Propagation | CVPR | - |
| Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks | IJCAI | GitHub |
| Accelerating Convolutional Networks via Global & Dynamic Filter Pruning | IJCAI | - |

2017

| Title | Issue | Release |
| :--- | :---: | :---: |
| Pruning Filters for Efficient ConvNets | ICLR | GitHub |
| Pruning Convolutional Neural Networks for Resource Efficient Inference | ICLR | GitHub |
| Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning | CVPR | - |
| ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression | ICCV | GitHub |
| Channel Pruning for Accelerating Very Deep Neural Networks | ICCV | GitHub |
| Learning Efficient Convolutional Networks Through Network Slimming | ICCV | GitHub |
| Scalpel: Customizing DNN Pruning to the Underlying Hardware Parallelism | ISCA | GitHub |

2016

| Title | Issue | Release |
| :--- | :---: | :---: |
| Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding | ICLR | - |
| Eyeriss: A Spatial Architecture for Energy-Efficient Dataflow for Convolutional Neural Networks | ISCA | - |

2015

| Title | Issue | Release |
| :--- | :---: | :---: |
| Learning both Weights and Connections for Efficient Neural Networks | NeurIPS | - |
