There are 18 repositories under the filter-pruning topic.
A research library for pytorch-based neural network pruning, compression, and more.
[ICLR'21] Neural Pruning via Growing Regularization (PyTorch)
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression. CVPR2020.
[NeurIPS'21 Spotlight] Aligned Structured Sparsity Learning for Efficient Image Super-Resolution (PyTorch)
GNN-RL Compression: Topology-Aware Network Pruning using Multi-stage Graph Embedding and Reinforcement Learning
Keras model convolutional filter pruning package
Code for CHIP: CHannel Independence-based Pruning for Compact Neural Networks (NeurIPS 2021).
Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning
Cheng-Hao Tu, Jia-Hong Lee, Yi-Ming Chan and Chu-Song Chen, "Pruning Depthwise Separable Convolutions for MobileNet Compression," International Joint Conference on Neural Networks, IJCNN 2020, July 2020.
Official repository for the research article "Pruning vs XNOR-Net: A Comprehensive Study on Deep Learning for Audio Classification in Microcontrollers"
Filter pruning techniques of convolutional neural networks implemented with the Darknet framework.
Official Pytorch implementation of "Filter Pruning by Image Channel Reduction in Pre-Trained Convolutional Neural Networks".
Code for the paper "Discrete cosine transform for filter pruning"
An easy way to conduct filter pruning for convolutional and fully connected layers
Constraint-Aware Importance Estimation for Global Filter Pruning under Multiple Resource Constraints (CVPRW2020)
Filter-level pruning of the portrait matting model MODNet, combining adaptive and fixed-ratio strategies.
In the human synaptic system, two types of neurotransmitters, excitatory and inhibitory, transmit signals from a neuron to a cell. Adopting this neuroscientific perspective, we propose a synapse-inspired filter pruning method.
[Master Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
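Many of the repositories above rank convolutional filters by a magnitude criterion and discard the weakest ones. A minimal NumPy sketch of that common baseline, L1-norm filter selection, is shown below; the function name and `keep_ratio` parameter are illustrative, not taken from any repository in this list.

```python
import numpy as np

def select_filters_l1(weight, keep_ratio=0.5):
    """Rank conv filters by L1 norm and return sorted indices of those to keep.

    weight: array of shape (out_channels, in_channels, kH, kW),
    i.e. one filter per output channel, as in a standard conv layer.
    """
    # L1 norm of each filter: sum of absolute weights over all but axis 0.
    norms = np.abs(weight).sum(axis=(1, 2, 3))
    n_keep = max(1, int(round(keep_ratio * weight.shape[0])))
    # Indices of the n_keep largest-norm filters, returned in channel order.
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return keep

# Toy example: 4 filters; filters 1 and 3 carry all the weight magnitude.
w = np.zeros((4, 3, 3, 3))
w[1] = 1.0
w[3] = 0.5
kept = select_filters_l1(w, keep_ratio=0.5)
print(kept)  # -> [1 3]
```

Pruning a real network additionally requires slicing the kept output channels out of the layer's weight tensor and the matching input channels out of the next layer, then fine-tuning to recover accuracy.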