There are 27 repositories under the network-pruning topic.
[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
Rethinking the Value of Network Pruning (PyTorch) (ICLR 2019)
Collection of recent methods on (deep) neural network compression and acceleration.
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model or dataset.
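The core loop the lottery-ticket repositories implement is iterative magnitude pruning with weight rewinding: train, remove the smallest-magnitude weights, reset the survivors to their original initialization, and repeat. A minimal NumPy sketch of one round (function names are illustrative and training itself is elided):

```python
import numpy as np

def magnitude_mask(weights, sparsity):
    """Binary mask that keeps the largest-|w| fraction (1 - sparsity) of weights."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * sparsity)            # number of weights to remove
    if k == 0:
        return np.ones_like(weights)
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    return (np.abs(weights) > threshold).astype(weights.dtype)

def lottery_ticket_round(trained_w, init_w, sparsity):
    """One iterative-magnitude-pruning round: prune the trained weights,
    then rewind the surviving weights to their initial values."""
    mask = magnitude_mask(trained_w, sparsity)
    return init_w * mask, mask
```

In the full procedure this round is applied several times with a modest per-round sparsity (e.g. 20%), retraining the masked network between rounds.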
[TPAMI 2023, NeurIPS 2020] Code release for "Deep Multimodal Fusion by Channel Exchanging"
Code for "Co-Evolutionary Compression for Unpaired Image Translation" (ICCV 2019), "SCOP: Scientific Control for Reliable Neural Network Pruning" (NeurIPS 2020) and "Manifold Regularized Dynamic Network Pruning" (CVPR 2021).
(CVPR 2021, Oral) Dynamic Slimmable Network
Efficient Sparse-Winograd Convolutional Neural Networks (ICLR 2018)
[NeurIPS 2023] Structural Pruning for Diffusion Models
Code for "EigenDamage: Structured Pruning in the Kronecker-Factored Eigenbasis" https://arxiv.org/abs/1905.05934
SNIP: Single-Shot Network Pruning Based on Connection Sensitivity
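SNIP scores every weight once, before any training, using the saliency |g_j · w_j| computed from a single mini-batch, then keeps only the top fraction. A hedged NumPy sketch on a toy linear-regression batch (the function name, toy data, and keep ratio are all illustrative):

```python
import numpy as np

def snip_mask(w, grad, keep_ratio):
    """SNIP-style saliency s_j = |g_j * w_j|; keep the top keep_ratio fraction."""
    saliency = np.abs(w * grad)
    k = max(1, int(round(keep_ratio * w.size)))
    thresh = np.sort(saliency.ravel())[-k]        # k-th largest saliency
    return (saliency >= thresh).astype(w.dtype)

# One mini-batch of a toy linear regression supplies the gradient.
rng = np.random.default_rng(0)
X, w = rng.normal(size=(32, 10)), rng.normal(size=10)
y = rng.normal(size=32)
grad = 2 * X.T @ (X @ w - y) / len(X)             # d/dw of mean squared error
mask = snip_mask(w, grad, keep_ratio=0.3)
```

The resulting mask is fixed, and the sparse network is then trained from scratch; no prune-retrain cycles are needed.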
[TPAMI 2024] This is the official repository for our paper: "Pruning Self-attentions into Convolutional Layers in Single Path".
CAE-ADMM: Implicit Bitrate Optimization via ADMM-Based Pruning in Compressive Autoencoders
[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
Lookahead: A Far-sighted Alternative of Magnitude-based Pruning (ICLR 2020)
TensorFlow code for "Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers"
This repository applies sparse training, group channel pruning, and knowledge distillation to YOLOv4.
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)
Cheng-Hao Tu, Jia-Hong Lee, Yi-Ming Chan and Chu-Song Chen, "Pruning Depthwise Separable Convolutions for MobileNet Compression," International Joint Conference on Neural Networks, IJCNN 2020, July 2020.
PyTorch implementation of our TNNLS paper "Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters"
Implementation of AutoSlim using TensorFlow 2
Improved PyTorch implementation of Single Shot MultiBox Detector (SSD), RefineDet, and network optimization, 07/2018
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" and an application of this hypothesis to reinforcement learning.
Reducing the computational overhead of Deep CNNs through parameter pruning and tensor decomposition.
[ICCV 2017] Learning Efficient Convolutional Networks through Network Slimming
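Network slimming trains with an L1 penalty on the BatchNorm scale factors γ and then removes the channels whose learned |γ| is smallest. The channel-selection step can be sketched in NumPy (function and variable names are illustrative, not the repository's API):

```python
import numpy as np

def slim_channels(gammas, prune_fraction):
    """Network-slimming style selection: rank channels by the magnitude of
    their BatchNorm scale factor and drop the smallest prune_fraction."""
    n_prune = int(len(gammas) * prune_fraction)
    order = np.argsort(np.abs(gammas))        # channels, least important first
    keep = np.ones(len(gammas), dtype=bool)
    keep[order[:n_prune]] = False             # mark the weakest channels pruned
    return keep
```

In the full method the kept channels define a physically smaller network, which is then fine-tuned to recover accuracy.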
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Pruning neural networks directly with back-propagation