- 0. Overview
- 1. When to prune
- 2. Learning and Pruning
- 3. Application
- 4. Combination
- 5. Survey of Pruning
- 6. Other Works
- Acknowledgements
This repo provides ongoing updates of representative neural network pruning papers and open-source code.
Our paper [A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations] (Paper Link) is under review.
Taxonomy: In our survey, we provide a comprehensive review of the state-of-the-art in deep neural network pruning, which we categorize along five orthogonal axes: Universal/Specific Speedup, When to Prune, Pruning Criteria, Learn to Prune, and Fusion of Pruning and Other Techniques.
| Type | L | F | C | N | H | W | P | Other |
|---|---|---|---|---|---|---|---|---|
| Explanation | Layer pruning | Filter pruning | Channel pruning | Neuron pruning | Head pruning | Weight pruning | Pioneer | Other types |
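To make the `W` (weight) and `F` (filter) types above concrete, here is a minimal NumPy sketch (the helper names are ours, not from any listed paper): unstructured weight pruning zeroes the smallest-magnitude individual weights, while structured filter pruning zeroes entire output filters ranked by L1 norm.

```python
import numpy as np

def weight_prune(w, sparsity):
    """Unstructured ('W') pruning: zero the smallest-magnitude weights."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w), axis=None)[k - 1]  # k-th smallest |weight|
    return np.where(np.abs(w) <= thresh, 0.0, w)

def filter_prune(w, sparsity):
    """Structured ('F') pruning: zero whole output filters by L1 norm.
    w has shape (out_channels, in_channels, kH, kW)."""
    norms = np.abs(w).reshape(w.shape[0], -1).sum(axis=1)  # L1 norm per filter
    k = int(sparsity * w.shape[0])
    pruned = w.copy()
    if k:
        drop = np.argsort(norms)[:k]  # filters with the smallest L1 norm
        pruned[drop] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 4, 3, 3))        # a toy conv weight tensor
ws = weight_prune(w, 0.5)                # half the scalar weights zeroed
fs = filter_prune(w, 0.5)                # half the filters zeroed wholesale
print((ws == 0).mean())                  # fraction of zeroed weights, ~0.5
print((fs.reshape(8, -1) == 0).all(axis=1).sum())  # number of all-zero filters
```

The `F`/`C` structured variants remove compute-friendly blocks (whole filters or channels) and thus speed up dense hardware directly, while `W` pruning needs sparse kernels to realize speedups.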
| No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|---|
| 01 | A Fast Post-Training Pruning Framework for Transformers | NeurIPS | HF | - | PyTorch(Author) | Natural Language Understanding | 2022 |
| No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|---|
| 01 | SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot | ICML | W | - | PyTorch(Author) | Language Modeling | 2023 |
| 02 | Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning | arXiv | W | LoRAPrune | - | Image Classification&Language Modeling | 2023 |
| 03 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | LHP | LLM-Pruner | PyTorch(Author) | Language Modeling | 2023 |
| 04 | Parameter-Efficient Sparsity for Large Language Models Fine-Tuning | arXiv | W | PST | PyTorch(Author) | Language Modeling | 2022 |
| No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|---|
| 01 | Linear Mode Connectivity and the Lottery Ticket Hypothesis | ICML | W | - | - | Image Classification | 2020 |
| 02 | When To Prune? A Policy Towards Early Structural Pruning | CVPR | F | PaT | - | Image Classification | 2022 |
| 03 | Drawing Early-Bird Tickets: Towards More Efficient Training of Deep Networks | ICLR | W | - | PyTorch(Author) | Image Classification | 2020 |
| No. | Title | Venue | Type | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|---|
| 01 | Runtime Neural Pruning | NeurIPS | F | RNP | - | Image Classification | 2017 |
| 02 | Channel Gating Neural Networks | NeurIPS | C | CGNet | PyTorch(Author) | Image Classification | 2019 |
| 03 | Dynamic Dual Gating Neural Networks | ICCV | C | DGNet | PyTorch(Author) | Image Classification | 2021 |
| 04 | Manifold Regularized Dynamic Network Pruning | CVPR | F | ManiDP | PyTorch(Author) | Image Classification | 2021 |
| 05 | Dynamic Channel Pruning: Feature Boosting and Suppression | ICLR | C | FBS | PyTorch(Author) | Image Classification | 2019 |
| 06 | Frequency-Domain Dynamic Pruning for Convolutional Neural Networks | NeurIPS | F | FDNP | - | Image Classification | 2019 |
| 07 | Fire Together Wire Together: A Dynamic Pruning Approach With Self-Supervised Mask Prediction | CVPR | F | - | - | Image Classification | 2022 |
| 08 | Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning | CVPR | WF | CDG | - | Image Classification | 2022 |
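The dynamic-pruning papers above share one mechanism: a lightweight, input-dependent gate decides per example which channels to actually compute, rather than fixing a sparsity pattern at training time. A toy NumPy sketch of that idea (our own illustrative names, not any specific paper's method):

```python
import numpy as np

def channel_gate(features, gate_w, keep_ratio=0.5):
    """Toy dynamic channel gating.

    features: one feature map of shape (C, H, W).
    gate_w:   a tiny (C, C) linear gate scoring channels from their means.
    Returns the gated feature map and the binary keep-mask; channels the
    gate scores lowest are skipped (zeroed) for this particular input.
    """
    C = features.shape[0]
    scores = gate_w @ features.mean(axis=(1, 2))  # per-channel saliency score
    k = max(1, int(keep_ratio * C))
    keep = np.argsort(scores)[-k:]                # top-k scored channels
    mask = np.zeros(C)
    mask[keep] = 1.0
    return features * mask[:, None, None], mask

rng = np.random.default_rng(1)
f = rng.normal(size=(6, 4, 4))                    # one input's feature map
gw = rng.normal(size=(6, 6))
out, mask = channel_gate(f, gw, keep_ratio=0.5)   # 3 of 6 channels survive
```

Because the mask is recomputed per input, average compute drops while capacity is preserved for "hard" inputs; in the real methods the gate is trained jointly with the network (often with a sparsity regularizer or self-supervised mask prediction).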
| No. | Title | Venue | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|
| 01 | Learning Bayesian Sparse Networks With Full Experience Replay for Continual Learning | CVPR | SNCL | - | Image Classification | 2022 |
| 02 | Continual Prune-and-Select: Class-Incremental Learning with Specialized Subnetworks | Applied Intelligence | - | PyTorch(Author) | Image Classification | 2023 |
| No. | Title | Venue | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|
| 01 | Studying the impact of magnitude pruning on contrastive learning methods | ICML | - | PyTorch(Author) | Image Classification | 2020 |
| No. | Title | Venue | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|
| 01 | FedDUAP: Federated Learning with Dynamic Update and Adaptive Pruning Using Shared Data on the Server | IJCAI | FedDUAP | - | Image Classification | 2022 |
| 02 | Model Pruning Enables Efficient Federated Learning on Edge Devices | TNNLS | - | PyTorch(Author) | Image Classification | 2022 |
| No. | Title | Venue | Code | APP | Year |
|---|---|---|---|---|---|
| 01 | SuperTickets: Drawing Task-Agnostic Lottery Tickets from Supernets via Jointly Architecture Searching and Parameter Pruning | ECCV | PyTorch(Author) | Image Classification&Object Detection&Human Pose Estimation | 2022 |
| 02 | Training Neural Networks with Fixed Sparse Masks | NeurIPS | PyTorch(Author) | Image Classification | 2021 |
| 03 | Deep Rewiring: Training very Sparse Deep Networks | ICLR | - | Image Classification&Audio | 2018 |
| 04 | Co-Evolutionary Compression for Unpaired Image Translation | ICCV | PyTorch(Author) | Image Style Translation | 2019 |
| 05 | Content-Aware GAN Compression | CVPR | PyTorch(Author) | Image Style Translation | 2021 |
| 06 | Vision Transformer Slimming: Multi-Dimension Searching in Continuous Optimization Space | CVPR | PyTorch(Author) | Image Classification&Audio | 2022 |
| No. | Title | Venue | Code | APP | Year |
|---|---|---|---|---|---|
| 01 | A Fast Post-Training Pruning Framework for Transformers | NeurIPS | PyTorch(Author) | Natural Language Understanding | 2022 |
| 02 | The Lottery Ticket Hypothesis for Pre-trained BERT Networks | NeurIPS | PyTorch(Author) | Language Modeling | 2020 |
| 03 | When BERT Plays the Lottery, All Tickets Are Winning | EMNLP | PyTorch(Author) | Language Modeling | 2020 |
| 04 | Structured Pruning Learns Compact and Accurate Models | ACL | PyTorch(Author) | Natural Language Understanding | 2022 |
| 05 | The Optimal BERT Surgeon: Scalable and Accurate Second-Order Pruning for Large Language Models | EMNLP | PyTorch(Author) | Natural Language Understanding | 2022 |
| 06 | Pruning Meets Low-Rank Parameter-Efficient Fine-Tuning | arXiv | - | Image Classification&Language Modeling | 2023 |
| 07 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | - | Language Modeling | 2023 |
| No. | Title | Venue | Code | APP | Year |
|---|---|---|---|---|---|
| 01 | Exploring Sparsity in Recurrent Neural Networks | ICLR | PyTorch | Speech Recognition | 2017 |
| 02 | Deep Rewiring: Training very Sparse Deep Networks | ICLR | - | Image Classification&Audio | 2018 |
| No. | Title | Venue | Code | APP | Year |
|---|---|---|---|---|---|
| 01 | Accelerating Sparse Deep Neural Networks | arXiv | - | Image Classification&Object Detection&Language Translation&Language Modeling&Image Synthesis&Domain Translation&Style Transfer&Image-Image Translation&Super Resolution | 2021 |
| 02 | LLM-Pruner: On the Structural Pruning of Large Language Models | arXiv | PyTorch | Causal Language Modeling | 2023 |
| 03 | Deep Model Compression Based on the Training History | arXiv | - | Image Classification | 2022 |
| 04 | OPQ: Compressing Deep Neural Networks with One-shot Pruning-Quantization | AAAI | - | Image Classification | 2021 |
| No. | Title | Venue | Algorithm Name | Code | APP | Year |
|---|---|---|---|---|---|---|
| 01 | Are All Layers Created Equal? | JMLR | - | - | Image Classification | 2022 |
| 02 | Is Pruning Compression?: Investigating Pruning Via Network Layer Similarity | WACV | - | - | Image Classification | 2020 |
| 03 | A Gradient Flow Framework For Analyzing Network Pruning | ICLR | - | PyTorch(Author) | Image Classification | 2021 |
We would like to express our gratitude to the authors of the articles cited in our survey and the authors of the following repositories:
- https://github.com/airaria/TextPruner
- https://github.com/he-y/awesome-Pruning/
- https://github.com/MingSun-Tse/Awesome-Pruning-at-Initialization
- https://github.com/csyhhu/Awesome-Deep-Neural-Network-Compression/blob/master/Paper/Pruning.md
If you find this project useful, please cite:

    @article{cheng2023survey,
      title={A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations},
      author={Hongrong Cheng and Miao Zhang and Javen Qinfeng Shi},
      journal={arXiv preprint arXiv:2308.06767},
      year={2023}
    }