Pruning neural networks directly with back-propagation
This repository contains a PyTorch implementation of the article "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" and an application of this hypothesis to reinforcement learning.
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning, IEEE Transactions on Knowledge and Data Engineering 2024
Code for the project "SNIP: Single-Shot Network Pruning"
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Implementation of AutoSlim using TensorFlow 2
[ICCV 2017] Learning Efficient Convolutional Networks through Network Slimming
Sparse variational dropout in TensorFlow 2
PyTorch implementation of our paper (TNNLS): Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
Channel-Prioritized Convolutional Neural Networks for Sparsity and Multi-fidelity
Reducing the computational overhead of Deep CNNs through parameter pruning and tensor decomposition.
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)
Cheng-Hao Tu, Jia-Hong Lee, Yi-Ming Chan and Chu-Song Chen, "Pruning Depthwise Separable Convolutions for MobileNet Compression," International Joint Conference on Neural Networks, IJCNN 2020, July 2020.
[NIPS 2016] Learning Structured Sparsity in Deep Neural Networks
Improved Implementation of Single Shot MultiBox Detector, RefineDet and Network Optimization in Pytorch 07/2018
Lookahead: A Far-sighted Alternative of Magnitude-based Pruning (ICLR 2020)
Sparse training, group channel pruning, and knowledge distillation for YOLOv4
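Several of the repositories above (e.g. the Lottery Ticket and Lookahead implementations) build on magnitude-based pruning: weights with the smallest absolute values are zeroed out, leaving a sparse mask. As a rough illustration only (not code from any of the listed repositories), a minimal NumPy sketch of unstructured magnitude pruning might look like this; the function name and signature are hypothetical:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    weights:  array of parameters from one layer
    sparsity: fraction in [0, 1) of weights to remove
    Returns the pruned weights and the boolean keep-mask.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)          # number of weights to prune
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    # Threshold = k-th smallest absolute value; keep strictly larger weights.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask
```

In iterative schemes such as the Lottery Ticket procedure, a mask like this is recomputed after each training round and applied to the (rewound) weights before retraining.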