Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
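As a rough illustration of what such sparsification libraries automate, global magnitude pruning zeroes out the smallest-magnitude weights until a target sparsity is reached. A minimal sketch, not tied to any specific project listed here:

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (global magnitude pruning).

    weights: flat list of floats; sparsity: fraction in [0, 1] to remove.
    """
    flat = sorted(abs(w) for w in weights)
    k = int(len(flat) * sparsity)  # number of weights to prune
    threshold = flat[k - 1] if k > 0 else float("-inf")
    # Keep weights strictly above the threshold; ties at the threshold are pruned.
    return [w if abs(w) > threshold else 0.0 for w in weights]

# Pruning half of a toy weight vector:
pruned = magnitude_prune([0.1, -0.5, 0.05, 2.0], sparsity=0.5)
# → [0.0, -0.5, 0.0, 2.0]
```

Real libraries layer schedules (gradual sparsity ramps), structured masks, and retraining on top of this basic criterion.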
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
A curated list for Efficient Large Language Models
Automated Identification of Redundant Layer Blocks for Pruning in Large Language Models
Model optimizer used in Adlik.
[AAAI 2024] Fluctuation-based Adaptive Structured Pruning for Large Language Models
Experiments for channel-based Structured Pruning Adapters
Hierarchical Ensemble Pruning
[NAACL Findings 2024] Pruning as a Domain-specific LLM Extractor. Support LLaMA2.
Caffe/Neon prototxt training file for our Neurocomputing2017 work: Fuzzy Quantitative Deep Compression Network
[JCST 2023] "Inductive Lottery Ticket Learning for Graph Neural Networks" by Yongduo Sui, Xiang Wang, Tianlong Chen, Meng Wang, Xiangnan He, Tat-Seng Chua.
This repository provides a solution to the travelling salesman problem
Official code for "EC-SNN: Splitting Deep Spiking Neural Networks on Edge Devices" (IJCAI2024)
Attention pruning
Project code developed to accompany a bachelor thesis for the BSc Data Science and Artificial Intelligence programme at Universiteit Maastricht. It consists of (re-)discovering forbidden minors for treewidth through a series of graph search/analysis techniques.
KEN: Unleash the power of large language models with the easiest and universal non-parametric pruning algorithm
[PRL 2024] This is the code repo for our label-free pruning and retraining technique for autoregressive Text-VQA Transformers (TAP, TAP†).