Caffe/Neon prototxt training file for our Neurocomputing 2017 work: Fuzzy Quantitative Deep Compression Network
attention pruning
Hierarchical Ensemble Pruning
Model optimizer used in Adlik.
Project code developed to accompany a thesis in the bachelor programme BSc Data Science and Artificial Intelligence at Universiteit Maastricht. It consists of (re-)discovering Forbidden Minors for Treewidth through a series of graph search/analysis techniques.
This repository aims to provide a solution to the travelling salesman problem
[JCST 2023] "Inductive Lottery Ticket Learning for Graph Neural Networks" by Yongduo Sui, Xiang Wang, Tianlong Chen, Meng Wang, Xiangnan He, Tat-Seng Chua.
[AAAI 2024] Fluctuation-based Adaptive Structured Pruning for Large Language Models
Automated Identification of Redundant Layer Blocks for Pruning in Large Language Models
Official code for "EC-SNN: Splitting Deep Spiking Neural Networks on Edge Devices" (IJCAI2024)
Experiments for channel-based Structured Pruning Adapters
[PRL 2024] This is the code repo for our label-free pruning and retraining technique for autoregressive Text-VQA Transformers (TAP, TAP†).
KEN: Unleash the power of large language models with the easiest and universal non-parametric pruning algorithm
A curated list for Efficient Large Language Models
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
[NAACL Findings 2024] Pruning as a Domain-specific LLM Extractor. Support LLaMA2.
Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
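For readers new to the topic, the sketch below illustrates what weight pruning typically looks like in practice: a minimal, generic magnitude-based pruning pass written with PyTorch's built-in torch.nn.utils.prune utilities. The toy model and the 30% sparsity level are assumptions for illustration only; this is not the API or method of any repository listed above.

# Minimal sketch of magnitude-based (L1) unstructured pruning using
# PyTorch's torch.nn.utils.prune; toy model and 30% ratio are illustrative.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical toy model used only for demonstration.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Zero out the 30% of weights with the smallest L1 magnitude in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        # Fold the pruning mask into the weight tensor so the zeros are permanent.
        prune.remove(module, "weight")

# Report the resulting global sparsity.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"Global sparsity: {zeros / total:.1%}")

In real pipelines this zeroing step is usually followed by fine-tuning (retraining the remaining weights), which is the part most of the repositories above focus on.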