Code for CPAL-2024 paper "Continual Learning with Dynamic Sparse Training: Exploring Algorithms for Effective Model Updates"
Public code for the ML course
Research on improving the efficiency of finding lottery tickets.
Simple world models lead to good abstractions. Google Cerebra internship 2020 / master's thesis at EPFL LCN 2021 ⬛◼️▪️🔦
Use a meta-network to learn the importance and correlation of neural network weights
Sparseout: Controlling Sparsity in Deep Networks
[RepL4NLP 2021] Hierarchical Sparse Variational Autoencoder (HSVAE)
Sliding Filter for AWGN Denoising
Molecular-property prediction with sparsity
[QCE 2023] "QuantumSEA: In-Time Sparse Exploration for Noise Adaptive Quantum Circuits" by Tianlong Chen, Zhenyu Zhang, Hanrui Wang, Jiaqi Gu, Zirui Li, David Z Pan, Frederic T Chong, Song Han, Zhangyang Wang.
[JCST 2023] "Inductive Lottery Ticket Learning for Graph Neural Networks" by Yongduo Sui, Xiang Wang, Tianlong Chen, Meng Wang, Xiangnan He, Tat-Seng Chua.
Official implementation of the paper "HyperSparse Neural Networks: Shifting Exploration to Exploitation through Adaptive Regularization"
Code for the paper "FOCIL: Finetune-and-Freeze for Online Class-Incremental Learning by Training Randomly Pruned Sparse Experts"
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
3D Loss Landscapes of SoftNet (Sparse Subnetwork)
Official PyTorch training code of Accelerating Deep Neural Networks via Semi-Structured Activation Sparsity (ICCV2023-RCV)
A supervised autoencoder with structured sparsity for efficient and informed clinical prognosis.
A pure SciPy implementation of a sparse denoising autoencoder with adaptive evolutionary training. The sparse implementation makes the algorithm scalable to high-dimensional data and trainable on CPUs.
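The entry above describes adaptive evolutionary training of a sparse network, in the spirit of Sparse Evolutionary Training (SET). Below is a minimal sketch of the core prune-and-regrow step using `scipy.sparse`, assuming magnitude-based pruning and random regrowth; the function names and hyperparameters are illustrative and not taken from that repository:

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

def random_sparse_weights(n_in, n_out, density=0.05):
    """Initialize a randomly connected sparse weight matrix (CSR format)."""
    return sparse.random(n_in, n_out, density=density, random_state=0, format="csr")

def evolve(W, zeta=0.3):
    """One SET-style evolution step: prune the zeta fraction of weights with the
    smallest magnitude, then regrow the same number of connections at random
    new positions, keeping the overall density constant."""
    W = W.tocoo()
    k = int(zeta * W.nnz)
    if k == 0:
        return W.tocsr()
    keep = np.argsort(np.abs(W.data))[k:]                 # survivors: largest |w|
    rows, cols, data = W.row[keep], W.col[keep], W.data[keep]
    occupied = set(zip(rows.tolist(), cols.tolist()))
    new_rows, new_cols = [], []
    while len(new_rows) < k:                              # regrow k new links
        r = int(rng.integers(W.shape[0]))
        c = int(rng.integers(W.shape[1]))
        if (r, c) not in occupied:
            occupied.add((r, c))
            new_rows.append(r)
            new_cols.append(c)
    new_data = rng.normal(scale=0.01, size=k)             # small random re-init
    return sparse.coo_matrix(
        (np.concatenate([data, new_data]),
         (np.concatenate([rows, np.array(new_rows)]),
          np.concatenate([cols, np.array(new_cols)]))),
        shape=W.shape).tocsr()

W = random_sparse_weights(784, 128)
W2 = evolve(W)
assert W2.nnz == W.nnz  # density is preserved across the evolution step
```

Because the weights stay in a sparse format throughout, memory and compute scale with the number of connections rather than the dense layer size, which is what makes CPU training on high-dimensional data feasible.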