This repo collects recent papers on speech model compression, with a focus on distillation and pruning of self-supervised speech representation models. Suggestions of other relevant papers are welcome!
- [INTERSPEECH 2023] [arXiv] Task-Agnostic Structured Pruning of Speech Representation Models
- [INTERSPEECH 2023] [arXiv] [code] DPHuBERT: Joint Distillation and Pruning of Self-Supervised Speech Models
- [ICASSP 2023] [arXiv] Structured Pruning of Self-Supervised Pre-Trained Models for Speech Recognition and Understanding
- [INTERSPEECH 2022] [arXiv] [code] LightHuBERT: Lightweight and Configurable Speech Representation Learning with Once-for-All Hidden-Unit BERT
- [INTERSPEECH 2022] [arXiv] Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models
- [INTERSPEECH 2022] [arXiv] [code] FitHuBERT: Going Thinner and Deeper for Knowledge Distillation of Speech Self-Supervised Learning
- [ICASSP 2022] [arXiv] [code] DistilHuBERT: Speech Representation Learning by Layer-wise Distillation of Hidden-unit BERT
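
Several of the distillation papers above (e.g. DistilHuBERT, FitHuBERT) train a small student to regress the teacher's hidden representations, typically combining an L1 distance with a cosine-similarity term. A minimal, simplified sketch of such a layer-wise distillation loss (plain Python lists stand in for feature tensors; the names and the exact weighting are illustrative, not any paper's actual implementation):

```python
import math

def l1_distance(student: list[float], teacher: list[float]) -> float:
    """Mean absolute difference between two feature vectors."""
    return sum(abs(s - t) for s, t in zip(student, teacher)) / len(student)

def cosine_similarity(student: list[float], teacher: list[float]) -> float:
    """Cosine similarity between two feature vectors."""
    dot = sum(s * t for s, t in zip(student, teacher))
    norm_s = math.sqrt(sum(s * s for s in student))
    norm_t = math.sqrt(sum(t * t for t in teacher))
    return dot / (norm_s * norm_t)

def distillation_loss(student: list[float], teacher: list[float],
                      sim_weight: float = 1.0) -> float:
    """Illustrative layer-wise loss: minimize L1 distance while maximizing
    cosine similarity (higher similarity lowers the loss)."""
    return l1_distance(student, teacher) - sim_weight * cosine_similarity(student, teacher)
```

In practice the student attaches one prediction head per distilled teacher layer and sums this loss over layers; papers differ in the weighting and in the exact form of the similarity term.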