Awesome Knowledge Distillation
Updated Jun 10, 2025
Go from images to inference with no labeling: use foundation models to train supervised models.
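For context, the loop such tools automate looks roughly like the sketch below: a foundation model pseudo-labels unlabeled images, and a small supervised student trains on those labels. Everything here is an illustrative stand-in (`zero_shot_label`, the class list, the toy student), not any particular library's API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

CLASSES = ["cat", "dog"]  # assumed label set for illustration

def zero_shot_label(image: torch.Tensor) -> int:
    """Hypothetical stand-in for a foundation model's zero-shot prediction."""
    return int(image.mean() > 0.5)  # placeholder logic only

# Small supervised student trained on the pseudo-labels.
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, len(CLASSES)))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

unlabeled = [torch.rand(3, 32, 32) for _ in range(64)]  # stand-in dataset
pseudo = [(img, zero_shot_label(img)) for img in unlabeled]  # auto-labeling step

for img, y in pseudo:
    logits = student(img.unsqueeze(0))
    loss = F.cross_entropy(logits, torch.tensor([y]))
    opt.zero_grad(); loss.backward(); opt.step()
```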
🚀 PyTorch Implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
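The core idea of progressive distillation (Salimans & Ho, 2022) is that one student step is trained to match two consecutive teacher steps, halving the sampler's step count each round. A minimal sketch with the denoisers and step function stubbed out; the real method uses a UNet and the paper's v-parameterization:

```python
import torch
import torch.nn as nn

# Stub denoisers; placeholders for real diffusion models.
teacher = nn.Linear(8, 8)
student = nn.Linear(8, 8)
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

def ddim_step(model, x, t, t_next):
    """Hypothetical deterministic sampler step from time t to t_next."""
    return x + (t_next - t) * model(x)  # placeholder dynamics only

for _ in range(100):
    x = torch.randn(4, 8)             # noisy sample at time t
    t, t_mid, t_next = 1.0, 0.5, 0.0  # student jumps t -> t_next in one step
    with torch.no_grad():             # teacher takes two half-steps
        target = ddim_step(teacher, ddim_step(teacher, x, t, t_mid), t_mid, t_next)
    pred = ddim_step(student, x, t, t_next)
    loss = (pred - target).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
# After convergence the student becomes the new teacher and steps halve again.
```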
Mechanistically interpretable neurosymbolic AI (Nature Comput Sci 2024): losslessly compressing NNs to computer code and discovering new algorithms that generalize out-of-distribution and outperform human-designed ones.
PyTorch Implementation of Matching Guided Distillation [ECCV 2020]
Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)
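MiniLMv2 distills the relations inside self-attention (query-query, key-key, value-value) rather than the attention maps themselves, so teacher and student hidden sizes need not match. A rough sketch of one relation term; the function name, head count, and shapes are assumptions for illustration:

```python
import torch
import torch.nn.functional as F

def relation_kd(q_student, q_teacher, n_rel_heads=8):
    """One MiniLMv2-style relation-distillation term (query-query relations).
    Inputs are [batch, seq, hidden]; a sketch, not the repo's exact code."""
    def relations(q):
        b, s, h = q.shape
        d = h // n_rel_heads
        q = q.view(b, s, n_rel_heads, d).transpose(1, 2)  # [b, heads, seq, d]
        # Scaled dot-product relations, normalized over the sequence axis.
        return F.softmax(q @ q.transpose(-1, -2) / d**0.5, dim=-1)
    r_s, r_t = relations(q_student), relations(q_teacher)
    return F.kl_div(r_s.log(), r_t, reduction="batchmean")

# Relation matrices are [seq, seq] per head, so hidden sizes may differ:
loss = relation_kd(torch.randn(2, 16, 64), torch.randn(2, 16, 96))
```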
The Codebase for Causal Distillation for Language Models (NAACL '22)
AI community tutorial covering LoRA/QLoRA LLM fine-tuning, training GPT-2 from scratch, generative model architectures, content safety and control, model distillation techniques, DreamBooth, transfer learning, and more, with real projects for practice.
A framework for knowledge distillation that runs teacher-network inference with TensorRT.
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
A Segmentation-guided Box Teacher-Student Approach for Weakly Supervised Road Segmentation
The Codebase for Causal Distillation for Task-Specific Models
A simple implementation of model distillation using PyTorch on the CIFAR-10 dataset.
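For reference, the classic objective such a repo typically implements is Hinton-style soft-target matching plus hard-label cross-entropy. The temperature and mixing weight below are common defaults, not necessarily this repo's settings:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD: KL between softened distributions plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In training, the teacher's logits are computed under torch.no_grad() and only the student's parameters are updated.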
Awesome Deep Model Compression
Use LLaMA to label data for training a fine-tuned LLM.
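The general recipe: prompt an instruction-tuned model for a label and keep the (text, label) pairs as supervision for a smaller model. A hedged sketch using Hugging Face transformers; the checkpoint name, prompt format, and task are illustrative, not this repo's:

```python
from transformers import pipeline

# Illustrative checkpoint; any local instruction-tuned model works.
generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")

def label(text: str) -> str:
    prompt = (
        "Classify the sentiment of this review as positive or negative.\n"
        f"Review: {text}\nSentiment:"
    )
    out = generator(prompt, max_new_tokens=3, do_sample=False)[0]["generated_text"]
    return out[len(prompt):].strip().lower()  # keep only the generated label

# The resulting pairs become training data for a smaller fine-tuned model.
dataset = [(t, label(t)) for t in ["Great movie!", "Waste of time."]]
```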
Autodistill Google Cloud Vision module for training a custom, fine-tuned model.
Use AWS Rekognition to train custom models that you own.
[Master's Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
Zero-data black-box machine translation model distillation / stealing
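The recipe behind black-box distillation/stealing: generate synthetic source text, query the victim API for translations, and train a student on the returned pairs, with no access to real parallel data. A sketch with the victim API stubbed out as a hypothetical function:

```python
import random

def blackbox_translate(src: str) -> str:
    """Hypothetical stand-in for the black-box MT endpoint being distilled."""
    return src[::-1]  # placeholder "translation" for the sketch

def sample_synthetic_sentence(vocab, max_len=8) -> str:
    """Draw a random source sentence; no real training data is needed."""
    return " ".join(random.choices(vocab, k=random.randint(3, max_len)))

vocab = ["the", "cat", "runs", "fast", "home"]
pairs = []
for _ in range(1000):
    src = sample_synthetic_sentence(vocab)
    pairs.append((src, blackbox_translate(src)))  # query the victim model
# `pairs` now serves as parallel data for training a student seq2seq model.
```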
Model distillation of CNNs for classification of Seafood Images in PyTorch