- Data Science and Analytic Thrust, Information Hub, HKUST(GZ)
- Guangzhou
- https://www.zhihu.com/people/peijieDong
- https://pprp.github.io
- https://scholar.google.com/citations?user=TqS6s4gAAAAJ
Distill
Code for Teacher Guided Search for Architectures by Generation and Evaluation (TG-SAGE)
Multi-fidelity Neural Architecture Search with Knowledge Distillation
Distilling Knowledge via Knowledge Review, CVPR 2021
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at TPAMI, CVPR, ICLR, ECCV, NeurIPS, ICCV, AAAI, etc.
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
PyTorch implementation of various knowledge distillation (KD) methods.
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
Awesome Knowledge Distillation
Awesome Knowledge-Distillation for CV
Implementation of CVPR 2019 paper: Distilling Object Detectors with Fine-grained Feature Imitation
Code for Feature Fusion for Online Mutual Knowledge Distillation
PyTorch implementation for OD-cheap-convolution.
[ECCV'20 Oral] MutualNet: Adaptive ConvNet via Mutual Learning from Network Width and Resolution
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
[CVPR-2022] Official implementation for "Knowledge Distillation with the Reused Teacher Classifier".
Official implementation of CIRKD: Cross-Image Relational Knowledge Distillation for Semantic Segmentation, with experiments on Cityscapes, ADE20K, COCO-Stuff, Pascal VOC, and CamVid.
Code for ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".
A research project on metaparameter optimization in knowledge distillation
Official implementation of paper "Knowledge Distillation from A Stronger Teacher", NeurIPS 2022
The repo for "Balanced Multimodal Learning via On-the-fly Gradient Modulation", CVPR 2022 (ORAL)
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
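
Most repositories in this list build on the same baseline recipe: train a student network to match a teacher's temperature-softened output distribution in addition to the usual hard labels (vanilla knowledge distillation, Hinton et al.). The sketch below is a generic PyTorch illustration of that baseline loss only, not the method of any specific repository above; the function name and the temperature/alpha values are assumptions chosen for the example.

```python
# Minimal sketch of the vanilla knowledge distillation loss.
# All names and hyperparameters here are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.9):
    """Blend temperature-softened KL divergence with standard cross-entropy."""
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Hard-label cross-entropy on the ground-truth targets.
    ce = F.cross_entropy(student_logits, targets)
    return alpha * distill + (1.0 - alpha) * ce

# Toy usage: random logits for a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
targets = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, targets)
```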





