Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019)
Knowledge Distillation using TensorFlow
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
Code for knowledge distillation work to enhance fine-grained disease recognition.
Simple PyTorch code for knowledge distillation
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added).
An implementation of Frosst & Hinton's "Distilling a Neural Network Into a Soft Decision Tree"
My implementation of "Distilling the Knowledge in a Neural Network" on the CIFAR-10 dataset using PyTorch (see the sketch after this list).
A curated list of knowledge distillation code.
In search of an effective and efficient pipeline for distilling knowledge in convolutional neural networks.
A large-scale study of knowledge distillation.
Pipeline for training face recognition models (based on PyTorch 1.1)
An attempt to transfer knowledge from one model to another using noise.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
A PyTorch / PyTorch Lightning framework for experimenting with knowledge distillation on image classification problems
Knowledge Distillation Toolkit
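
Most of the repositories above build on the core recipe from Hinton et al.'s "Distilling the Knowledge in a Neural Network": train the student against the teacher's temperature-softened output distribution in addition to the hard labels. Below is a minimal PyTorch sketch of that loss for orientation; the function name and the `T`/`alpha` defaults are illustrative assumptions, not values taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation loss (illustrative sketch, not from a listed repo)."""
    # Soften both distributions with temperature T. Scaling the soft term
    # by T^2 keeps its gradient magnitude comparable to the hard loss,
    # as suggested in the original paper.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Typical usage (teacher, student, x, y are hypothetical models/tensors):
# the teacher is frozen, so its forward pass needs no gradients.
#
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
```

Many of the variants listed (activation boundaries, adversarial samples, teacher assistants, feature distillation) replace or augment the soft-target term above while keeping this same student/teacher training loop.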