
Knowledge-Distillation

Implementation of the paper "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015)
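
The core of the method is training the student on a weighted mix of the teacher's temperature-softened outputs and the true hard labels. Below is a minimal sketch of such a loss in PyTorch; the function name and the values of the temperature `T` and mixing weight `alpha` are illustrative and may differ from what this repository actually uses.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD loss: soft teacher targets mixed with hard-label CE.

    T and alpha are illustrative hyperparameters, not necessarily the ones
    used in this repository.
    """
    # Soft-target term: KL divergence between temperature-softened student
    # and teacher distributions. The T**2 factor keeps its gradient
    # magnitude comparable to the hard-label term, as noted in the paper.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    # Hard-target term: standard cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)

    return alpha * soft + (1.0 - alpha) * hard
```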

Results on MNIST

Test set size = 10000

| Model                   | Accuracy |
| ----------------------- | -------- |
| Teacher model           | 0.9847   |
| Distilled student model | 0.9810   |

Compression ratio = 2

About

Vanilla knowledge distillation in PyTorch
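
A typical training step with this kind of loss looks roughly like the sketch below. The model, optimizer, and data loader names are placeholders rather than the exact objects defined in this repository, and `distillation_loss` refers to the loss sketched above.

```python
import torch

def train_student_epoch(student, teacher, train_loader, optimizer, device="cpu"):
    """One epoch of distillation training; all argument names are illustrative."""
    teacher.eval()
    student.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)

        # The teacher only provides soft targets, so no gradients are needed.
        with torch.no_grad():
            teacher_logits = teacher(images)

        student_logits = student(images)
        loss = distillation_loss(student_logits, teacher_logits, labels)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```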
