# Knowledge-distillation-example

Simple PyTorch code implementing several knowledge-distillation methods.

## Supported methods

- KD (Knowledge Distillation)
- AT (Attention Transfer)
- DML (Deep Mutual Learning)

## Backbone

For KD and AT, ResNet20 is the student network and ResNet56 is the teacher network.

For DML, both student networks are ResNet20.
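For the teacher–student pairs above, the standard KD objective combines a hard cross-entropy term with a temperature-softened KL term. A minimal sketch (Hinton-style distillation; the function name, temperature, and weighting are illustrative assumptions, not the repository's exact code):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD: softened KL toward the teacher plus hard CE.

    T and alpha are typical values, not taken from this repo.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # scale by T^2 to keep gradient magnitudes comparable
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Here the teacher logits would come from the frozen ResNet56 and the student logits from ResNet20.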

## Train

```shell
python train.py -m student -gpu 1
```
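The command above selects a training mode with `-m` and a GPU index with `-gpu`. A hypothetical sketch of how `train.py` might parse these flags (the choices list is an assumption based on the methods in this README, not the script's actual argument set):

```python
import argparse

parser = argparse.ArgumentParser(description="Train a distillation model")
# Flag names mirror the command above; the available modes are assumed.
parser.add_argument("-m", "--mode", default="student",
                    choices=["student", "teacher", "kd", "at", "dml"])
parser.add_argument("-gpu", type=int, default=0, help="GPU index to use")

# Parsing the example command's arguments:
args = parser.parse_args(["-m", "student", "-gpu", "1"])
print(args.mode, args.gpu)
```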

## Dataset

CIFAR-10

## Result

| Metric | Raw ResNet20 | Raw ResNet56 | KD     | AT     | DML    |
| ------ | ------------ | ------------ | ------ | ------ | ------ |
| Top-1  | 91.030       | 92.257       | 91.723 | 91.822 | 91.574 |
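The DML column above comes from training two ResNet20 students that teach each other. A minimal sketch of the mutual-learning loss (function name and details are illustrative, not the repository's exact implementation):

```python
import torch
import torch.nn.functional as F

def dml_losses(logits1, logits2, labels):
    """Deep Mutual Learning: each network minimizes CE plus KL toward its peer."""
    log_p1 = F.log_softmax(logits1, dim=1)
    log_p2 = F.log_softmax(logits2, dim=1)
    # Peer targets are detached so each KL term only updates one network.
    loss1 = F.cross_entropy(logits1, labels) + F.kl_div(
        log_p1, log_p2.detach(), reduction="batchmean", log_target=True)
    loss2 = F.cross_entropy(logits2, labels) + F.kl_div(
        log_p2, log_p1.detach(), reduction="batchmean", log_target=True)
    return loss1, loss2
```

Each loss is backpropagated through its own network, so the two students converge together rather than one imitating a fixed teacher.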
