
Knowledge Distillation with PyTorch

This repository integrates various knowledge distillation methods. The implementation is based on the following repositories:

Distillation Methods in this Repository

  • KD, FN, NST, AT, RKD, SP, OD (selected via the --distype option described below)

Dataset

  • CIFAR10, CIFAR100

Model

  • ResNet, WideResNet

Start Distillation

Requirements

  • Python3
  • PyTorch (> 1.0)
  • torchvision (> 0.2)
  • NumPy

Parser variable description

  • type : dataset type (cifar10 or cifar100)
  • model : network type (resnet or wideresnet)
  • depth : depth of the teacher/baseline network; sdepth : the same for the student network
  • wfactor : widening factor of the teacher/baseline WideResNet; swfactor : the same for the student network
  • tn : index of the training run for the teacher/baseline; stn : the same for the student network
  • distype : distillation method (KD, FN, NST, AT, RKD, SP, OD); a minimal sketch of these options follows this list
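For orientation, here is a minimal argparse sketch of the options listed above. The actual flag definitions live in the repository's train.py / distill.py; the defaults shown here are assumptions, not the repository's values.

```python
# Hypothetical sketch of the command-line options described above.
# Defaults are illustrative assumptions, not the repository's actual values.
import argparse

parser = argparse.ArgumentParser(description="Knowledge distillation options (sketch)")
parser.add_argument("--type", default="cifar100", choices=["cifar10", "cifar100"],
                    help="dataset type")
parser.add_argument("--model", default="resnet", choices=["resnet", "wideresnet"],
                    help="network type (teacher or baseline)")
parser.add_argument("--depth", type=int, default=110,
                    help="depth of the teacher/baseline network")
parser.add_argument("--sdepth", type=int, default=20,
                    help="depth of the student network")
parser.add_argument("--wfactor", type=int, default=1,
                    help="widening factor of the teacher/baseline WideResNet")
parser.add_argument("--swfactor", type=int, default=1,
                    help="widening factor of the student WideResNet")
parser.add_argument("--tn", type=int, default=1,
                    help="index of the teacher/baseline training run")
parser.add_argument("--stn", type=int, default=1,
                    help="index of the student training run")
parser.add_argument("--distype", default="KD",
                    choices=["KD", "FN", "NST", "AT", "RKD", "SP", "OD"],
                    help="distillation method")
args = parser.parse_args()
```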

Baseline training for the teacher network or the baseline student

  • Example: dataset cifar100, model resnet110, training-run index 1 (a sketch of the corresponding training step follows the command)
python3 ./train.py --type cifar100 --model resnet --depth 110 --tn 1
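A baseline run like this amounts to standard supervised training with cross-entropy. The sketch below shows one such training step, assuming hypothetical model/loader/optimizer objects; it is an illustration, not the repository's actual train.py code.

```python
# Sketch of a standard supervised training step, as performed for a baseline/teacher run.
# Names (model, loader, optimizer, device) are illustrative, not the repository's code.
import torch
import torch.nn.functional as F

def train_one_epoch(model, loader, optimizer, device="cpu"):
    model.train()
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        logits = model(images)
        loss = F.cross_entropy(logits, targets)  # plain cross-entropy, no distillation
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```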

Start Distillation

  • Hyperparameters for each distillation method are fixed to the values used in the corresponding original paper
  • Example: dataset cifar100, teacher resnet110 (index 1), student resnet20 (index 1), distillation-run index 1, method KD (a sketch of the standard KD loss follows the command)
python3 ./distill.py --type cifar100 --teacher resnet --student resnet --depth 110 --tn 1 --sdepth 20 --stn 1 --distype KD
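The --distype KD option corresponds to the classic soft-target distillation of Hinton et al. (2015). Below is a minimal sketch of that loss in PyTorch; the temperature and alpha values are assumptions for illustration only, since the repository fixes its hyperparameters to those of the original papers.

```python
# Minimal sketch of the classic soft-target KD loss (Hinton et al., 2015).
# temperature and alpha are illustrative; the repository uses the original paper's values.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.9):
    # Soft-target term: KL divergence between softened teacher and student distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-target term: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# Usage example with random tensors (batch of 8, 100 classes as in CIFAR-100).
student_logits = torch.randn(8, 100)
teacher_logits = torch.randn(8, 100)
targets = torch.randint(0, 100, (8,))
loss = kd_loss(student_logits, teacher_logits, targets)
```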
