
Details TBD

## Prepare requirements

```shell
pip install -r requirements.txt
```

## Run self distillation

```shell
bash run_distillation.sh --topology=(resnet18|resnet34|resnet50|resnet101) \
                         --config=conf.yaml \
                         --output_model=path/to/output_model \
                         --dataset_location=path/to/dataset \
                         --use_cpu=(0|1)
```
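For example, a run that distills ResNet50 on GPU might look like this (the output and dataset paths below are placeholders, not values from this repo):

```shell
bash run_distillation.sh \
    --topology=resnet50 \
    --config=conf.yaml \
    --output_model=./output_model \
    --dataset_location=./data \
    --use_cpu=0
```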

## CIFAR100 benchmark

https://github.com/weiaicunzai/pytorch-cifar100
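If the dataset is not already present, one way to fetch CIFAR100 is via torchvision, pointing `--dataset_location` at the download root. This is a sketch under the assumption that the script accepts the standard torchvision layout:

```python
# Hedged sketch (not part of this repo): download CIFAR100 with torchvision
# so that --dataset_location can point at data_root.
from torchvision.datasets import CIFAR100

data_root = "./data"  # hypothetical path to pass as --dataset_location
CIFAR100(root=data_root, train=True, download=True)
CIFAR100(root=data_root, train=False, download=True)
```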

## Papers

- Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
- Self-Distillation: Towards Efficient and Compact Neural Networks
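The core idea in both papers is that auxiliary classifiers attached after intermediate ResNet stages learn both from the ground-truth labels and from the deepest classifier, via softened logits and feature hints. Below is a minimal PyTorch sketch of such a loss; the function name and the hyperparameters `T`, `alpha`, `beta` are illustrative, not this repo's implementation:

```python
import torch.nn.functional as F

def self_distillation_loss(logits_list, feats_list, labels,
                           T=3.0, alpha=0.3, beta=0.03):
    """Self-distillation loss in the spirit of "Be Your Own Teacher".

    logits_list: classifier outputs ordered shallow -> deep; the last
    (deepest) classifier acts as the teacher for the shallower ones.
    feats_list: matching intermediate features, already projected to a
    common shape, used for the L2 hint term.
    """
    teacher_logits = logits_list[-1].detach()
    teacher_feat = feats_list[-1].detach()

    # Every classifier, teacher included, learns from the hard labels.
    loss = sum(F.cross_entropy(logits, labels) for logits in logits_list)

    for logits, feat in zip(logits_list[:-1], feats_list[:-1]):
        # Softened KL divergence: shallow classifiers mimic the deepest one.
        loss = loss + alpha * T * T * F.kl_div(
            F.log_softmax(logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        )
        # L2 hint: pull intermediate features toward the deepest features.
        loss = loss + beta * F.mse_loss(feat, teacher_feat)
    return loss
```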

## Our results on CIFAR100

Accuracy (%):

| Model    | Baseline | Classifier1 | Classifier2 | Classifier3 | Classifier4 | Ensemble |
|----------|----------|-------------|-------------|-------------|-------------|----------|
| ResNet50 | 80.88    | 82.06       | 83.64       | 83.85       | 83.41       | 85.10    |
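The Ensemble column presumably averages the predictions of the four classifiers, as in the papers above; a sketch, reusing `logits_list` from the loss sketch:

```python
import torch
import torch.nn.functional as F

# Assumed ensembling: average softmax probabilities across classifiers.
ensemble_probs = torch.stack(
    [F.softmax(logits, dim=1) for logits in logits_list]
).mean(dim=0)
pred = ensemble_probs.argmax(dim=1)
```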