This example trains ResNet models (ResNet-18/34/50/101) with self-distillation: auxiliary classifiers attached to intermediate layers are trained alongside the final (deepest) classifier, using the deepest classifier as the teacher.
Install the dependencies:

```shell
pip install -r requirements.txt
```
Run distillation with the script below, choosing one topology and whether to train on CPU:

```shell
bash run_distillation.sh \
  --topology=(resnet18|resnet34|resnet50|resnet101) \
  --config=conf.yaml \
  --output_model=path/to/output_model \
  --dataset_location=path/to/dataset \
  --use_cpu=(0|1)
```
References:
- https://github.com/weiaicunzai/pytorch-cifar100
- Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
- Self-Distillation: Towards Efficient and Compact Neural Networks
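The self-distillation objective from the papers above combines three terms per auxiliary classifier: cross-entropy against the labels, KL divergence against the deepest classifier's softened logits, and an L2 hint loss against the deepest feature representation. A minimal PyTorch sketch follows; the function name and the hyperparameters `alpha`, `beta`, and `temperature` are illustrative assumptions, not values from this repo's `conf.yaml`.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(logits_list, features_list, labels,
                           alpha=0.3, beta=0.03, temperature=3.0):
    """Sketch of a self-distillation loss (hyperparameters are assumptions).

    logits_list / features_list hold outputs of each classifier, ordered
    shallow -> deep; the last entry is the deepest (teacher) classifier.
    """
    # The teacher signals are detached so gradients only flow to students.
    teacher_logits = logits_list[-1].detach()
    teacher_feat = features_list[-1].detach()

    # The deepest classifier trains on the labels alone.
    loss = F.cross_entropy(logits_list[-1], labels)

    for logits, feat in zip(logits_list[:-1], features_list[:-1]):
        # Supervised cross-entropy on the shallow classifier.
        loss = loss + (1 - alpha) * F.cross_entropy(logits, labels)
        # Soft-label KL divergence from the deepest classifier.
        soft_student = F.log_softmax(logits / temperature, dim=1)
        soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
        loss = loss + alpha * temperature ** 2 * F.kl_div(
            soft_student, soft_teacher, reduction="batchmean")
        # L2 hint loss pulling shallow features toward the deepest features.
        loss = loss + beta * F.mse_loss(feat, teacher_feat)
    return loss
```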
Model | Baseline | Classifier 1 | Classifier 2 | Classifier 3 | Classifier 4 | Ensemble |
---|---|---|---|---|---|---|
ResNet50 | 80.88 | 82.06 | 83.64 | 83.85 | 83.41 | 85.10 |
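The "Ensemble" column reports a combined prediction from all classifiers. A common choice, sketched below as an assumption (the exact combination rule used here is not stated), is to average the softmax outputs of every classifier and take the argmax:

```python
import torch

def ensemble_predict(logits_list):
    """Assumed ensemble rule: average the softmax probabilities of all
    classifiers, then take the per-sample argmax as the prediction."""
    probs = torch.stack(
        [torch.softmax(logits, dim=1) for logits in logits_list]
    ).mean(dim=0)
    return probs.argmax(dim=1)
```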