PyTorch implementation of "Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification" (BAKE).
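BAKE refines the usual one-hot targets by ensembling the knowledge of the samples within the same mini-batch: predictions are propagated along a feature-affinity graph, and the aggregated distribution serves as the soft target for self-distillation, with no extra networks or memory banks. The sketch below illustrates the idea only; the function names and the hyperparameter values (`temperature`, `omega`) are assumptions, and the repo's exact implementation lives in the pycls code and configs.

```python
import torch
import torch.nn.functional as F

def bake_soft_targets(features, logits, temperature=4.0, omega=0.5):
    """Sketch: build soft targets by propagating the batch's own predictions
    along a feature-affinity graph (batch knowledge ensembling).
    Assumes batch size > 1; `temperature` and `omega` are illustrative."""
    # Softened predictions, treated as constants for the target.
    probs = F.softmax(logits.detach() / temperature, dim=1)   # (N, C)
    # Row-stochastic affinity matrix from pairwise feature similarities,
    # with self-similarities masked out.
    feats = F.normalize(features.detach(), dim=1)             # (N, D)
    sims = feats @ feats.t()                                  # (N, N)
    sims.fill_diagonal_(float('-inf'))
    affinity = F.softmax(sims, dim=1)
    # Closed-form propagation: Q = (1 - omega) * (I - omega * A)^{-1} @ P,
    # i.e. the fixed point of Q <- omega * A @ Q + (1 - omega) * P.
    eye = torch.eye(affinity.size(0), device=affinity.device)
    return (1.0 - omega) * torch.linalg.solve(eye - omega * affinity, probs)

def bake_distill_loss(logits, soft_targets, temperature=4.0):
    # KL divergence between the ensembled targets and the model's own
    # softened predictions, scaled by T^2 as is conventional in distillation.
    log_probs = F.log_softmax(logits / temperature, dim=1)
    return F.kl_div(log_probs, soft_targets, reduction='batchmean') * temperature ** 2
```

During training, this distillation term would be added to the standard cross-entropy loss on the hard labels.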
- Install Python dependencies:
```shell
pip install -r requirements.txt
```
- Set up Python modules:
```shell
python setup.py develop --user
```
- Download ImageNet and arrange it in the expected structure (a sanity-check sketch follows this list):
```
imagenet
|_ train
|  |_ n01440764
|  |_ ...
|  |_ n15075141
|_ val
|  |_ n01440764
|  |_ ...
|  |_ n15075141
|_ ...
```
- Create a directory to contain the symlinks:
```shell
mkdir -p pycls/datasets/data
```
- Symlink ImageNet:
```shell
ln -s /path/imagenet pycls/datasets/data/imagenet
```
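Once the symlink is in place, a quick Python check can confirm that the dataset exposes the 1000 ImageNet-1k synset folders under both splits. This is only a convenience sketch; it is not part of the repo:

```python
from pathlib import Path

# Verify the symlinked ImageNet layout: ImageNet-1k should expose 1000
# synset folders (n01440764 ... n15075141) under both train/ and val/.
root = Path('pycls/datasets/data/imagenet')
for split in ('train', 'val'):
    classes = sorted(p.name for p in (root / split).iterdir() if p.is_dir())
    print(f'{split}: {len(classes)} classes, first={classes[0]}, last={classes[-1]}')
    assert len(classes) == 1000, f'unexpected number of classes in {split}'
```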
- Train a model:
```shell
sh tools/dist_train.sh <WORK_DIR> <CONFIG>
```
For example,
```shell
sh tools/dist_train.sh logs/resnet50_bake configs/resnet/BAKE-R-50-1x64d_dds_8gpu.yaml
```
Note: you can use tools/slurm_train.sh for distributed training across multiple machines.
- Test a trained model:
```shell
sh tools/dist_test.sh <CONFIG> <CKPT>
```
For example,
```shell
sh tools/dist_test.sh configs/resnet/BAKE-R-50-1x64d_dds_8gpu.yaml logs/resnet50_bake/checkpoints/model_epoch_0100.pyth
```
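If you just want to check a checkpoint's top-1 accuracy without the distributed script, a minimal single-GPU evaluation loop looks like the sketch below. The preprocessing values are the standard ImageNet recipe (the repo's configs may differ), and torchvision's `resnet50` is only a stand-in for a model built from the pycls config and loaded from the checkpoint:

```python
import torch
from torch.utils.data import DataLoader
import torchvision.datasets as datasets
import torchvision.models as models
import torchvision.transforms as T

# Standard ImageNet validation preprocessing; the repo's configs may differ.
transform = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
val_set = datasets.ImageFolder('pycls/datasets/data/imagenet/val', transform)
loader = DataLoader(val_set, batch_size=128, num_workers=8)

# Stand-in model: in practice, build the network from the config with pycls
# and load the trained checkpoint instead.
model = models.resnet50().cuda().eval()

correct = total = 0
with torch.no_grad():
    for images, labels in loader:
        preds = model(images.cuda()).argmax(dim=1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f'top-1 accuracy: {100.0 * correct / total:.1f}%')
```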
Results on ImageNet:

architecture | ImageNet top-1 acc. (%) | config | download
--- | --- | --- | ---
ResNet-50 | 78.0 | config | model
ResNet-101 | 79.3 | config |
ResNet-152 | 79.6 | config |
ResNeSt-50 | 79.4 | config |
ResNeSt-101 | 80.4 | config |
ResNeXt-101 (32x4d) | 79.3 | config |
ResNeXt-152 (32x4d) | 79.7 | config |
MobileNet-V2 | 72.0 | config |
EfficientNet-B0 | 76.2 | config |
This codebase is modified from pycls.