# Pixel-Wise Contrastive Distillation

This is a PyTorch implementation of the PCD paper.

If you find this work useful, please consider citing:

```bibtex
@inproceedings{huang2023pixel,
  title={Pixel-Wise Contrastive Distillation},
  author={Huang, Junqiang and Guo, Zichao},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={16359--16369},
  year={2023}
}
```

## Preparation

## Distillation

For single-node distributed training:

```sh
python main_pcd.py {dataset_dir} --rank 0 --world-size 1 -md \
  --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}
```
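The `-md` flag appears to enable a MoCo-style multiprocessing-distributed mode, in which the launcher spawns one worker process per GPU on the node. Below is a minimal sketch of that pattern under this assumption; `main_worker` and its arguments are illustrative names, not the repository's actual code:

```python
# Sketch of a MoCo-style single-node launcher: one process per GPU.
# Assumes `-md` means multiprocessing-distributed; names are illustrative.
import torch
import torch.multiprocessing as mp

def main_worker(gpu: int, ngpus_per_node: int) -> None:
    # Each spawned process drives exactly one GPU; its global rank would be
    # node_rank * ngpus_per_node + gpu (node_rank is 0 on a single node).
    torch.cuda.set_device(gpu)
    # ... build student/teacher models, init the process group, train ...

if __name__ == "__main__":
    ngpus = torch.cuda.device_count()
    mp.spawn(main_worker, nprocs=ngpus, args=(ngpus,))
```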

For multi-node distributed training:

```sh
# On the main node
python main_pcd.py {dataset_dir} --rank 0 --world-size {number_of_nodes} -md \
  --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}

# On each other node
python main_pcd.py {dataset_dir} --rank {index_of_current_node} --world-size {number_of_nodes} -md \
  --dist-url {ip_of_main_node} --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}
```
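Under the hood, these flags typically map onto `torch.distributed` initialization roughly as sketched below. This is a hedged illustration of the standard PyTorch pattern, not the repository's exact code; all names and the default address are illustrative:

```python
# Sketch: deriving per-process rank and world size from the launcher flags
# (--rank, --world-size, --dist-url), following the usual MoCo-style setup.
import torch.distributed as dist

def init_distributed(node_rank: int, num_nodes: int,
                     ngpus_per_node: int, gpu: int,
                     dist_url: str = "tcp://127.0.0.1:23456") -> None:
    # --rank is the node index, --world-size the node count, and
    # --dist-url the address of the main node.
    global_rank = node_rank * ngpus_per_node + gpu
    global_world_size = num_nodes * ngpus_per_node
    dist.init_process_group(
        backend="nccl",        # standard backend for multi-GPU training
        init_method=dist_url,  # e.g. "tcp://{ip_of_main_node}:{port}"
        rank=global_rank,
        world_size=global_world_size,
    )
```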

## Models