
# Pixel-Wise Contrastive Distillation

This is a PyTorch implementation of the [PCD paper](https://arxiv.org/abs/2211.00218).

```bibtex
@inproceedings{huang2023pixel,
  title={Pixel-Wise Contrastive Distillation},
  author={Huang, Junqiang and Guo, Zichao},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={16359--16369},
  year={2023}
}
```
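
For intuition, below is a minimal, illustrative sketch of a pixel-wise contrastive (InfoNCE-style) distillation loss, in which each student pixel embedding is matched to the teacher embedding at the same spatial location and contrasted against all other teacher pixels. The function name, tensor shapes, and temperature are assumptions for illustration; they do not reproduce the exact loss, projection heads, or sampling used in this repository.

```python
# Illustrative sketch only -- not the repository's implementation.
import torch
import torch.nn.functional as F

def pixelwise_contrastive_loss(student_feat, teacher_feat, temperature=0.2):
    """student_feat, teacher_feat: (B, C, H, W) feature maps already projected
    to a common channel dimension C (projection heads omitted here)."""
    B, C, H, W = student_feat.shape
    # Flatten spatial positions: (B*H*W, C), one embedding per pixel
    s = F.normalize(student_feat.permute(0, 2, 3, 1).reshape(-1, C), dim=1)
    t = F.normalize(teacher_feat.permute(0, 2, 3, 1).reshape(-1, C), dim=1)
    # Similarity of every student pixel to every teacher pixel
    logits = s @ t.t() / temperature              # (N, N) with N = B*H*W
    # Positive: the teacher pixel at the same image/location; all others act as negatives
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)

# Toy usage with random tensors standing in for backbone outputs
s_feat = torch.randn(2, 128, 7, 7)                # student projection output
t_feat = torch.randn(2, 128, 7, 7)                # frozen teacher projection output
print(pixelwise_contrastive_loss(s_feat, t_feat))
```

In a distillation setup like this, the teacher is typically kept frozen and only the student side receives gradients.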

## Preparation

## Distillation

For single-node distributed training:

```bash
python main_pcd.py {dataset_dir} --rank 0 --world-size 1 -md \
  --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}
```

For multi-node distributed training:

```bash
# On the main node
python main_pcd.py {dataset_dir} --rank 0 --world-size {number_of_nodes} -md \
  --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}

# On each of the other nodes
python main_pcd.py {dataset_dir} --rank {index_of_current_node} --world-size {number_of_nodes} -md \
  --dist-url {ip_of_main_node} --student-arch resnet18 --teacher-arch mocov3_r50 \
  --teacher-ckpt {checkpoint_path_of_teacher_model} --output-dir {output_dir}
```
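
The `--rank`, `--world-size`, and `--dist-url` arguments follow the usual PyTorch multi-node convention: `--rank` is the index of the current node (0 for the main node), `--world-size` is the number of nodes, and `--dist-url` points to the main node. As a rough, hypothetical sketch of how such arguments are commonly turned into a `torch.distributed` process group (the actual wiring inside `main_pcd.py` may differ):

```python
# Hypothetical sketch of the usual rank/world-size/dist-url wiring; main_pcd.py may differ.
import torch
import torch.distributed as dist

def init_distributed(node_rank, num_nodes, local_gpu, dist_url="tcp://127.0.0.1:23456"):
    ngpus = torch.cuda.device_count()
    # Each node runs one process per GPU; the global rank combines node index and local GPU index
    global_rank = node_rank * ngpus + local_gpu
    dist.init_process_group(
        backend="nccl",            # common choice for multi-GPU training
        init_method=dist_url,      # address of the main node
        world_size=num_nodes * ngpus,
        rank=global_rank,
    )
    torch.cuda.set_device(local_gpu)
```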

## Models
