Prototypical Contrastive Learning-based CLIP Fine-tuning for Object Re-identification

Paper: arXiv:2310.17218

In this work, we propose a simple yet effective approach to adapt CLIP for supervised Re-ID, which directly fine-tunes the image encoder of CLIP using a Prototypical Contrastive Learning loss. Experimental results demonstrate the simplicity and competitiveness of our method compared to the recent prompt-learning-based CLIP-ReID. Furthermore, our investigation indicates an essential consistency between CLIP-ReID and our method.
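
As a rough illustration of the idea, a minimal sketch of a prototypical contrastive loss is given below, built over in-batch identity centroids. This is not the exact loss implemented in this repository (which may construct and update prototypes differently, e.g. via a feature memory as in the cluster-contrast line of work); the function name, temperature value, and prototype construction are illustrative assumptions.

import torch
import torch.nn.functional as F

def prototypical_contrastive_loss(features, labels, temperature=0.05):
    # features: (B, D) image-encoder embeddings; labels: (B,) identity IDs.
    features = F.normalize(features, dim=1)
    pids = labels.unique()  # sorted unique identity IDs in the batch
    # One prototype per identity: the mean of its normalized features.
    prototypes = torch.stack([features[labels == p].mean(dim=0) for p in pids])
    prototypes = F.normalize(prototypes, dim=1)
    # Temperature-scaled cosine similarity of every sample to every prototype.
    logits = features @ prototypes.t() / temperature
    # Index of each sample's own prototype among the sorted unique IDs.
    targets = torch.searchsorted(pids, labels)
    return F.cross_entropy(logits, targets)

Each sample is pulled toward the prototype of its own identity and pushed away from the other prototypes, which is the contrastive signal used to fine-tune the CLIP image encoder.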

Upload History

  • 2023/11/23: Full model (ID loss + CC loss) is ready.

Pipeline

(Figure: overview of the training pipeline.)

Installation

Install conda first, then create the environment and install the requirements:

conda create -n pclclip python=3.9
conda activate pclclip
pip install -r requirements.txt

Datasets

Create a folder named data under the repository root. Download the datasets and unzip them into the data folder.
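
For example, for Market1501 the unzipped data would typically end up laid out as below (the subfolders are the standard Market-1501 archive contents; the top-level folder name the data loader expects is an assumption here, so adjust it if the dataset is not found):

data/
  market1501/
    bounding_box_train/
    bounding_box_test/
    query/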

Training

For example, to train the full model on Market1501 with GPU 0 and save the log file and checkpoints to logs/market-pclclip:

CUDA_VISIBLE_DEVICES=0 python train_pcl.py --config_file config/pcl-vit.yml DATASETS.NAMES "('market1501')" OUTPUT_DIR logs/market-pclclip

Configs can be modified in the config/*.yml files or overridden from the command line, as in the training command above. To add new config terms, update config/defaults.py (see the sketch below). For the other models that use different losses, please modify the code according to the paper.
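
As a sketch of what adding a config term might look like, assuming config/defaults.py follows the yacs CfgNode pattern used by TransReID and CLIP-ReID, which this code builds on (the option name and default below are purely hypothetical):

from yacs.config import CfgNode as CN

_C = CN()
_C.MODEL = CN()
# Hypothetical new option with its default value. The real defaults.py already
# defines _C and the existing groups, so only the new option line is added there.
_C.MODEL.MY_NEW_OPTION = 0.1

Once registered, it can be overridden from the command line in the same way as the other options, e.g. by appending MODEL.MY_NEW_OPTION 0.2 to the training command.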

Acknowledgement

The code is implemented based on the following works. We sincerely appreciate their great efforts in Re-ID research!

  1. TransReID
  2. CLIP-ReID
  3. ClusterContrast
  4. CAP
  5. O2CAP

Citation

@article{li2023prototypical,
  title={Prototypical Contrastive Learning-based CLIP Fine-tuning for Object Re-identification},
  author={Li, Jiachen and Gong, Xiaojin},
  journal={arXiv preprint arXiv:2310.17218},
  year={2023}
}
