Towards Modality-Agnostic Person Re-identification with Descriptive Query (CVPR 2023)


Highlights

  1. This paper makes the first attempt to investigate modality-agnostic person re-identification with a descriptive query.
  2. This paper introduces a novel unified person re-identification (UNIReID) architecture based on a dual encoder that jointly integrates cross-modal and multi-modal task learning (see the sketch after this list). With task-specific modality learning and task-aware dynamic training, UNIReID improves generalization across tasks and domains.
  3. This paper contributes three multi-modal ReID datasets to support unified ReID evaluation.
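
To make the dual-encoder idea concrete, here is a minimal, hypothetical sketch. The module names (`visual_encoder`, `text_encoder`), the averaging-based fusion, and the cosine-similarity retrieval are illustrative assumptions, not the repository's actual API: one visual encoder is shared by photos and sketches, a text encoder embeds descriptions, and whichever query modalities are available are fused before ranking gallery photos.

```python
# Minimal, hypothetical sketch of a dual-encoder, modality-agnostic ReID model.
# Module and method names are illustrative assumptions, not UNIReID's real API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualEncoderReID(nn.Module):
    def __init__(self, visual_encoder: nn.Module, text_encoder: nn.Module):
        super().__init__()
        # One visual encoder is shared by photos and sketches;
        # a separate encoder handles textual descriptions.
        self.visual_encoder = visual_encoder
        self.text_encoder = text_encoder

    def embed_query(self, sketch=None, text_tokens=None):
        """Embed whichever query modalities are available (sketch, text, or both)."""
        feats = []
        if sketch is not None:
            feats.append(F.normalize(self.visual_encoder(sketch), dim=-1))
        if text_tokens is not None:
            feats.append(F.normalize(self.text_encoder(text_tokens), dim=-1))
        assert feats, "at least one query modality is required"
        # Naive multi-modal fusion: average the normalized embeddings.
        return F.normalize(torch.stack(feats).mean(dim=0), dim=-1)

    def retrieve(self, query_emb, gallery_photos):
        """Rank gallery photos by cosine similarity to the query embedding."""
        gallery_emb = F.normalize(self.visual_encoder(gallery_photos), dim=-1)
        return query_emb @ gallery_emb.t()  # (num_queries, num_gallery)
```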

Dataset

Based on the existing text-based datasets (CUHK-PEDES, ICFG-PEDES, and RSTPReid), we collect sketches from the photo modality to build three multi-modal datasets (Tri-CUHK-PEDES, Tri-ICFG-PEDES, and Tri-RSTPReid). The collected sketches are available at Baidu Netdisk: https://pan.baidu.com/s/1c0h2utqisEx6OzGuoSaQhA (extraction code: ndau) or Google Drive: https://drive.google.com/file/d/12FIN-93Y4vXqVDVWLvLBwg3q0z0Vtwij/view?usp=sharing.
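
Each tri-modal dataset pairs a photo with a collected sketch and a textual description of the same identity. A hypothetical loader might look like the sketch below; the directory layout, annotation file name, and field names are assumptions for illustration, so consult the released data for the actual structure.

```python
# Hypothetical tri-modal sample loader; the annotation schema below is an
# illustrative assumption, not the released datasets' actual format.
import json
from pathlib import Path
from PIL import Image
from torch.utils.data import Dataset

class TriModalReIDDataset(Dataset):
    """Yields (photo, sketch, caption, person_id) samples."""
    def __init__(self, root: str, ann_file: str = "annotations.json", transform=None):
        self.root = Path(root)
        self.transform = transform
        # Assumed schema: a list of records, each with photo/sketch paths,
        # a caption, and an identity label.
        self.samples = json.loads((self.root / ann_file).read_text())

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        rec = self.samples[idx]
        photo = Image.open(self.root / rec["photo"]).convert("RGB")
        sketch = Image.open(self.root / rec["sketch"]).convert("RGB")
        if self.transform is not None:
            photo, sketch = self.transform(photo), self.transform(sketch)
        return photo, sketch, rec["caption"], rec["person_id"]
```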

Citation

@inproceedings{chen2023towards,
  title={Towards Modality-Agnostic Person Re-identification with Descriptive Query},
  author={Chen, Cuiqun and Ye, Mang and Jiang, Ding},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2023}
}

Contact

chencuiqun@whu.edu.cn; yemang@whu.edu.cn.
