State-Of-The-Art ReID Baseline


A strong baseline (state-of-the-art) for person re-identification.

We support:

  • easy dataset preparation
  • end-to-end training and evaluation
  • multi-GPU distributed training
  • fast training with fp16 mixed precision
  • both image and video ReID
  • multi-dataset training
  • cross-dataset evaluation
  • highly modular design
  • state-of-the-art performance with a simple model
  • highly efficient backbones
  • advanced training techniques
  • various loss functions
  • visualization tools

Get Started

The project architecture follows the PyTorch-Project-Template guide, so you can look up each folder's purpose there.

  1. cd to the folder where you want to clone this repo

  2. Run git clone

  3. Install dependencies:

  4. Prepare dataset

    Create a directory to store ReID datasets under this repo:

    cd reid_baseline
    mkdir datasets
    1. Download a dataset into datasets/ from Baidu Pan or Google Drive
    2. Extract the dataset. The dataset structure should look like:
  5. Prepare a pretrained model if you don't have one

    from torchvision import models
    models.resnet50(pretrained=True)  # downloads ImageNet weights on first use

    This will automatically download the model to ~/.cache/torch/checkpoints/. Set this path in config/ to apply it to all training runs, or set it in each training config file in configs/.


We provide configuration files for most experiments. For example, you can run this command to train on Market-1501:

bash scripts/

Or you can override cfg parameters on the command line:

python3 tools/ -cfg='configs/softmax.yml' INPUT.SIZE_TRAIN '(256, 128)' INPUT.SIZE_TEST '(256, 128)'
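The KEY VALUE pairs after the config file follow the yacs-style override convention. The project itself relies on a config library for this, so the snippet below is only an illustrative pure-Python sketch of that merging logic, with a hypothetical merge_from_list helper:

```python
import ast

def merge_from_list(cfg, opts):
    """Apply yacs-style dotted KEY VALUE overrides to a nested dict config."""
    assert len(opts) % 2 == 0, "overrides must come in KEY VALUE pairs"
    for key, value in zip(opts[::2], opts[1::2]):
        node = cfg
        *parents, leaf = key.split(".")
        for p in parents:          # walk down to the parent node
            node = node[p]
        try:
            node[leaf] = ast.literal_eval(value)  # "(256, 128)" -> (256, 128)
        except (ValueError, SyntaxError):
            node[leaf] = value     # keep plain strings as-is
    return cfg

cfg = {"INPUT": {"SIZE_TRAIN": (384, 128), "SIZE_TEST": (384, 128)}}
merge_from_list(cfg, ["INPUT.SIZE_TRAIN", "(256, 128)",
                      "INPUT.SIZE_TEST", "(256, 128)"])
```

This mirrors what the INPUT.SIZE_TRAIN '(256, 128)' arguments in the command above do: each dotted key selects a nested option and the string value is parsed into a Python literal.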


You can test your model's performance directly by running this command

python3 tools/ --config_file='configs/softmax.yml' TEST.WEIGHT '/save/trained_model/path'


cfg                                                                market1501    dukemtmc
softmax_triplet, size=(256, 128), batch_size=64 (16 ids x 4 imgs)  93.9 (85.9)   training
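The numbers above appear to report Rank-1 accuracy with mAP in parentheses (an assumption based on common ReID reporting). As background, here is a simplified sketch of how those two metrics fall out of a query–gallery distance matrix; a full evaluation additionally filters out same-camera matches, which is omitted here:

```python
def rank1_and_map(dist, query_ids, gallery_ids):
    """dist[i][j]: distance from query i to gallery j (smaller = more similar)."""
    rank1_hits, aps = 0, []
    for i, qid in enumerate(query_ids):
        # gallery indices sorted from most to least similar
        order = sorted(range(len(gallery_ids)), key=lambda j: dist[i][j])
        matches = [gallery_ids[j] == qid for j in order]
        if matches[0]:                 # Rank-1: best match has the right identity
            rank1_hits += 1
        hits, precisions = 0, []       # average precision over all true matches
        for rank, m in enumerate(matches, start=1):
            if m:
                hits += 1
                precisions.append(hits / rank)
        aps.append(sum(precisions) / max(hits, 1))
    return rank1_hits / len(query_ids), sum(aps) / len(aps)
```

For instance, a query whose correct gallery image ranks second out of two scores 0.0 Rank-1 but 0.5 AP.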