
Tracking without bells and whistles

This repository provides the implementation of our paper Tracking without bells and whistles (Philipp Bergmann, Tim Meinhardt, Laura Leal-Taixé) [https://arxiv.org/abs/1903.05625]. All results presented in our work were produced with this code.

In addition to our supplementary document, we provide an illustrative web video collection. The collection includes exemplary Tracktor++ tracking results and multiple video examples that accompany our analysis of state-of-the-art tracking methods.

Visualization of Tracktor


Installation

  1. Clone and enter this repository:
git clone --recurse-submodules
cd tracking_wo_bnw
  2. Install packages for Python 3.6 in a virtualenv:

    1. pip3 install -r requirements.txt
    2. Faster R-CNN + FPN: pip3 install -e src/fpn
    3. Faster R-CNN: pip3 install -e src/frcnn
    4. Tracktor: pip3 install -e .
    5. Install PyTorch 0.3.1 for CUDA 9.0 by following the official PyTorch installation instructions.
  3. Compile Faster R-CNN + FPN and Faster R-CNN:

    1. Make sure the nvcc compiler with CUDA 9.0 is working and all CUDA paths are set (in particular export CPATH=/usr/local/cuda-9.0/include).
    2. Compile the FPN CUDA layers with the compile script in src/fpn/fpn/.
    3. Compile the Faster R-CNN CUDA layers with the compile script in src/frcnn/frcnn/.
    4. If compilation was not successful, check the issues of the official Faster R-CNN repository for help.
  4. MOTChallenge data:

    1. Download MOT17Det, MOT16Labels, 2DMOT2015, MOT16-det-dpm-raw and MOT17Labels and place them in the data folder. As the images are the same for MOT17Det, MOT17 and MOT16, we only need one set of images for all three benchmarks.
    2. Unzip all the data by executing:
    unzip MOT17Det.zip -d MOT17Det
    unzip MOT16Labels.zip -d MOT16Labels
    unzip 2DMOT2015.zip -d 2DMOT2015
    unzip MOT16-det-dpm-raw.zip -d MOT16-det-dpm-raw
    unzip MOT17Labels.zip -d MOT17Labels
  5. Download the object detector and re-identification Siamese network weights as well as our ICCV 2019 MOTChallenge result files:

    1. Download the zip file from here.
    2. Extract it in the output directory.
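After downloading and unzipping, the data folder should contain one directory per dataset. A minimal sketch (a hypothetical helper, with directory names taken from the steps above) that checks the layout before launching experiments:

```python
from pathlib import Path

# Dataset directories expected under the data folder (see step 4 above).
EXPECTED = ["MOT17Det", "MOT16Labels", "2DMOT2015", "MOT16-det-dpm-raw", "MOT17Labels"]

def missing_datasets(data_root):
    """Return the expected dataset directories that are not present under data_root."""
    root = Path(data_root)
    return [name for name in EXPECTED if not (root / name).is_dir()]
```

Running `missing_datasets("data")` should return an empty list once all archives are extracted.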

Evaluate Tracktor++

In order to configure, organize, log and reproduce our computational experiments, we structured our code with the Sacred framework. For a detailed explanation of the Sacred interface, please read its documentation.
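Sacred applies command-line config updates as dotted key=value pairs on top of the yaml config. A rough stdlib-only sketch of how such dotted updates map onto a nested config dict (the key names below are hypothetical, not the actual tracktor.yaml schema, and this is an illustration of the mechanism, not Sacred's own code):

```python
def apply_override(config, dotted_key, value):
    """Set config['a']['b']['c'] = value for dotted_key 'a.b.c', creating
    intermediate dicts as needed (Sacred-style dotted config update)."""
    keys = dotted_key.split(".")
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

# Hypothetical config entries, for illustration only.
cfg = {"tracktor": {"tracker": {"detection_person_thresh": 0.5}}}
apply_override(cfg, "tracktor.tracker.detection_person_thresh", 0.3)
```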

  1. Our Tracktor can be configured by changing the corresponding experiments/cfgs/tracktor.yaml config file. The default configuration runs Tracktor++ with the FPN object detector as described in the paper.

  2. Run Tracktor by executing:

python experiments/scripts/
  3. The results are logged in the corresponding output directory. To evaluate the results, download and run the official MOTChallenge devkit.
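The devkit reports the CLEAR MOT metrics; the headline MOTA score combines false negatives, false positives and identity switches. As a quick reference, the standard CLEAR MOT definition (not devkit code):

```python
def mota(false_negatives, false_positives, id_switches, num_gt_boxes):
    """MOTA = 1 - (FN + FP + IDSW) / GT, per the CLEAR MOT metrics."""
    return 1.0 - (false_negatives + false_positives + id_switches) / num_gt_boxes

# e.g. 1000 ground-truth boxes, 100 misses, 50 false positives, 10 ID switches:
# MOTA = 1 - 160/1000 = 0.84
```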

Train and test object detector (Faster-RCNN + FPN)

We pretrained the object detector on PASCAL VOC and performed an extensive hyperparameter cross-validation. The resulting training command is:

python voc_init_iccv19 --dataset mot_2017_train --net res101 --bs 2 --nw 4 --epochs 38 --save_dir weights --cuda --use_tfboard True --lr_decay_step 20 --pre_checkpoint weights/res101/pascal_voc_0712/v2/fpn_1_12.pth --pre_file weights/res101/pascal_voc_0712/v2/config.yaml

Test the provided object detector by executing:

python experiments/scripts/ voc_init_iccv19 --cuda --net res101 --dataset mot_2017_train --imdbval_name mot_2017_train --checkepoch 27
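The --lr_decay_step 20 flag in the training command above implies a step learning-rate schedule over the 38 epochs. A sketch of such a schedule (the decay factor of 0.1 is an assumption; the repository's default may differ):

```python
def step_lr(base_lr, epoch, decay_step=20, gamma=0.1):
    """Step decay: multiply the base learning rate by gamma every decay_step epochs."""
    return base_lr * gamma ** (epoch // decay_step)

# With base_lr=1e-3 and decay_step=20: epochs 0-19 use 1e-3, epochs 20-38 use 1e-4.
```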

Training the re-identification Siamese network

  1. The training config file is located at experiments/cfgs/siamese.yaml.

  2. Start training by executing:

python experiments/scripts/
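The Siamese network is trained with a metric-learning objective so that embeddings of the same identity lie closer together than embeddings of different identities. A minimal stdlib sketch of a triplet margin loss (the actual loss and margin used for training are configured in siamese.yaml; the values here are illustrative):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet margin loss: max(0, d(anchor, positive) - d(anchor, negative) + margin).

    anchor/positive share an identity; negative is a different identity."""
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)
```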


If you use this software in your research, please cite our publication:

    @article{DBLP:journals/corr/abs-1903-05625,
      author    = {Philipp Bergmann and
                   Tim Meinhardt and
                   Laura Leal{-}Taix{\'{e}}},
      title     = {Tracking without bells and whistles},
      journal   = {CoRR},
      volume    = {abs/1903.05625},
      year      = {2019},
      url       = {http://arxiv.org/abs/1903.05625},
      archivePrefix = {arXiv},
      eprint    = {1903.05625},
      timestamp = {Sun, 31 Mar 2019 19:01:24 +0200},
      bibsource = {dblp computer science bibliography, https://dblp.org}
    }