
SiamMask on Your Own Dataset

Update: an easy way to train and test SiamMask on your own dataset (e.g. the SegTrack v2 dataset).


This is the official implementation with training code for SiamMask (CVPR2019). For technical details, please refer to:

Fast Online Object Tracking and Segmentation: A Unifying Approach
Qiang Wang*, Li Zhang*, Luca Bertinetto*, Weiming Hu, Philip H.S. Torr (* denotes equal contribution)
CVPR 2019
[Paper] [Video] [Project Page]

Bibtex

If you find this code useful, please consider citing:

@inproceedings{wang2019fast,
    title={Fast online object tracking and segmentation: A unifying approach},
    author={Wang, Qiang and Zhang, Li and Bertinetto, Luca and Hu, Weiming and Torr, Philip HS},
    booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
    year={2019}
}

Contents

  1. Environment Setup
  2. Demo
  3. Testing Models
  4. Training Models
  5. Training and Testing on Your Own Dataset

Environment setup

This code has been tested on Ubuntu 16.04 with Python 3.6, PyTorch 0.4.1, CUDA 9.2, and RTX 2080 GPUs.

  • Clone the repository
git clone https://github.com/foolwood/SiamMask.git && cd SiamMask
export SiamMask=$PWD
  • Setup python environment
conda create -n siammask python=3.6
source activate siammask
pip install -r requirements.txt

# update
# install the previous pytorch from https://pytorch.org/get-started/previous-versions/
# e.g. conda install pytorch=0.4.1 cuda92 -c pytorch

bash make.sh
  • Add the project to your PYTHONPATH
export PYTHONPATH=$PWD:$PYTHONPATH
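
  • Optionally, verify the installation with a quick Python check (a minimal sketch; run it inside the siammask environment)
import torch

print("PyTorch:", torch.__version__)            # expect 0.4.1
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))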

Demo

  • Setup your environment
  • Download the SiamMask model
cd $SiamMask/experiments/siammask_sharp
wget http://www.robots.ox.ac.uk/~qwang/SiamMask_VOT.pth
wget http://www.robots.ox.ac.uk/~qwang/SiamMask_DAVIS.pth
  • Run demo.py
cd $SiamMask/experiments/siammask_sharp
export PYTHONPATH=$PWD:$PYTHONPATH
python ../../tools/demo.py --resume SiamMask_DAVIS.pth --config config_davis.json

Testing

  • Setup your environment
  • Download test data
cd $SiamMask/data
sudo apt-get install jq
bash get_test_data.sh
  • Download pretrained models
cd $SiamMask/experiments/siammask_sharp
wget http://www.robots.ox.ac.uk/~qwang/SiamMask_VOT.pth
wget http://www.robots.ox.ac.uk/~qwang/SiamMask_VOT_LD.pth
wget http://www.robots.ox.ac.uk/~qwang/SiamMask_DAVIS.pth
  • Evaluate performance on VOT
bash test_mask_refine.sh config_vot.json SiamMask_VOT.pth VOT2016 0
bash test_mask_refine.sh config_vot.json SiamMask_VOT.pth VOT2018 0
bash test_mask_refine.sh config_vot.json SiamMask_VOT.pth VOT2019 0
bash test_mask_refine.sh config_vot18.json SiamMask_VOT_LD.pth VOT2016 0
bash test_mask_refine.sh config_vot18.json SiamMask_VOT_LD.pth VOT2018 0
python ../../tools/eval.py --dataset VOT2016 --tracker_prefix C --result_dir ./test/VOT2016
python ../../tools/eval.py --dataset VOT2018 --tracker_prefix C --result_dir ./test/VOT2018
python ../../tools/eval.py --dataset VOT2019 --tracker_prefix C --result_dir ./test/VOT2019
  • Evaluate performance on DAVIS (less than 50s)
bash test_mask_refine.sh config_davis.json SiamMask_DAVIS.pth DAVIS2016 0
bash test_mask_refine.sh config_davis.json SiamMask_DAVIS.pth DAVIS2017 0
bash test_mask_refine.sh config_davis.json SiamMask_DAVIS.pth ytb_vos 0

Results

These are the reproduction results from this repository. All results can be downloaded from our project page.

| Tracker | VOT2016 (EAO / A / R) | VOT2018 (EAO / A / R) | DAVIS2016 (J / F) | DAVIS2017 (J / F) | YouTube-VOS (J_s / J_u / F_s / F_u) | Speed |
|---|---|---|---|---|---|---|
| SiamMask-box | 0.412 / 0.623 / 0.233 | 0.363 / 0.584 / 0.300 | - / - | - / - | - / - / - / - | 77 FPS |
| SiamMask | 0.433 / 0.639 / 0.214 | 0.380 / 0.609 / 0.276 | 0.713 / 0.674 | 0.543 / 0.585 | 0.602 / 0.451 / 0.582 / 0.477 | 56 FPS |
| SiamMask-LD | 0.455 / 0.634 / 0.219 | 0.423 / 0.615 / 0.248 | - / - | - / - | - / - / - / - | 56 FPS |

Note:

  • Speeds are measured on an NVIDIA RTX 2080.
  • -box reports an axis-aligned bounding box from the box branch.
  • -LD denotes training with a larger dataset (ytb-bb+ytb-vos+vid+coco+det).

Training

Training Data

Download the pre-trained model (174 MB)

(This model was trained on the ImageNet-1k Dataset)

cd $SiamMask/experiments
wget http://www.robots.ox.ac.uk/~qwang/resnet.model
ls | grep siam | xargs -I {} cp resnet.model {}

Training SiamMask base model

  • Setup your environment
  • From the experiment directory, run
cd $SiamMask/experiments/siammask_base/
bash run.sh
  • Training takes about 10 hours on our 4 Tesla V100 GPUs.
  • If you experience out-of-memory errors, you can reduce the batch size in run.sh.
  • You can view training progress on TensorBoard (logs are at <experiment_dir>/logs/).
  • After training, you can test the checkpoints on the VOT datasets.
bash test_all.sh -s 1 -e 20 -d VOT2018 -g 4  # test all snapshots with 4 GPUs
  • Select the best model for hyperparameter search.
#bash test_all.sh -m [best_test_model] -d VOT2018 -n [thread_num] -g [gpu_num]
bash test_all.sh -m snapshot/checkpoint_e12.pth -d VOT2018 -n 8 -g 4 # 8 threads with 4 GPUs

Training SiamMask model with the Refine module

  • Setup your environment
  • From the experiment directory, train with the best SiamMask base model
cd $SiamMask/experiments/siammask_sharp
bash run.sh <best_base_model>
bash run.sh checkpoint_e12.pth
  • You can view training progress on TensorBoard (logs are at <experiment_dir>/logs/).
  • After training, you can test the checkpoints on the VOT datasets.
bash test_all.sh -s 1 -e 20 -d VOT2018 -g 4

Training SiamRPN++ model (unofficial)

  • Setup your environment
  • From the experiment directory, run
cd $SiamMask/experiments/siamrpn_resnet
bash run.sh
  • You can view training progress on TensorBoard (logs are at <experiment_dir>/logs/).
  • After training, you can test the checkpoints on the VOT datasets.
bash test_all.sh -h
bash test_all.sh -s 1 -e 20 -d VOT2018 -g 4

Training and Testing on Your Own Dataset

Here we take the SegTrack v2 dataset as an example.

Download the dataset and organize it as follows

  1. Create the annotation files. (Masks must be stored in Annotations as PIL mode 'P' images, which map pixel values through a palette; see the sketch after the directory layout below.)
  2. Label meta.json manually.
  3. Crop the images and generate the data info files.

The detailed steps for the SegTrack v2 dataset are described in /data/SegTrackv2/readme.md.

|_Annotations
|  |_bird_of_paradise
|  |_...
|  |_worm
|_Code
|_crop511
|  |_bird_of_paradise
|  |_...
|  |_worm
|_GroundTruth
|_ImageSets
|_JPEGImages
|  |_bird_of_paradise
|  |_...
|  |_worm
|_meta.json
|_instances_train.json
|_instances_val.json
|_train.json
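
For steps 1 and 2, the sketch below shows one way to produce a palette-mode mask and a meta.json entry in Python. All paths are illustrative, and the meta.json layout is an assumption modeled on the YouTube-VOS style; follow /data/SegTrackv2/readme.md for the exact schema this repo expects.

# prepare_annotations.py -- illustrative only; paths and the meta.json schema are assumptions
import json
import numpy as np
from PIL import Image

# Step 1: save a label mask as a PIL mode-'P' (palette) image.
label = np.array(Image.open("raw_mask.png").convert("L"))  # hypothetical input mask
label = (label > 0).astype(np.uint8)             # a single object gets id 1

mask = Image.fromarray(label, mode="P")          # pixel values index the palette
palette = [0, 0, 0, 255, 0, 0] + [0] * 254 * 3   # index 0 = background, 1 = object
mask.putpalette(palette)
mask.save("Annotations/parachute/00000.png")     # hypothetical output path

# Step 2: write a meta.json entry (schema is an assumption).
meta = {"videos": {"parachute": {"objects": {"1": {"frames": ["00000"]}}}}}
with open("meta.json", "w") as f:
    json.dump(meta, f, indent=2)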

Rewrite the API for your own dataset

In ./utils/benchmark_helper.py, add a loader for your own dataset, as sketched below.
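
This minimal sketch assumes the helper maps a dataset name to per-video lists of frames and masks; the function name and returned fields are illustrative, not the repo's actual API.

# Hypothetical addition to utils/benchmark_helper.py (names are illustrative)
import os
from glob import glob

def load_segtrackv2(base_path="data/SegTrackv2"):
    """Return {video: {"image_files": [...], "anno_files": [...]}}."""
    dataset = {}
    for video in sorted(os.listdir(os.path.join(base_path, "JPEGImages"))):
        dataset[video] = {
            "image_files": sorted(glob(os.path.join(base_path, "JPEGImages", video, "*.jpg"))),
            "anno_files": sorted(glob(os.path.join(base_path, "Annotations", video, "*.png"))),
        }
    return dataset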

Demo on your own dataset

Change the --base_path argument to point to your own dataset.

cd SiamMask
export SiamMask=$PWD
mkdir demo

python ./tools/demo.py --resume ./experiments/siammask_sharp/SiamMask_SegTrack.pth \
--config ./experiments/siammask_sharp/config_davis.json \
--base_path ./data/SegTrackv2/JPEGImages/parachute

python ./tools/demo.py --resume ./experiments/siammask_sharp/SiamMask_SegTrack.pth \
--config ./experiments/siammask_sharp/config_davis.json \
--base_path ./data/tennis

(demo GIF)

Testing & Training

For testing, and for training the refinement model, change the third argument to your own dataset name. Also add your dataset's configuration to ./experiments/siammask_sharp/config.json (an illustrative sketch follows the commands below).

cd ./experiments/siammask_sharp

# testing
bash test_mask_refine.sh config_davis.json SiamMask_DAVIS.pth SegTrackv2 0

# training the refinement module
bash tune.sh SiamMask_SegTrack.pth SegTrackv2 0
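
The schema of config.json is not documented here, so this sketch only illustrates the idea of registering a new dataset entry programmatically; the key names are assumptions patterned on the existing DAVIS entry, and you should compare against the real file first.

# add_config_entry.py -- hypothetical; key names are assumptions
import json

with open("config.json") as f:
    cfg = json.load(f)

# Register a SegTrackv2 entry alongside the existing dataset entries.
cfg["SegTrackv2"] = {"base_path": "../../data/SegTrackv2"}  # hypothetical keys

with open("config.json", "w") as f:
    json.dump(cfg, f, indent=4)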

Results

[2020-04-07 12:27:23,375-rk0-test.py#609] Segmentation Threshold 0.30 mIoU: 0.655
[2020-04-07 12:27:23,375-rk0-test.py#609] Segmentation Threshold 0.35 mIoU: 0.649
[2020-04-07 12:27:23,375-rk0-test.py#609] Segmentation Threshold 0.40 mIoU: 0.640
[2020-04-07 12:27:23,375-rk0-test.py#609] Segmentation Threshold 0.45 mIoU: 0.629
[2020-04-07 12:27:23,375-rk0-test.py#613] Mean Speed: 36.92 FPS

License

Licensed under the MIT License.
