[CVPR 2023] ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data

[Teaser figure]

This is a PyTorch implementation of the CVPR 2023 paper ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data. Code for VOT is available; code for VOS will be released here.

Haojie Zhao*^1, Junsong Chen*^1, Lijun Wang^1, Huchuan Lu^1,2 (* indicates equal contribution)
^1 Dalian University of Technology, China; ^2 Peng Cheng Laboratory, China
Contact at: jschen@mail.dlut.edu.cn, haojie_zhao@mail.dlut.edu.cn

News

  • Code for VOS is coming soon ...
  • [2023/06/08] Released training sets v1.
  • [2023/05/09] Released test sets v1.
  • [2023/04/20] Released code for VOT.

Dataset


1. Installation

# 1. Clone this repo
git clone https://github.com/lawrence-cj/ARKitTrack.git
cd ARKitTrack

# 2. Create conda env
conda env create -f art_env.yml
conda activate art

# 3. Install mmcv-full, mmdet, and mmdet3d for BEV pooling (borrowed from BEVFusion).
pip install openmim
mim install mmcv-full==1.4.0
mim install mmdet==2.20.0
python setup.py develop  # mmdet3d
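After installation, a quick sanity check can confirm that the pinned versions resolved correctly. The helper below is a generic sketch, not part of the repo:

```python
import importlib

def installed_versions(packages):
    """Return {package: __version__, or None if the package is not importable}."""
    versions = {}
    for pkg in packages:
        try:
            mod = importlib.import_module(pkg)
            versions[pkg] = getattr(mod, "__version__", "unknown")
        except ImportError:
            versions[pkg] = None
    return versions

# The repo pins mmcv-full==1.4.0 and mmdet==2.20.0; mmdet3d is installed via setup.py.
print(installed_versions(["mmcv", "mmdet", "mmdet3d"]))
```

Any `None` in the output means the corresponding step above needs to be re-run inside the `art` conda environment.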

2. Set project paths

Run the following command to set paths for this project.

python tracking/create_default_local_file.py --workspace_dir . --data_dir ./data --save_dir ./output

After running this command, you can also modify paths by editing these two files: lib/train/admin/local.py and lib/test/evaluation/local.py.
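For illustration only, the generated lib/test/evaluation/local.py holds paths roughly along these lines. The attribute names below are hypothetical; edit the file that create_default_local_file.py actually produces rather than copying this sketch:

```python
# Hypothetical sketch of a generated local.py; real attribute names may differ.
class EnvironmentSettings:
    def __init__(self):
        # Where tracking results are written (under --save_dir).
        self.results_path = "./output/test/tracking_results"
        # Dataset roots (assumed layout under --data_dir).
        self.depthtrack_path = "./data/depthtrack"
        self.cdtb_path = "./data/cdtb"
        self.arkittrack_path = "./data/arkittrack"

def local_env_settings():
    return EnvironmentSettings()
```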

3. Evaluation

Download our trained models from Google Drive and uncompress them to output/checkpoints/.

Change the corresponding dataset paths in lib/test/evaluation/local.py.

Run the following command to test on different datasets.

python tracking/test.py --tracker art --param vitb_384_mae_ce_32x4_ep300 --dataset depthtrack --threads 2 --num_gpus 2
  • --param vitb_384_mae_ce_32x4_ep300 is used for cdtb and depthtrack.
  • --param vitb_384_mae_ce_32x4_ep300_art is used for arkittrack.
  • --debug 1 enables visualization.
  • --dataset accepts one of: depthtrack, cdtb, arkit.
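Putting those flags together, an evaluation run on the ARKitTrack test set might look like this (thread and GPU counts are illustrative):

```shell
# Illustrative: evaluate on ARKitTrack with its matching config
python tracking/test.py --tracker art \
    --param vitb_384_mae_ce_32x4_ep300_art \
    --dataset arkit --threads 2 --num_gpus 2
```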

The raw results are stored in Google Drive.

4. Training

Download the pre-trained weights from Google Drive and uncompress them to pretrained_models/.

Change the corresponding dataset paths in lib/train/admin/local.py.

Run the following command to train for VOT.

python tracking/train.py --script art --config vitb_384_mae_ce_32x4_ep300 --save_dir ./output --mode multiple --nproc_per_node 2
  • --config vitb_384_mae_ce_32x4_ep300: train with depthtrack, test on cdtb and depthtrack.
  • --config vitb_384_mae_ce_32x4_ep300_art: train with arkittrack, test on arkittrack.
  • You can modify the config YAML files for your own datasets.
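For training on ARKitTrack, the same command applies with the _art config swapped in (GPU count illustrative):

```shell
# Illustrative: train with the ARKitTrack config instead of the DepthTrack one
python tracking/train.py --script art \
    --config vitb_384_mae_ce_32x4_ep300_art \
    --save_dir ./output --mode multiple --nproc_per_node 2
```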

Acknowledgments

Thanks to the OSTrack and BEVFusion projects, which helped us quickly implement our ideas.

Citation

@InProceedings{Zhao_2023_CVPR,
    author    = {Zhao, Haojie and Chen, Junsong and Wang, Lijun and Lu, Huchuan},
    title     = {ARKitTrack: A New Diverse Dataset for Tracking Using Mobile RGB-D Data},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {5126-5135}
}

License

This project is under the MIT license. See LICENSE for details.
