
DATr

PyTorch implementation of "Leveraging the Power of Data Augmentation for Transformer-based Tracking" (WACV2024).

Please find the paper here.

Introduction

In this paper, we perform systematic experiments to explore the impact of general data augmentations (GDAs) on Transformer trackers, including pure Transformer trackers and hybrid CNN-Transformer trackers. The results below show that GDAs have only limited effects on SOTA trackers.

We then propose two data augmentation methods, DATr for short, designed around the challenges faced by Transformer-based trackers. They improve trackers in terms of adaptability to different scales, flexibility for targets near the boundary, and robustness to interference.

Extensive experiments on different baseline trackers and benchmarks demonstrate the effectiveness and generalization ability of our DATr, especially on challenging sequences and sequences with unseen classes.
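For context, the sketch below shows where a data augmentation would sit in a PyTorch tracking pipeline, using a simple scale jitter on a search-region crop as an example. This is a minimal, hypothetical illustration, not the DATr augmentations themselves; the function name apply_scale_jitter and its parameters are assumptions made for this sketch only.

```python
# Hypothetical sketch: a generic scale-jitter augmentation on a search-region crop.
# This is NOT the DATr augmentation; it only illustrates where such a transform
# would plug into a Transformer-tracking training pipeline.
import random

import torch
import torch.nn.functional as F


def apply_scale_jitter(search_img, target_box, max_jitter=0.25):
    """Randomly rescale a search-region crop and its target box.

    search_img: float tensor of shape (C, H, W)
    target_box: (x, y, w, h) in pixels within the crop
    max_jitter: maximum relative change of the crop size (assumed value)
    """
    scale = 1.0 + random.uniform(-max_jitter, max_jitter)
    c, h, w = search_img.shape
    new_h, new_w = max(1, round(h * scale)), max(1, round(w * scale))

    # Resize the crop; F.interpolate expects a batch dimension.
    resized = F.interpolate(
        search_img.unsqueeze(0), size=(new_h, new_w),
        mode="bilinear", align_corners=False,
    ).squeeze(0)

    # Scale the box coordinates by the same factors as the image.
    x, y, bw, bh = target_box
    scaled_box = (x * new_w / w, y * new_h / h, bw * new_w / w, bh * new_h / h)
    return resized, scaled_box


if __name__ == "__main__":
    img = torch.rand(3, 256, 256)       # dummy search-region crop
    box = (100.0, 80.0, 50.0, 40.0)     # dummy target box (x, y, w, h)
    aug_img, aug_box = apply_scale_jitter(img, box)
    print(aug_img.shape, aug_box)
```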

Installation

The environment installation and training configurations (e.g., project paths and pretrained models) are similar to those of the baseline trackers, e.g., OSTrack; please refer to OSTrack for details.

Training and Testing

Please see eval.sh for the training and testing commands.

Models and Results

Models and results can be found here.

Acknowledgments

Our work is mainly built on three different Transformer trackers, i.e., OSTrack, MixFormer, and STARK. We thank the authors for these concise and effective SOT frameworks.
