MANet

Multi-Adapter RGBT Tracking implementation in PyTorch

This code is an updated, simplified version of the code submitted to the VOT-RGBT challenge, so there are some differences from the MANet paper.

Prerequisites

  • CPU: Intel(R) Core(TM) i7-7700K @ 3.75GHz
  • GPU: NVIDIA GTX 1080
  • OS: Ubuntu 16.04

  • python2.7
  • pytorch == 0.3.1
  • numpy
  • PIL
  • some additional libraries, which you will need to install yourself

Pretrained model for MANet

In our tracker, we use a VGG-M network variant as the backbone, trained end-to-end for visual tracking.

The model trained on GTOT is located in the models folder and is named MANet-2IC.pth; you can use it to track on RGBT234.

Then modify the paths in the tracking/options.py file according to where the files are placed (absolute paths are recommended). You can also switch between the CPU and GPU versions of the code in this file.
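As a hypothetical illustration (the actual option names in tracking/options.py may differ), the edits typically look like:

```python
# Hypothetical sketch of the kind of settings edited in tracking/options.py;
# the real option names in the repository may differ.
opts = {
    'model_path': '/absolute/path/to/models/MANet-2IC.pth',  # absolute path recommended
    'use_gpu': True,   # set to False to run the CPU version of the code
    'lr': 0.0001,      # learning rate, set per the comments in the file
}
```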

Train

You can use an RGBT dataset as training data. In the pretrain folder, first generate the sequence-list .pkl file using prepro_data.py, then change your data path, and finally execute train.py.
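As a rough sketch of what sequence-list generation involves (an assumption; the real prepro_data.py may do more, such as recording frame paths and ground-truth boxes), the script scans the dataset directory and pickles a list of sequences:

```python
import os
import pickle

def build_seq_list(data_root):
    # Collect one entry per sequence directory (assumed layout:
    # one subdirectory per video sequence under data_root).
    return sorted(
        d for d in os.listdir(data_root)
        if os.path.isdir(os.path.join(data_root, d))
    )

# Hypothetical usage: pickle the list so train.py can load it later.
# with open('data.pkl', 'wb') as f:
#     pickle.dump(build_seq_list('/path/to/RGBT/dataset'), f)
```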

Pretrained model: https://drive.google.com/open?id=1aO6LhOTxmpd7o_JXPLPjL3LsrQ5oqbl7

Run tracker

In the tracking/run_tracker.py file, change the dataset path and the directory where result files are saved. In the tracking/options.py file, set the model file path and set the learning rate as indicated by the comments. For both the tracking and training stages, update the modules/MANet3x1x1_IC.py file as indicated by the comments in it.
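The path changes in tracking/run_tracker.py might look like the following (a hypothetical sketch; the real variable names may differ):

```python
# Hypothetical sketch of the paths changed in tracking/run_tracker.py;
# the actual variable names in the repository may differ.
dataset_path = '/absolute/path/to/RGBT234'       # where the test sequences live
result_dir = '/absolute/path/to/results/MANet'   # where result files are written
```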

Tracking model: https://drive.google.com/open?id=1Png508G4kQPI6HNewKQ4cfS36CvoSFSN

Result

(See MANet-rgbt234.png for the tracking results on RGBT234.)
