SiamSlim implements a real-time single-object tracking algorithm that can be deployed on board UAVs, as presented in our paper (https://link.springer.com/article/10.1007/s11554-021-01190-z). The code is based on PySOT and SiamBAN.
We deployed the Siamese tracker on UAVs and conducted flight tests on the outskirts of Beijing, China. Some images recorded during the flight tests are shown in these figures. Across various scenes and illumination conditions, our Siamese tracker tracks objects smoothly. The pan-tilt unit rotates under the direction of the tracker to keep the target near the image center.
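As an illustration of the pan-tilt behavior described above, the following is a minimal sketch of how gimbal commands could be derived from the tracker's bounding box so that the target is driven toward the image center. The function names, gain, and deadband values are hypothetical and are not taken from this repository's control code.

```python
# Hypothetical sketch: turn a tracker bounding box into pan-tilt rate
# commands that center the target. Not the repo's actual control code.

def pan_tilt_error(bbox, frame_w, frame_h):
    """Return normalized (pan, tilt) errors in [-1, 1], measuring how far
    the bounding-box center is from the image center."""
    x, y, w, h = bbox                              # top-left corner plus size
    cx, cy = x + w / 2.0, y + h / 2.0              # bounding-box center
    pan = (cx - frame_w / 2.0) / (frame_w / 2.0)   # + means target right of center
    tilt = (cy - frame_h / 2.0) / (frame_h / 2.0)  # + means target below center
    return pan, tilt

def pan_tilt_command(bbox, frame_w, frame_h, gain=30.0, deadband=0.05):
    """Simple proportional controller returning gimbal rotation rates
    (degrees per second). A small deadband keeps the gimbal still once
    the target is already near the image center."""
    pan, tilt = pan_tilt_error(bbox, frame_w, frame_h)
    pan_rate = gain * pan if abs(pan) > deadband else 0.0
    tilt_rate = gain * tilt if abs(tilt) > deadband else 0.0
    return pan_rate, tilt_rate
```

For example, a box centered in a 640x480 frame produces zero rates, while a box in the top-right corner produces a positive pan rate and a negative tilt rate.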
Example 1 shows a demo of the Siamese tracker applied on UAVs. Based on the proposed Siamese tracker, we built an accompanying-flight program for two UAVs: one UAV flies at random and the other follows it under the direction of the Siamese tracker. The left figure shows the first-person view from the following UAV, and the right figure shows the third-person view captured from the ground.
In the left figure, a person is tracked as he runs in random directions; the aspect ratio of the target changes accordingly, and the tracker handles the challenge. In the middle figure, a high-speed car (about 60 km/h) is tracked; the scale of the target changes accordingly, and the tracker handles the challenge. In the right figure, a car is tracked under low illumination. The tracker handles partial occlusion and notifies the user that the target is lost once it is fully occluded.
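The "target lost" behavior under full occlusion can be sketched as a simple score monitor: tolerate brief score dips (partial occlusion) but declare the target lost after the confidence stays low for several consecutive frames. The class name, threshold, and patience values below are assumptions for illustration, not the repository's actual implementation.

```python
# Hypothetical sketch of lost-target detection from per-frame tracker
# confidence scores; threshold and patience values are assumptions.

class LostTargetMonitor:
    """Declare the target lost after the tracker's confidence score stays
    below a threshold for several consecutive frames (full occlusion),
    while tolerating short dips caused by partial occlusion."""

    def __init__(self, score_threshold=0.3, patience=5):
        self.score_threshold = score_threshold
        self.patience = patience       # consecutive low-score frames allowed
        self.low_score_frames = 0

    def update(self, score):
        """Feed one per-frame tracker score; return True once the target
        should be reported as lost."""
        if score < self.score_threshold:
            self.low_score_frames += 1
        else:
            self.low_score_frames = 0  # target reacquired, reset the counter
        return self.low_score_frames >= self.patience
```

With `patience=3`, the score sequence 0.1, 0.8, 0.1 never reports a loss (the high score resets the counter), whereas 0.1, 0.1, 0.1 does.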
The raw results are in "./result/".
If you want to train or test on a PC, please use the code in "./on_PC/". The code is based on PySOT and SiamBAN.
cd on_PC
python cam.py
For convenient deployment on the NVIDIA Jetson TX2, we compile the tracker into a ".so" shared library on the TX2. If you want to test on the TX2, please use the code in "./on_TX2/".
cd on_TX2
python cam.py