advPattern: Physical-World Attacks on Deep Person Re-Identification via Adversarially Transformable Patterns


Person re-identification (re-ID) is the task of matching person images across camera views, and it plays an important role in surveillance and security applications. Encouraged by the great progress of deep learning, deep re-ID models have become popular and achieve state-of-the-art performance. However, recent work has shown that deep neural networks (DNNs) are vulnerable to adversarial examples, posing potential threats to DNN-based applications. This raises a serious question: are deep re-ID systems also vulnerable to adversarial attacks? In this paper, we make the first attempt to implement robust physical-world attacks against deep re-ID. We propose a novel attack algorithm, called advPattern, for generating adversarial patterns on clothes. It learns the variations of image pairs across cameras to pull the image features from the same camera closer, while pushing features from different cameras farther apart. By wearing our crafted "invisible cloak", an adversary can evade person search, or impersonate a target person to fool deep re-ID models in the physical world. We evaluate the effectiveness of our transformable patterns on adversaries' clothes with Market1501 and our own PRCS dataset. The experimental results show that the rank-1 accuracy of re-ID models for matching the adversary drops from 87.9% to 27.1% under the Evading Attack. Furthermore, the adversary can impersonate a target person with 47.1% rank-1 accuracy and 67.9% mAP under the Impersonation Attack. These results demonstrate that deep re-ID systems are vulnerable to our physical attacks.

Paper: advPattern: Physical-World Attacks on Deep Person Re-Identification via Adversarially Transformable Patterns
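To make the pull/push idea from the abstract concrete, here is a minimal, hypothetical PyTorch sketch of the Evading Attack objective. It is an illustration only, not the released implementation: the names `reid_model`, `pull_push_loss`, `attack_step`, `pattern`, and `mask` are assumptions introduced here, and the signed-gradient update is likewise an assumed simplification. It assumes a frozen re-ID feature extractor, a learnable pattern image, and a binary mask locating the clothing region, with several adversary images per camera.

```python
# Hypothetical sketch of the pull/push objective described in the abstract
# (illustration only -- not the authors' released code).
import torch
import torch.nn.functional as F


def pull_push_loss(features: torch.Tensor, camera_ids: torch.Tensor) -> torch.Tensor:
    """Pull features of images from the same camera together while pushing
    features from different cameras apart (assumes >1 image per camera)."""
    dists = torch.cdist(features, features)                      # (N, N) pairwise distances
    same_cam = camera_ids.unsqueeze(0) == camera_ids.unsqueeze(1)
    off_diag = ~torch.eye(len(camera_ids), dtype=torch.bool, device=camera_ids.device)
    pull = dists[same_cam & off_diag].mean()                     # term to minimise
    push = dists[~same_cam].mean()                               # term to maximise
    return pull - push


def attack_step(reid_model, images, camera_ids, pattern, mask, lr=0.01):
    """One optimisation step of the adversarial pattern (`pattern` must have
    requires_grad=True); a signed-gradient update is used here only for brevity."""
    adv_images = images * (1 - mask) + pattern * mask             # paste pattern onto clothes
    feats = F.normalize(reid_model(adv_images), dim=1)            # L2-normalised embeddings
    loss = pull_push_loss(feats, camera_ids)
    grad, = torch.autograd.grad(loss, pattern)
    return (pattern - lr * grad.sign()).clamp(0, 1).detach().requires_grad_()
```

Under these assumptions, iterating `attack_step` makes the adversary's images look alike within each camera but dissimilar across cameras, so the re-ID model fails to match them across views; the Impersonation Attack would instead add a term pulling the features toward a target person's gallery features.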

Citation

Our paper has been accepted at ICCV 2019. Please cite it if you're interested in our work!


@InProceedings{Wang_2019_ICCV,
    author    = {Wang, Zhibo and Zheng, Siyan and Song, Mengkai and Wang, Qian and Rahimpour, Alireza and Qi, Hairong},
    title     = {advPattern: Physical-World Attacks on Deep Person Re-Identification via Adversarially Transformable Patterns},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2019}
}
