Consistent Valid Physically-Realizable Adversarial Attack against Crowd-flow Prediction Models

This is the source code for our paper, "Consistent Valid Physically-Realizable Adversarial Attack against Crowd-flow Prediction Models".

This paper studies the robustness of crowd-flow prediction (CFP) models. We identify two properties of crowd-flow states (CFS) that allow inputs to a CFP model to be validated at run time. On the positive side, the identified properties are used to develop an efficient defense mechanism that effectively detects adversarial perturbations in the input and can be integrated with several CFP models irrespective of the baseline architecture. On the negative side, however, the same properties can be used to develop a consistent, valid, and physically plausible adversarial attack that remains undetected by the defense.
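The general idea of such a property-based defense is to reject inputs that violate constraints every genuine crowd-flow state must satisfy. The sketch below is purely illustrative and is not the defense from the paper: the function name `is_valid_cfs`, the `capacity` bound, and the integer-count check are all assumptions made for the example; the actual two properties are defined in the paper.

```python
import numpy as np

def is_valid_cfs(x, capacity=1000.0):
    """Illustrative run-time validity check for a crowd-flow state.

    The two checks below are placeholder assumptions, not the
    properties identified in the paper:
      1) flow counts are non-negative and bounded by a region capacity;
      2) genuine sensor counts are integers, so fractional values
         suggest an adversarial perturbation.
    """
    x = np.asarray(x, dtype=float)
    # Check 1: values must lie within a physically plausible range.
    if np.any(x < 0) or np.any(x > capacity):
        return False
    # Check 2: values must be (numerically) integral counts.
    if not np.allclose(x, np.round(x)):
        return False
    return True
```

In practice, a check like this would run on every input before it is passed to the CFP model, and any input failing it would be flagged as potentially adversarial. The attack described in the paper is constructed precisely so that its perturbed inputs still pass such validity checks.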

Citation

If you use this code in your work, please cite using the following BibTeX entry:

@ARTICLE{ali2023consistent,
  author={Ali, Hassan and Butt, Muhammad Atif and Filali, Fethi and Al-Fuqaha, Ala and Qadir, Junaid},
  journal={IEEE Transactions on Intelligent Transportation Systems}, 
  title={Consistent Valid Physically-Realizable Adversarial Attack Against Crowd-Flow Prediction Models}, 
  year={2023},
  volume={},
  number={},
  pages={1-16},
  doi={10.1109/TITS.2023.3343971}
}
