
PointAPA

The official implementation of our ESORICS 2024 paper "PointAPA: Towards Availability Poisoning Attacks in 3D Point Clouds", by Xianlong Wang, Minghui Li, Peng Xu, Wei Liu, Leo Yu Zhang, Shengshan Hu, and Yanjun Zhang.


Abstract

Recently, the realm of deep learning applied to 3D point clouds has witnessed significant progress, accompanied by a growing concern about emerging security threats to point cloud models. While adversarial attacks and backdoor attacks have gained continuous attention, the potentially more detrimental availability poisoning attack (APA) remains unexplored in this domain. In response, we propose the first APA approach in the 3D point cloud domain (PointAPA), which utilizes class-wise rotations as shortcuts for poisoning, thus satisfying efficiency, effectiveness, concealment, and the black-box setting. Drawing inspiration from the prevalence of shortcuts in deep neural networks, we exploit the impact of rotation in 3D data augmentation on feature extraction in point cloud networks. This rotation serves as a shortcut, allowing us to apply varying degrees of rotation to training samples from different categories, creating effective shortcuts that contaminate the training process. The natural and efficient rotation operation makes our attack highly inconspicuous and easy to launch. Furthermore, our poisoning scheme is more concealed because it keeps the labels clean (i.e., clean-label APA). Extensive experiments on benchmark 3D point cloud datasets (including real-world datasets for autonomous driving) provide compelling evidence that our approach largely compromises 3D point cloud models, reducing model accuracy by 40.6% to 73.1% compared to clean training. Additionally, our method demonstrates resilience against statistical outlier removal (SOR) and three types of random data augmentation defense schemes.
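
To make the shortcut concrete, here is a minimal sketch of class-wise rotation poisoning, assuming rotation about the z-axis and an angle of class_index * interval degrees per class; the axis choice, the linear angle mapping, and the function names are illustrative assumptions rather than the repository's exact implementation.

import numpy as np

def rotate_z(points, angle_deg):
    # Rotate an (N, 3) point cloud about the z-axis by angle_deg degrees.
    theta = np.deg2rad(angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T

def poison_class_wise(clouds, labels, interval=42.0):
    # Apply a class-dependent rotation to each sample while keeping its label
    # unchanged (clean-label); the rotation angle is correlated with the class,
    # so it acts as a shortcut feature during training.
    return [rotate_z(pc, label * interval) for pc, label in zip(clouds, labels)]

Because the labels are untouched and a rigid rotation preserves the shape's geometry, the poisoned samples look natural while still steering the model toward the label-correlated rotation shortcut.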

Latest Update

Date Event
2024/09/05 The official paper is now online at PointAPA!
2024/06/03 We have released the implementation of PointAPA!
2024/03/19 PointAPA is accepted by ESORICS 2024 (Spring Cycle)!

Start Running PointAPA

  • Get code
git clone https://github.com/wxldragon/PointAPA.git
  • Build environment
cd PointAPA
conda create -n PointAPA python=3.9
conda activate PointAPA
pip install -r requirements.txt
  • Download datasets

    • Please download the ModelNet dataset at [ModelNet] and the ShapeNetPart dataset at [ShapeNetPart]
    • Unzip the dataset .zip files into PointAPA/clean_data
  • Generate PointAPA (i.e., poisoned) datasets

python poison_generation.py --dataset ModelNet10 --interval 42 
  • Perform clean training
python train.py --dataset ModelNet10 --target_model pointnet_cls
  • Perform poison training to evaluate PointAPA's effectiveness
python train.py --dataset ModelNet10 --target_model pointnet_cls --poison_train --interval 42
  • Perform poison training under defenses to evaluate PointAPA's robustness (see the illustrative sketch after this list)
python train.py --dataset ModelNet10 --target_model pointnet_cls --poison_train --interval 42 --defense --aug_type sca
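
For context on the defense flag above, the sketch below shows one plausible random data augmentation defense (random scaling applied per sample at training time); the interpretation of --aug_type sca as scaling, the scale range, and the function name are assumptions for illustration, not the repository's exact code.

import numpy as np

def random_scale(points, low=0.8, high=1.25):
    # Uniformly scale an (N, 3) point cloud by a random factor; one common
    # form of random data augmentation used as a training-time defense.
    return points * np.random.uniform(low, high)

# Example: augment each poisoned sample before feeding it to the model.
# augmented = [random_scale(pc) for pc in poisoned_clouds]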

BibTeX

If you find PointAPA both interesting and helpful, please consider citing us in your research or publications:

@inproceedings{wang2024pointapa,
  title={PointAPA: Towards Availability Poisoning Attacks in 3D Point Clouds},
  author={Wang, Xianlong and Li, Minghui and Xu, Peng and Liu, Wei and Zhang, Leo Yu and Hu, Shengshan and Zhang, Yanjun},
  booktitle={Proceedings of the 29th European Symposium on Research in Computer Security (ESORICS'24)},
  year={2024}
}
