This repository provides the official PyTorch implementation of the following paper:
PointCAT: Contrastive Adversarial Training for Robust Point Cloud Recognition
Qidong Huang1, Xiaoyi Dong1, Dongdong Chen2, Hang Zhou3, Weiming Zhang1, Kui Zhang1, Gang Hua4, Nenghai Yu1
1University of Science and Technology of China, 2Microsoft Cloud AI, 3Simon Fraser University, 4Wormpex AI Research
This code is tested with Python 3.7 and CUDA 10.3. To set up a conda environment, please use the following commands:
conda env create -f environment.yaml
conda activate pointcat
Download the aligned ModelNet40 and ShapeNetPart datasets in their point cloud format and unzip them into your own dataset path. Alternatively, you can run the bash script:
sh download.sh
The datasets will be downloaded to ./data by default.
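The downloaded point clouds are typically stored as plain-text files with one point per line. A minimal loader sketch, assuming a comma-separated x,y,z (optionally followed by normals) layout per line; the file created here is a synthetic stand-in, not a real dataset sample:

```python
import numpy as np
import tempfile, os

def load_point_cloud(path, num_points=1024):
    """Load a point cloud from a comma-separated text file.

    Assumes one point per line: x,y,z optionally followed by normals.
    Keeps only the xyz coordinates and truncates to num_points.
    """
    pts = np.loadtxt(path, delimiter=",").astype(np.float32)
    pts = pts[:, :3]  # drop normal columns if present
    return pts[:num_points]

# Demo on a synthetic file standing in for a real dataset sample.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    for _ in range(8):
        f.write(",".join(f"{v:.4f}" for v in np.random.randn(6)) + "\n")
    tmp_path = f.name

cloud = load_point_cloud(tmp_path, num_points=8)
print(cloud.shape)  # (8, 3)
os.remove(tmp_path)
```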
Download the pretrained models we provided for attack evaluation and unzip them at ./checkpoint. The available models include PointNet, PointNet++, DGCNN, and CurveNet.
You can directly evaluate our released pretrained models. For example, please run the following command for PointNet on ModelNet40:
python tester.py \
--data_path /PATH/TO/YOUR/DATASET/ \
--dataset ModelNet40 \
--defended_model pointnet_cls \
--batch_size 16 \
--mode test_normal \
--checkpoint_dir ./checkpoints/pointnet_pointcat_mn.pth
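Under the hood, the `test_normal` mode reduces to a standard top-1 classification accuracy computation over the test set. A generic NumPy sketch of that metric (an illustration, not the repository's actual tester code):

```python
import numpy as np

def accuracy(logits, labels):
    """Top-1 classification accuracy from raw logits.

    logits: (N, num_classes) array; labels: (N,) integer array.
    """
    preds = np.argmax(logits, axis=1)
    return float(np.mean(preds == labels))

# Toy example: 4 samples, 3 classes.
logits = np.array([[2.0, 0.1, 0.0],
                   [0.0, 1.5, 0.2],
                   [0.3, 0.2, 0.1],
                   [0.0, 0.0, 3.0]])
labels = np.array([0, 1, 2, 2])
print(accuracy(logits, labels))  # 0.75
```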
To train PointNet, please run the following command:
python trainer.py \
--experiment_dir pn_test \
--data_path /PATH/TO/YOUR/DATASET/ \
--dataset ModelNet40 \
--defended_model pointnet_cls \
--eps 0.04 \
--alpha 8. \
--beta 0.5 \
--use_cosine_similarity \
--inner_loop_nums 4 \
--batch_size 64 \
--init_search_iters 500 \
--update_search_iters 10 \
--lr_fp 0.001 \
--use_multi_gpu
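The --eps flag bounds the per-coordinate magnitude of the adversarial perturbations searched in the inner loop. A minimal NumPy sketch of that L-infinity projection constraint (an illustration of the general idea, not PointCAT's actual inner-loop update):

```python
import numpy as np

def project_linf(points, adv_points, eps=0.04):
    """Project an adversarial point cloud back into the L-inf eps-ball
    around the clean points, clipping each coordinate independently."""
    delta = np.clip(adv_points - points, -eps, eps)
    return points + delta

rng = np.random.default_rng(0)
clean = rng.standard_normal((1024, 3)).astype(np.float32)
# A candidate adversarial cloud that may have drifted outside the ball.
adv = clean + rng.uniform(-0.1, 0.1, size=clean.shape).astype(np.float32)
adv = project_linf(clean, adv, eps=0.04)
print(np.abs(adv - clean).max() <= 0.04 + 1e-6)  # True
```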
If you find this work useful for your research, please cite our paper:
@article{huang2022pointcat,
title={PointCAT: Contrastive Adversarial Training for Robust Point Cloud Recognition},
author={Huang, Qidong and Dong, Xiaoyi and Chen, Dongdong and Zhou, Hang and Zhang, Weiming and Zhang, Kui and Hua, Gang and Yu, Nenghai},
journal={arXiv preprint arXiv:2209.07788},
year={2022}
}
The code is released under MIT License (see LICENSE file for details).