This repository contains code for our ICLR 2020 paper "Certified Robustness for Top-k Predictions against Adversarial Perturbations via Randomized Smoothing".

Required Python packages: numpy, scipy, statsmodels.

Code usage: the certification script computes the certified radius for top-k predictions. It reads a frequency file (each line contains the frequency of each label for one example, i.e., the frequencies of labels 1, 2, ..., c; the last column contains the label that we aim to certify; "cifar0.25.txt" is an example file) and saves the results.
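As a minimal sketch, the frequency-file format described above (c per-label counts per line, followed by the target label in the last column) could be parsed as follows; the function name and the integer-count assumption are ours, not the repository's:

```python
def parse_frequency_lines(lines):
    """Parse frequency-file lines of the form "n_1 n_2 ... n_c target":
    the first c columns are the per-label counts for one example and the
    last column is the label to certify. (Format inferred from the
    description above; the function name is hypothetical.)"""
    examples = []
    for line in lines:
        fields = line.split()
        if not fields:
            continue  # skip blank lines
        *counts, target = (int(v) for v in fields)
        examples.append((counts, target))
    return examples

# For a file such as cifar0.25.txt: parse_frequency_lines(open("cifar0.25.txt"))
examples = parse_frequency_lines(["5 3 992 2", "10 980 10 1"])
```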

Given a base classifier f, Gaussian noise ε ~ N(0, σ²I), and an example x, we sample n noise vectors ε1, ε2, ..., εn from N(0, σ²I). The frequency for label l is computed as nl = ∑i 𝟙( f( x + εi ) = l ), where 𝟙 is the indicator function.
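The Monte Carlo frequency estimate above can be sketched as follows; the toy base classifier `f` is purely illustrative, not a model from the repository:

```python
import numpy as np

def label_frequencies(f, x, n, sigma, num_classes, rng=None):
    """Estimate n_l = sum_i 1[f(x + eps_i) = l] by sampling n noise
    vectors eps_i ~ N(0, sigma^2 I) and counting predicted labels."""
    rng = np.random.default_rng() if rng is None else rng
    counts = np.zeros(num_classes, dtype=int)
    for _ in range(n):
        eps = rng.normal(0.0, sigma, size=x.shape)  # eps ~ N(0, sigma^2 I)
        counts[f(x + eps)] += 1
    return counts

# Toy base classifier (hypothetical): label 1 iff the input mean is non-negative.
f = lambda z: int(z.mean() >= 0)
x = np.full(10, 0.5)
counts = label_frequencies(f, x, n=1000, sigma=0.25, num_classes=2)
```

Each row of the frequency file corresponds to one such `counts` vector, followed by the label to certify.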

You can directly run: python3 --src cifar0.25.txt --dst result.txt --alpha 0.001 --sigma 0.25 --k 3, where result.txt is the file that saves the results; it contains two columns: the first column is the example id and the second column is the certified radius. The options --alpha, --sigma, and --k specify the values of α, σ, and k in the paper (please refer to the paper for details). When estimating the upper and lower bounds of the label probabilities, we adopt SimuEM (please refer to our paper for details). We also ran the code ourselves; "result.txt" is the result file we obtained.
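The paper's top-k radius requires SimuEM to bound several label probabilities simultaneously, which we do not reimplement here. As a hedged sketch of the underlying idea, for the special case k=1 the certified radius reduces to the well-known σ·Φ⁻¹(p̲) form (Cohen et al.), where p̲ is a one-sided Clopper-Pearson lower bound on the top label's probability; statsmodels provides that bound:

```python
from scipy.stats import norm
from statsmodels.stats.proportion import proportion_confint

def certified_radius_top1(count_top, n, sigma, alpha):
    """k=1 sketch only: lower-bound the top label's probability with a
    one-sided Clopper-Pearson interval (alpha is the failure probability),
    then apply the radius sigma * Phi^{-1}(p_lower). The paper's top-k
    certificate additionally bounds the other labels' probabilities
    via SimuEM, which this sketch does not cover."""
    p_lower = proportion_confint(count_top, n, alpha=2 * alpha, method="beta")[0]
    if p_lower <= 0.5:
        return None  # abstain: cannot certify at this confidence level
    return sigma * norm.ppf(p_lower)

# Example: the top label was observed 990 times out of n = 1000 samples.
r = certified_radius_top1(count_top=990, n=1000, sigma=0.25, alpha=0.001)
```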

Note that when we conducted the experiments, we used the pre-trained models from previous work, which can be found at the following address:


If you use this code, please cite the following paper:

@inproceedings{jia2020certified,
  title={Certified Robustness for Top-k Predictions against Adversarial Perturbations via Randomized Smoothing},
  author={Jinyuan Jia and Xiaoyu Cao and Binghui Wang and Neil Zhenqiang Gong},
  booktitle={International Conference on Learning Representations},
  year={2020}
}

