Natural Adversarial Examples

We introduce natural adversarial examples -- real-world, unmodified, and naturally occurring examples that cause machine learning model performance to significantly degrade.

Download the natural adversarial example dataset ImageNet-A for image classifiers here.

Download the natural adversarial example dataset ImageNet-O for out-of-distribution detectors here.
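
As a usage illustration (not part of the repository's own instructions), below is a minimal sketch of evaluating a pretrained torchvision ResNet-50 on ImageNet-A. It assumes the archive is extracted to ./imagenet-a and that a standard imagenet_class_index.json file (the common index -> [WordNet ID, name] mapping) is on hand to translate ImageNet-A's 200 WordNet-ID folders into ImageNet-1K label indices; adapt the paths to your setup.

import json

import torch
import torchvision.models as models
import torchvision.transforms as transforms
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder

# Standard ImageNet preprocessing.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumed mapping file with entries like {"0": ["n01440764", "tench"], ...}.
with open("imagenet_class_index.json") as f:
    wnid_to_idx = {wnid: int(i) for i, (wnid, _) in json.load(f).items()}

# ImageNet-A folders are WordNet IDs covering a 200-class subset of
# ImageNet-1K; map each local (sorted-folder) label to its 1K index.
dataset = ImageFolder("imagenet-a", transform=preprocess)
local_to_1k = torch.tensor([wnid_to_idx[c] for c in dataset.classes])

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1).eval()

correct = total = 0
with torch.no_grad():
    for images, labels in DataLoader(dataset, batch_size=64):
        preds = model(images).argmax(dim=1)
        correct += (preds == local_to_1k[labels]).sum().item()
        total += labels.numel()
print(f"ImageNet-A top-1 accuracy: {correct / total:.1%}")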

Figure: Natural adversarial examples from ImageNet-A and ImageNet-O. The black text is the actual class, and the red text is a ResNet-50 prediction and its confidence. ImageNet-A contains images that classifiers should be able to classify, while ImageNet-O contains anomalies of unforeseen classes which should result in low-confidence predictions. ImageNet-1K models do not train on examples from the “Photosphere” or “Verdigris” classes, so these images are anomalous. Many natural adversarial examples lead to wrong predictions despite having no adversarial modifications, as they occur naturally.
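
For ImageNet-O, a simple baseline along the lines used in the paper scores each image by its negative maximum softmax probability (MSP), so anomalies should receive higher scores, and summarizes detection quality with the area under the precision-recall curve. The sketch below assumes ImageNet-O is extracted to ./imagenet-o and that held-out in-distribution ImageNet-1K images are available under a hypothetical ./imagenet-val folder.

import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as transforms
from sklearn.metrics import average_precision_score
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1).eval()

def anomaly_scores(folder):
    # Negative max softmax probability: higher score = more anomalous.
    loader = DataLoader(ImageFolder(folder, transform=preprocess), batch_size=64)
    scores = []
    with torch.no_grad():
        for images, _ in loader:
            probs = F.softmax(model(images), dim=1)
            scores.append(-probs.max(dim=1).values)
    return torch.cat(scores)

# ImageNet-O images are the anomalies (label 1); held-out ImageNet-1K
# validation images serve as in-distribution negatives (label 0).
ood = anomaly_scores("imagenet-o")    # path assumed
ind = anomaly_scores("imagenet-val")  # hypothetical in-distribution folder
labels = torch.cat([torch.ones_like(ood), torch.zeros_like(ind)]).numpy()
scores = torch.cat([ood, ind]).numpy()
print(f"AUPR (anomaly = positive): {average_precision_score(labels, scores):.3f}")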

Citation

If you find this useful in your research, please consider citing:

@article{hendrycks2021nae,
  title={Natural Adversarial Examples},
  author={Dan Hendrycks and Kevin Zhao and Steven Basart and Jacob Steinhardt and Dawn Song},
  journal={CVPR},
  year={2021}
}
