Adversarial Example Defenses
A catalog of defenses against adversarial examples, paired with attacks that break them (where applicable).
See the live site at https://www.robust-ml.org/
This is a community-maintained document. Feel free to contribute by opening an issue or submitting a pull request.
More information is available in the FAQ: https://www.robust-ml.org/faq/
Run bundle install to fetch dependencies, then bundle exec jekyll build to build the site.
For development, it can be handy to run bundle exec jekyll serve --watch, which rebuilds on changes and serves a preview of the website at http://localhost:4000.
Licensed under CC BY-SA 4.0.