PatchFool implementation #2163

Open · wants to merge 16 commits into main
Commits on Dec 22, 2023

  1. Add initial PatchFool implementation

    Add a new evasion attack on vision transformers.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 9c92350
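    For context, a minimal usage sketch of the new attack. The import path,
    class name, and constructor parameters are assumptions based on this
    PR's description, not a confirmed ART API:

    ```python
    import numpy as np

    # Hypothetical usage -- module path, class name, and parameters are assumed.
    from art.attacks.evasion import PatchFool

    # `classifier` is an ART PyTorchClassifier wrapping a vision transformer
    # (see the DeiT commit below); `x_test` is an NCHW float array in [0, 1].
    attack = PatchFool(classifier, patch_size=16, max_iter=250)
    x_adv = attack.generate(x=x_test.astype(np.float32))
    ```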
  2. Add ViT classifier

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · ec385c3
  3. Fix pylint errors

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 57b54e4
  4. Skip class token

    Skip the class token when calculating the most influential image
    patch.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · a3ae044
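    A sketch of what skipping the class token likely looks like; the tensor
    shapes and the head-averaging step are assumptions:

    ```python
    import torch

    # `att` is one layer's attention, shape (batch, heads, tokens, tokens);
    # token 0 is the class token and must not be picked as an image patch.
    att = att.mean(dim=1)       # average over heads -> (batch, tokens, tokens)
    influence = att.sum(dim=1)  # total attention each token receives
    patch_idx = influence[:, 1:].argmax(dim=1)  # most influential *image* patch
    # add 1 to patch_idx when indexing back into the token dimension
    ```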
  5. Add preprocessing before feature extraction

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · c9cc837
  6. Update classifier and algorithm steps

    Update the classifier to use DeiT from the timm library.
    Fix algorithm details.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 8f0ed1b
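    A sketch of how the classifier might be built after this change; the
    exact DeiT variant is an assumption:

    ```python
    import timm
    import torch
    from art.estimators.classification import PyTorchClassifier

    # "deit_tiny_patch16_224" is an assumed variant; any timm DeiT model fits.
    model = timm.create_model("deit_tiny_patch16_224", pretrained=True)
    classifier = PyTorchClassifier(
        model=model,
        loss=torch.nn.CrossEntropyLoss(),
        input_shape=(3, 224, 224),
        nb_classes=1000,
        clip_values=(0.0, 1.0),
    )
    ```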
  7. Use NLL for attention loss

    - Calculate the attention loss as a negative log likelihood.
    - Clamp the perturbations after random initialisation.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 1f51b79
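    A sketch of both changes, assuming `att` is one layer's attention of
    shape (batch, heads, tokens, tokens) and `p` is the token index of the
    adversarial patch (class-token offset already applied):

    ```python
    import torch

    # Attention loss as a negative log likelihood: push attention mass
    # toward the adversarial patch token.
    att = att.mean(dim=1)                                   # (batch, tokens, tokens)
    loss_att = -torch.log(att[:, :, p].sum(dim=1) + 1e-12)  # per-sample NLL

    # Clamp right after random initialisation so the perturbed patch
    # starts inside the valid input range.
    delta = torch.randn_like(x_patch)
    delta = torch.clamp(x_patch + delta, 0.0, 1.0) - x_patch
    ```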
  8. Fix normalisation

    - Fix input normalisation and scaling.
    - Fix patch application so it happens only once, after the final iteration.
    - Add a skip_loss_att option.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · d0c4ba7
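    One plausible reading of the normalisation fix: keep the adversarial
    image in the raw [0, 1] range and normalise only at the model boundary
    (the statistics below are the standard ImageNet values, an assumption):

    ```python
    import torch

    mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
    std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

    def normalise(x: torch.Tensor) -> torch.Tensor:
        """Scale a [0, 1] image batch into the model's normalised input space."""
        return (x - mean) / std

    logits = model(normalise(x_adv))  # the model only sees normalised inputs
    ```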
  9. Add tqdm for the attack loop

    Use a tqdm progress bar to show the attack iterations.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 2c312a9
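    The loop presumably looks something like this (the `verbose` flag
    arrives in a later commit):

    ```python
    from tqdm.auto import trange

    for _ in trange(max_iter, desc="PatchFool", disable=not verbose):
        ...  # one optimisation step: forward pass, losses, gradient update
    ```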
  10. Move attention weights calculation to PyTorchEstimator

    - Move get_attention_weights to PyTorchEstimator and generalise it
      by making return_nodes a user-provided list of strings.

    - Define the patch size on the attack side.
    - Remove PyTorchClassifierDeiT and reuse the existing PyTorchClassifier.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · da6bb3a
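    A sketch of the generalised helper, assuming it is built on
    torchvision's feature-extraction utility; the example node names in the
    docstring are illustrative:

    ```python
    from torchvision.models.feature_extraction import create_feature_extractor

    def get_attention_weights(model, x, return_nodes):
        """Return the intermediate outputs named in `return_nodes`, a list of
        graph node names supplied by the caller (e.g. attention softmax nodes)."""
        extractor = create_feature_extractor(model, return_nodes=return_nodes)
        features = extractor(x)
        return [features[node] for node in return_nodes]
    ```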
  11. Update the attack parameters checks

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 9f6ad4b
  12. Update docstrings

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 71d51de
  13. Add verbose parameter

    Add a verbose option for tqdm.
    Remove the unused variable i_max_iter.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · b8e6c28
  14. Remove layer as an internal function parameter

    Use the patch_layer attribute directly.
    
    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · a69d93f
  15. Add PatchFool attack example notebook

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · 28a0ff2
  16. Add PatchFool attack tests

    Signed-off-by: Teodora Sechkova <tsechkova@vmware.com>
    sechkova committed Dec 22, 2023 · da05de1
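    A sketch of the kind of test this commit likely adds; the fixture and
    the import path are hypothetical:

    ```python
    import numpy as np

    def test_patchfool_generate(vit_classifier):  # hypothetical fixture
        from art.attacks.evasion import PatchFool  # assumed import path

        attack = PatchFool(vit_classifier, patch_size=16, max_iter=5)
        x = np.random.rand(2, 3, 224, 224).astype(np.float32)
        x_adv = attack.generate(x=x)

        assert x_adv.shape == x.shape
        assert not np.allclose(x_adv, x)  # the attack actually perturbed the input
    ```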