Include python/pytorch version for MNIST reproducibility #25

Closed
latorrefabian opened this issue Aug 13, 2022 · 1 comment

@latorrefabian

Hi! I am having a hard time reproducing the results (on MNIST, for example), and I have found that they differ when I change the PyTorch version. I observe the following:

- PyTorch 1.12: training on MNIST reaches 0.98 training accuracy, but robust test accuracy is zero.
- PyTorch 1.4: training on MNIST reaches 0.95 training accuracy, and robust test accuracy is 0.88.

I think the code was originally run with PyTorch 1.0; I am trying to find out what is breaking it in PyTorch 1.12. It would be great to make it clearer which versions to use to reproduce the results.
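For reference, a hypothetical pinned environment matching the working configuration reported above (the torchvision version is my assumption; 0.5.0 is the release paired with torch 1.4.0):

```
# requirements.txt -- assumed pins, not verified against this repo
torch==1.4.0
torchvision==0.5.0
```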

latorrefabian changed the title from "Include python/pytorch version for reproducibility" to "Include python/pytorch version for MNIST reproducibility" on Aug 14, 2022
@leslierice1
Collaborator

I'm not sure why the results would differ between PyTorch versions, but I'm guessing that the model is catastrophically overfitting in the first case, resulting in zero robust test accuracy. I'd check whether this is occurring during training by measuring the PGD accuracy on the first training batch of each epoch (i.e. see how we do this in the CIFAR10 training code here). I'd also try reducing the alpha parameter to avoid catastrophic overfitting, if that is indeed what is occurring.
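For concreteness, here is a minimal sketch of such a check, written from scratch rather than copied from this repo: it runs a short L-infinity PGD attack on one batch and reports the resulting accuracy. `model`, `epsilon`, and `alpha` stand in for the training script's objects, and inputs are assumed to lie in [0, 1] (unnormalized MNIST):

```python
import torch
import torch.nn.functional as F

def pgd_accuracy(model, X, y, epsilon, alpha, n_iters=10):
    """Accuracy of `model` on batch (X, y) under an L-infinity PGD attack."""
    # Random start inside the epsilon-ball.
    delta = torch.empty_like(X).uniform_(-epsilon, epsilon).requires_grad_(True)
    for _ in range(n_iters):
        loss = F.cross_entropy(model(X + delta), y)
        # Gradient w.r.t. delta only, so model parameter grads are untouched.
        grad, = torch.autograd.grad(loss, delta)
        with torch.no_grad():
            delta += alpha * grad.sign()                   # ascend the loss
            delta.clamp_(-epsilon, epsilon)                # stay in the eps-ball
            delta.copy_(torch.clamp(X + delta, 0, 1) - X)  # keep X + delta in [0, 1]
    with torch.no_grad():
        return (model(X + delta).argmax(1) == y).float().mean().item()
```

Calling this on the first training batch at the start of each epoch, e.g. `pgd_accuracy(model, X, y, epsilon, alpha=epsilon / 4)` (epsilon/4 being a common PGD step size), and watching for the value to collapse to ~0 while clean training accuracy stays high is the telltale sign of catastrophic overfitting.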
