
Unexpected CIFAR10 Results #21

Closed
maordikter1 opened this issue Jan 7, 2024 · 2 comments

Comments

@maordikter1

Hi!
First, thanks for your great work!
I followed your instructions and ran labo_train on CIFAR10, but I'm running into an issue with the results. These are the numbers I got:
1 shot: 86.34% on val, 85.39% on test.
2 shot: 90.64% on val, 89.68% on test.
4 shot: 91.04% on val, 91.10% on test.
8 shot: 93.44% on val, 92.9% on test.
16 shot: 94.84% on val, 94.80% on test.

However, these results don't match the ones reported in the paper, especially at the lower shot counts.
Do you have any idea why this might be happening?

Thanks

@YueYANG1996
Owner

How many epochs did you train for? CIFAR-10 needs to be trained longer; the validation accuracy keeps increasing slowly (see the figure below). Also, could you let me know whether you changed any hyperparameters in the config file?
[Screenshot, 2024-01-07: CIFAR-10 validation accuracy curve, still rising slowly as training continues]
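In case it helps, here is a minimal, hypothetical sketch of the kind of setting to check; the trainer setup and field values below are assumptions for illustration, not necessarily the repo's actual config keys or entry point:

```python
# Hypothetical illustration (assumes a PyTorch Lightning-style trainer; the repo's
# actual config keys and training entry point may differ).
import pytorch_lightning as pl

trainer = pl.Trainer(
    max_epochs=5000,             # CIFAR-10 needs to train much longer than a small default
    check_val_every_n_epoch=50,  # val accuracy keeps creeping up, so keep validating
)
# trainer.fit(model, datamodule)  # model/datamodule come from the repo's own setup
```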

@maordikter1
Author

Thanks for the quick response!
I think I solved the problem: it was in the additional preprocessing code for CIFAR10. I deleted the last two lines, which save the splits; it seems they were overwriting the provided splits and modifying them.
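For anyone who hits the same thing, here is a rough sketch of the failure mode; the paths and helper below are illustrative, not the repo's exact code:

```python
# Hypothetical sketch of the issue above: the CIFAR-10 preprocessing script ended by
# dumping its own freshly generated splits, which clobbered the split files shipped
# with the repo. Paths and names here are illustrative.
import json
from pathlib import Path

SPLIT_DIR = Path("datasets/CIFAR10/splits")  # assumed location of the provided splits

def save_splits(splits: dict, out_dir: Path = SPLIT_DIR) -> None:
    """Write {'train': [...], 'val': [...], 'test': [...]} index lists to JSON."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for name, indices in splits.items():
        (out_dir / f"{name}.json").write_text(json.dumps(indices))

# The last two lines of the preprocessing script did roughly the equivalent of:
#   new_splits = regenerate_splits(...)  # regenerated splits, not the ones from the paper
#   save_splits(new_splits)              # <-- overwrites the provided split files on disk
# Removing (or guarding) that final save keeps the provided splits intact.
```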

Thanks
