
About training the model with 50 labeled samples per class of CIFAR-100 (55 known classes) in Table 1 #4

Open
jingyang2017 opened this issue Jan 17, 2022 · 0 comments

jingyang2017 commented Jan 17, 2022

Thanks for the excellent work.
I followed the README and ran `python main.py --dataset cifar100 --num-labeled 50 --out Results --num-super 10 --arch wideresnet --lambda_oem 0.1 --lambda_socr 1.0 --batch-size 64 --lr 0.03 --expand-labels --seed 0 --mu 2`.
The final performance after epoch 512 is:
[image: openmix]
Does Table 1 report the best performance on the test set, or the mean of the last 20 epochs?
When run with AMP, the result is:
[image: op]
Is this normal?

jingyang2017 changed the title from "Abou the CIFAR100 running performance" to "About the CIFAR100 performance in table1" on Jan 17, 2022
jingyang2017 changed the title from "About the CIFAR100 performance in table1" to "About Train the model by 50 labeled data per class of CIFAR-100 dataset, 55 known classes in table1" on Jan 17, 2022