
cifar10 quality much worse than DDPM paper #32

Closed

richardrl opened this issue Jun 23, 2022 · 4 comments

richardrl commented Jun 23, 2022

Here are my cifar10 32x32 results, trained for ~10 hours on 7x NVIDIA GeForce RTX 2080 Ti GPUs (11GB VRAM each) with:

python3 -m torch.distributed.run --nproc_per_node 7 train_unconditional.py --dataset="cifar10" --resolution=32 --output_dir="cifar10-ddpm-" --batch_size=16 --num_epochs=100 --gradient_accumulation_steps=1 --lr=1e-4 --warmup_steps=500

cifar10-ddpm.zip

The quality is worse than in the DDPM paper, and according to a fellow researcher it is also worse than the lucidrains repo. Perhaps there is still a bug or a missing setting somewhere?
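
For context, a back-of-the-envelope check (assuming --batch_size is per process, so the effective batch here is 7 × 16 = 112) suggests this run covers far fewer optimization steps than the DDPM paper, which trained CIFAR-10 at batch size 128 for 800k steps. Undertraining alone could explain much of the gap:

```python
# Rough step-count estimate for the run above. Assumes --batch_size is
# per process, so the effective batch is num_gpus * per_gpu_batch.
num_images = 50_000          # CIFAR-10 training set size
per_gpu_batch = 16
num_gpus = 7
num_epochs = 100

effective_batch = num_gpus * per_gpu_batch        # 112
steps_per_epoch = num_images // effective_batch   # 446
total_steps = steps_per_epoch * num_epochs        # 44,600

# DDPM paper (Ho et al., 2020): batch size 128 for 800k steps on CIFAR-10,
# so this run uses well under 10% of that optimization budget.
print(f"{effective_batch=} {steps_per_epoch=} {total_steps=}")
```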

@patrickvonplaten (Contributor)

Thanks a lot for trying out the training @richardrl! That's super useful - do you happen to have any training logs as well?

@richardrl (Author)

Hi @patrickvonplaten, what do you mean by training logs? I didn't do a hyperparameter sweep, if that's what you're asking.

@patrickvonplaten (Contributor)

I thought you might have some training logs from your run with the lucidrains repo, or some loss curves from your training run here, but no worries if not :-)
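
(For anyone hitting this later: a minimal sketch of how one might record loss curves with TensorBoard during a run like this. The training loop below is a stand-in, not the actual train_unconditional.py script; only the SummaryWriter calls matter.)

```python
# Minimal sketch: log a scalar loss per step with TensorBoard so the
# curves can be shared in an issue like this one.
import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="logs/cifar10-ddpm")
model = torch.nn.Linear(8, 8)               # stand-in for the UNet
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(100):                     # stand-in training loop
    x = torch.randn(16, 8)
    loss = torch.nn.functional.mse_loss(model(x), x)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    writer.add_scalar("train/loss", loss.item(), step)

writer.close()
# View with: tensorboard --logdir logs
```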

@patrickvonplaten (Contributor)

Closing for now
