
What about the performance on the MNIST dataset #1

Closed
1999kevin opened this issue Apr 21, 2023 · 6 comments
@1999kevin

What about the performance on the MNIST dataset? And what about the GPU consumption?

@thorinf
Owner

thorinf commented Apr 21, 2023

I'm updating the readme now. Here is an output after 100 epochs:

If you use the same noise at every sampling iteration, the images do not switch between different digits as much. This is the intended 'self-consistency' property from the paper: generations on the same trajectory produce the same sample. However, the multi-step sampling algorithm in the paper does add different noise each iteration, so it's really up to you which you prefer.

The GPU consumption is pretty low; I have only 2GB of VRAM. It could probably be a lot better with a smaller network - I think the UNet I've used is too big.
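The same-noise-vs-fresh-noise trade-off above can be sketched as a small sampler. This is a NumPy sketch of multi-step consistency sampling in the style of the paper's algorithm, not this repo's actual code: the function `f`, the schedule `sigmas`, and the `fixed_noise` flag are all illustrative assumptions.

```python
import numpy as np

def multistep_sample(f, sigmas, shape, fixed_noise=False, sigma_min=0.002, seed=0):
    """Multi-step consistency sampling (sketch).

    f(x, sigma) is a consistency model mapping a noisy input directly to a
    sample estimate. `sigmas` is a decreasing noise schedule starting at the
    maximum noise level. With fixed_noise=True the same z is re-injected at
    every iteration, which keeps generations on one trajectory (the
    'self-consistency' behaviour discussed above); with fixed_noise=False
    fresh noise is drawn each step, as in the paper's sampler.
    """
    rng = np.random.default_rng(seed)
    x = sigmas[0] * rng.standard_normal(shape)
    sample = f(x, sigmas[0])                # one-step generation
    z = rng.standard_normal(shape)          # reused when fixed_noise=True
    for sigma in sigmas[1:]:
        if not fixed_noise:
            z = rng.standard_normal(shape)  # fresh noise each iteration
        # re-noise the current estimate back up to level sigma
        x = sample + np.sqrt(sigma**2 - sigma_min**2) * z
        sample = f(x, sigma)                # denoise again in one jump
    return sample
```

With `fixed_noise=True`, every step perturbs the estimate along the same random direction, so successive refinements stay on one trajectory rather than hopping between digits.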

@1999kevin
Author

Thanks for your quick answers. The performance is pretty good. I see that you use an L2 loss; I guess this also reduces the GPU consumption.

@thorinf
Owner

thorinf commented Apr 21, 2023

Here is sampling where the same noise is injected at every iteration.

Yeah, I think L2 is fine for this task. The goals of the implementation are (a) for me to learn, and (b) to create something minimal.
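The L2 objective under discussion can be sketched as follows. This is only an illustrative NumPy version of the generic consistency-training idea (student at a higher noise level matches a teacher at the adjacent lower level on the same trajectory); the function names and arguments are assumptions, not this repo's API.

```python
import numpy as np

def l2_consistency_loss(f_student, f_teacher, x0, sigma_n, sigma_np1, rng):
    """L2 consistency loss (sketch).

    Both noisy inputs share the same noise draw z, so they lie on the same
    trajectory from the clean sample x0. The student, evaluated at the
    higher noise level sigma_{n+1}, is regressed onto the teacher's output
    at the adjacent lower level sigma_n with a plain squared error.
    """
    z = rng.standard_normal(x0.shape)
    x_higher = x0 + sigma_np1 * z   # same trajectory, more noise
    x_lower = x0 + sigma_n * z      # same trajectory, less noise
    diff = f_student(x_higher, sigma_np1) - f_teacher(x_lower, sigma_n)
    return np.mean(diff ** 2)       # L2 (squared error) objective
```

The plain squared error keeps the loss computation cheap; the memory cost is dominated by the network itself rather than the loss.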

@1999kevin
Author

1999kevin commented Apr 21, 2023

I really appreciate your implementation on MNIST!

@thorinf
Owner

thorinf commented Apr 21, 2023

Please give the project a watch or a star if you can. I'm currently looking for work, and a bit of GitHub exposure might help.

@1999kevin
Author

Sure, I'm glad to help.
