FID for GAN trained on MNIST #24

Closed
NagabhushanSN95 opened this issue Jul 24, 2019 · 4 comments
@NagabhushanSN95

Hi,
I'm training a DCGAN on the MNIST dataset and want to compute the FID for my model. In this case, should I use the Inception network itself, or should I use a different classifier trained on the MNIST dataset?

@mhex
Collaborator

mhex commented Jul 24, 2019

Hi,

the Inception model expects a 3-channel image. One solution is to triple the single MNIST channel and modify the generator to output in the same 3-channel space as well, but then you're no longer generating the original MNIST images. So yes, you're better off training a different classifier on the original data and using the statistics of one of its last layers, or training an autoencoder and using the bottleneck activations. The resulting FID, however, depends on this specific classifier/autoencoder and is not directly comparable with FIDs computed from other classifiers/autoencoders.
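Once you have a feature extractor (classifier layer or autoencoder bottleneck), the FID itself is independent of which network produced the activations. A minimal sketch of the Fréchet distance between two sets of activations (shapes, sample sizes, and feature dimension here are illustrative):

```python
import numpy as np
from scipy import linalg

def fid_from_activations(act1, act2):
    # act1, act2: (n_samples, n_features) activations from the chosen
    # layer of your MNIST classifier or autoencoder bottleneck
    mu1, mu2 = act1.mean(axis=0), act2.mean(axis=0)
    sigma1 = np.cov(act1, rowvar=False)
    sigma2 = np.cov(act2, rowvar=False)
    diff = mu1 - mu2
    # matrix square root of the covariance product; may pick up a tiny
    # imaginary component from numerical error, which we discard
    covmean, _ = linalg.sqrtm(sigma1 @ sigma2, disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

rng = np.random.default_rng(0)
real = rng.normal(size=(500, 16))              # stand-in "real" features
fake = rng.normal(loc=0.5, size=(500, 16))     # stand-in "generated" features
print(fid_from_activations(real, real))        # identical sets -> ~0
print(fid_from_activations(real, fake))        # mean shift -> clearly > 0
```

A quick sanity check is that the FID of a feature set against itself is (numerically) zero, while any mean or covariance mismatch pushes it up.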

@NagabhushanSN95
Author

Hi, thank you for the clarification. Can you explain the intuition behind selecting bottleneck activations from an autoencoder?

@mhex
Collaborator

mhex commented Aug 1, 2019

The bottleneck activations are lower-dimensional representations of the input data. The hope is that the bottleneck encodes most of the information of the input in a lower dimension, making it a good candidate for computing the FID statistics (mean and covariance).
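To make this concrete, here is a toy linear autoencoder in NumPy, just to illustrate "train to reconstruct, then keep the bottleneck as features." All dimensions, the synthetic data, and the learning rate are made up for illustration; a real MNIST setup would use a convolutional autoencoder:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy "images": 64-dim vectors whose structure lives in a 4-dim subspace
basis = rng.normal(size=(4, 64)) / 8.0
data = rng.normal(size=(1000, 4)) @ basis

d_in, d_bottleneck = 64, 8          # bottleneck is much narrower than input
W_enc = rng.normal(scale=0.1, size=(d_in, d_bottleneck))
W_dec = rng.normal(scale=0.1, size=(d_bottleneck, d_in))

def recon_loss(X, We, Wd):
    # mean squared reconstruction error of the autoencoder
    return float(((X @ We @ Wd - X) ** 2).mean())

init_loss = recon_loss(data, W_enc, W_dec)
lr = 0.1
for step in range(300):
    z = data @ W_enc                       # bottleneck activations
    err = z @ W_dec - data                 # reconstruction error
    g_dec = z.T @ err / len(data)          # gradient w.r.t. W_dec (up to a constant)
    g_enc = data.T @ (err @ W_dec.T) / len(data)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

final_loss = recon_loss(data, W_enc, W_dec)
# these activations are what you'd feed into the FID statistics
bottleneck = data @ W_enc
print(init_loss, final_loss, bottleneck.shape)
```

If reconstruction succeeds through the narrow bottleneck, the bottleneck must have kept most of the information in the data, which is exactly why its mean and covariance are sensible FID statistics.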

@NagabhushanSN95
Copy link
Author

Oh yeah, that's cool. I'll try that. Thanks :)
Closing the issue.
