
Code for Minibatch Discrimination on MNIST Missing #4

Open
wenyangfu opened this issue Aug 5, 2016 · 6 comments
@wenyangfu

Hi, I've been trying to reproduce a minibatch discrimination GAN for MNIST based on the paper, but I keep getting poor results. Would it be possible for "train_mnist_minibatch_discrimination.py" to be uploaded to the repo? I assume it exists, since MNIST digits generated via minibatch discrimination were shown in the paper. Thanks for your time!

@xunhuang1995

+1

@djsutherland

👍 We're trying to compare against your results, and it'd be really helpful to be sure we didn't get something wrong when plugging together train_cifar_minibatch_discrimination and train_mnist_feature_matching ourselves.

@djsutherland

@wenyangfu @xunhuang1995: I talked to the authors about this offline, and they told me they just used the same model as for CIFAR/SVHN for the MNIST minibatch-discrimination results. My fork has a train_mnist_minibatch_discrimination.py that implements that (basically just loading the MNIST data instead of CIFAR).

@rafaelvalle

rafaelvalle commented Feb 12, 2017

@AilsaF

AilsaF commented May 15, 2017

Hi @dougalsutherland, just wondering: does this mean minibatch discrimination only works with a convolutional discriminator rather than a dense-layer one?

@djsutherland

@AilsaF I don't know if it wouldn't work with dense layers, just that they used the convolutional one. You could try it. :)
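For reference, here is a minimal NumPy sketch of the minibatch-discrimination features described in the paper (not the authors' code). The layer only consumes a flat per-sample feature vector, so it is agnostic to whether the preceding layers are convolutional or dense. The sizes A, B, C below are illustrative assumptions, not values from the repo.

```python
import numpy as np

def minibatch_features(f, T):
    """f: (N, A) per-sample features from the discriminator; T: (A, B, C) learned tensor."""
    N, A = f.shape
    assert T.shape[0] == A
    B, C = T.shape[1], T.shape[2]
    # Project each feature vector through T to get a (B, C) matrix per sample.
    M = f.dot(T.reshape(A, B * C)).reshape(N, B, C)
    # L1 distance between rows of M for every pair of samples: shape (N, N, B).
    dists = np.abs(M[:, None, :, :] - M[None, :, :, :]).sum(axis=3)
    # Negative exponential, summed over the other samples in the minibatch.
    # Subtracting 1 drops the self-comparison term exp(0); whether to keep it
    # is a minor implementation choice.
    o = np.exp(-dists).sum(axis=1) - 1.0           # shape (N, B)
    # Concatenate the minibatch statistics onto the original features.
    return np.concatenate([f, o], axis=1)           # shape (N, A + B)

# Usage sketch with made-up sizes (A=128 features, B=50 kernels, C=5 dims):
rng = np.random.RandomState(0)
f = rng.randn(16, 128)
T = 0.1 * rng.randn(128, 50, 5)
out = minibatch_features(f, T)   # (16, 178)
```

Either way, the extra B statistics are appended to whatever features the discriminator already produces, so the choice of conv vs. dense layers upstream shouldn't matter in principle.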
