Conditional WGAN #16
Have you tried passing your noise through a dense layer and then concatenating its output with the labels?
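The suggestion above can be sketched roughly as follows (a NumPy stand-in for whichever framework is in use; all names and sizes are illustrative, not from anyone's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
batch, z_dim, n_classes, h_dim = 4, 100, 10, 128

W = rng.standard_normal((z_dim, h_dim)) * 0.02   # hypothetical dense-layer weights
z = rng.standard_normal((batch, z_dim))          # generator noise
labels = np.eye(n_classes)[rng.integers(0, n_classes, size=batch)]  # one-hot conditions

h = np.tanh(z @ W)                               # noise passed through a dense layer
gen_input = np.concatenate([h, labels], axis=1)  # concat with the labels, (batch, h_dim + n_classes)
# the rest of the generator then consumes gen_input
```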
@rafaelvalle Yes, I've tried combining CGAN or ACGAN, but it never worked.
@aosokin Did you try providing both the critic and generator with the condition?
@rafaelvalle There should be some theoretical reasoning behind this. I would assume that adding the conditioning vector once to the data should be sufficient; I believe that is the case for vanilla GANs.
@rafaelvalle Yes, I've tried several things along the lines of the ACGAN paper, but could never make it work.
@aosokin Did you try looking at the gradients close to your noise and condition vector?
Towards convergence most of the parameters are clipped, so the gradients all look alike.
yes, to [-1, 1]
I've tried 6 classes on some images. I should probably try it on MNIST, but did not have time. I'm curious if there is any success story on conditioning WGAN on anything.
@aosokin I was successful conditioning WGAN on ~70 classes with chromatic images.
Could you share a code snippet on what worked?
But that's image conditioned on image, which is even harder. For me, even conditioning on labels did not work. Maybe I just did something wrong.
@aosokin https://github.com/rafaelvalle/neural_network_control_improvisation/blob/master/wcgan_text.py
@rafaelvalle Great, thanks a lot! I'll try this version out.
@aosokin @martinarjovsky Any suggestions on how to combine the WGAN loss with a classification loss, say categorical crossentropy?
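One ACGAN-style way to combine the two losses is simply a weighted sum of the Wasserstein term and the auxiliary classification term. A minimal NumPy sketch, assuming the critic also emits class logits and `lam` is a hypothetical weighting factor:

```python
import numpy as np

def wgan_critic_loss(d_real, d_fake):
    # critic maximizes E[D(real)] - E[D(fake)]; we minimize the negation
    return -(d_real.mean() - d_fake.mean())

def categorical_crossentropy(logits, onehot):
    # numerically stable log-softmax followed by the usual NLL
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(onehot * log_probs).sum(axis=1).mean()

def combined_critic_loss(d_real, d_fake, class_logits, onehot, lam=1.0):
    # Wasserstein term plus weighted auxiliary classification term
    return wgan_critic_loss(d_real, d_fake) + lam * categorical_crossentropy(class_logits, onehot)
```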
@rafaelvalle Hi, I've tried some things along the lines of your code, but it still does not work; I observe the same result: the conditioning is ignored at the end of the day.
This seems to do a very different thing.
Yes, I've tried that, but it did not change the outcome at all.
@aosokin Did you try conditioning à la WaveNet, z = tanh(W_k * x + V_k c), where * is convolution and W_k and V_k are the weights at layer k?
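That conditioning scheme can be sketched as below; for brevity the convolution W_k * x is replaced with a 1x1 convolution (a per-position matrix multiply), and all shapes are illustrative:

```python
import numpy as np

def conditioned_layer(x, c, W, V):
    # x: (batch, time, channels_in), c: (batch, cond_dim)
    # z = tanh(W_k * x + V_k c), with the condition broadcast over time
    return np.tanh(x @ W + (c @ V)[:, None, :])

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 16, 8))    # feature maps at layer k
c = rng.standard_normal((2, 10))       # condition vector (e.g. a label embedding)
W = rng.standard_normal((8, 32)) * 0.1
V = rng.standard_normal((10, 32)) * 0.1
z = conditioned_layer(x, c, W, V)
```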
@rafaelvalle Hi, trying to view your first link above, but it doesn't seem to be working. Could you re-post?
Hi @aosokin, I'm trying something similar with conditional WGAN and observed similar behaviour: either the conditioning is completely ignored or the network doesn't converge. I added a second loss to the generator as well (an L2 loss) to 'guide' the results in the correct direction. This setup is pretty sensitive to lambda, the multiplier on the GAN loss. Training now appears to be headed in the right direction, but it is still early days.
I've successfully trained a conditional WGAN as well. But I am using the WGAN-GP version. Maybe that's the difference?
@appierys Hi, thanks for the notice. Could you share some code snippets on how you did this?
I've spent some time implementing WACGAN-GP for MNIST this morning. The results seem to be acceptable. The sample Jupyter notebook is as follows: WACGAN-GP-MNIST
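For reference, the gradient-penalty term that distinguishes WGAN-GP from the weight-clipped version can be sketched like this. A real implementation obtains dD/dx_hat from autodiff; here a toy linear critic D(x) = x @ w is used purely for illustration, since its input gradient is known in closed form (it is just w):

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_penalty(real, fake, critic_grad_fn, lam=10.0):
    eps = rng.uniform(size=(real.shape[0], 1))    # per-sample mixing coefficient
    x_hat = eps * real + (1.0 - eps) * fake       # interpolate between real and fake
    grads = critic_grad_fn(x_hat)                 # dD/dx_hat, shape (batch, dim)
    norms = np.linalg.norm(grads, axis=1)
    return lam * ((norms - 1.0) ** 2).mean()      # push gradient norms toward 1

w = rng.standard_normal(8)                        # toy linear critic D(x) = x @ w
grad_fn = lambda x_hat: np.tile(w, (x_hat.shape[0], 1))
real = rng.standard_normal((4, 8))
fake = rng.standard_normal((4, 8))
gp = gradient_penalty(real, fake, grad_fn)
```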
@appierys Hi,Could you share some code to show how you do it |
@weihancug Hi, I think the notebook provided above already shows a working conditional WGAN ? |
I've found other implementations of cWGAN. It looks like WGAN is OK with conditional settings, since many people could reproduce the results.
Here's an implementation of conditional WGAN-GP. https://github.com/kongyanye/cwgan-gp |
Hi, have you tried to apply WGAN for conditional image generation?
Say, in the simplest scenario of conditioning on the class label.
I'm trying to do that, but observe some weird behavior:
Any suggestions?