
The labels of valid and fake in wgan #136

Closed
Pandarenlql opened this issue Mar 27, 2019 · 6 comments

@Pandarenlql

Pandarenlql commented Mar 27, 2019

Hello, I have a question about the valid and fake labels. Normally we fill the valid array with 1s and the fake array with 0s, so I don't understand why, in the WGAN example, the author sets valid to -1 and fake to 1.
I would appreciate help from anyone.
```python
# Adversarial ground truths
valid = -np.ones((batch_size, 1))
fake = np.ones((batch_size, 1))
```

@MmDawN

MmDawN commented Apr 8, 2019

Hello, have you solved this yet?

@MmDawN

MmDawN commented Apr 8, 2019

I found the official Keras implementation of WGAN-GP in the keras-contrib repository (https://github.com/keras-team/keras-contrib/blob/master/examples/improved_wgan.py#L300), and there they set valid to 1 and fake to -1. I think that might be the correct way to set the labels.

@Pandarenlql
Author

Thanks for your answer; I have figured it out. A few days ago I found an article explaining why they set valid to 1 and fake to -1, but I can't find the link anymore.
In my understanding, if we set valid to 1 and fake to -1 (or vice versa), then the loss K.mean(y_true * y_pred) directly yields the Wasserstein distance estimate. With these labels, only multiplications and additions are needed to compute it.
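The label trick can be sketched in plain NumPy. The critic scores below are made-up numbers, and `wasserstein_loss` just mirrors the `K.mean(y_true * y_pred)` loss discussed above, using this repo's convention of -1 for real and +1 for fake:

```python
import numpy as np

def wasserstein_loss(y_true, y_pred):
    # With y_true = -1 for real and +1 for fake samples, this mean
    # equals -(E[D(real)] - E[D(fake)]), i.e. the negative of the
    # critic's Wasserstein estimate.
    return np.mean(y_true * y_pred)

# Hypothetical critic scores for a batch of 4 samples
d_real = np.array([0.9, 1.1, 0.8, 1.2])   # D(x), x ~ Pr
d_fake = np.array([0.1, -0.2, 0.0, 0.3])  # D(G(z)), z ~ noise

valid = -np.ones_like(d_real)  # label -1 for real, as in this repo
fake = np.ones_like(d_fake)    # label +1 for generated

loss_real = wasserstein_loss(valid, d_real)  # -E[D(real)]  = -1.0
loss_fake = wasserstein_loss(fake, d_fake)   # +E[D(fake)]  = +0.05

# Minimizing loss_real + loss_fake maximizes E[D(real)] - E[D(fake)]
print(loss_real + loss_fake)  # -0.95, i.e. -(1.0 - 0.05)
```

No sigmoid or cross-entropy is involved, which is why the sum reduces to simple multiplications and additions.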

@adaxidedakaonang

Hello, I'm a bit confused about the critic's valid label being set to ±1. I guess this is so the loss yields the W-distance directly, instead of the critic outputting a probability of real data between 0 and 1. Is that right?

@Pandarenlql
Author

Sorry for ignoring your question; I was busy with a competition these past days, so I had no time to answer.
In my understanding, setting valid to -1 and fake to 1 gives a negative sign to x ~ Pr and a positive sign to x ~ Pg, so that

```python
d_loss_real = self.critic.train_on_batch(imgs, valid)
d_loss_fake = self.critic.train_on_batch(gen_imgs, fake)

d_loss = 0.5 * np.add(d_loss_fake, d_loss_real)
```

yields -W(Pr, Pg); by minimizing -W(Pr, Pg) we therefore maximize W(Pr, Pg).
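To see that descending on -W(Pr, Pg) ascends W(Pr, Pg), here is a toy sketch with a hypothetical 1-D linear critic D(x) = w * x. Everything below (the distributions, the clip range, the learning rate) is illustrative, not taken from the repo:

```python
import numpy as np

# Samples from two made-up 1-D distributions
rng = np.random.default_rng(0)
real = rng.normal(2.0, 0.5, 256)   # x ~ Pr
fake = rng.normal(0.0, 0.5, 256)   # x ~ Pg
w = 0.0                            # critic parameter, D(x) = w * x

def neg_w_estimate(w):
    # The loss being minimized: -(E[D(real)] - E[D(fake)])
    return -(np.mean(w * real) - np.mean(w * fake))

before = -neg_w_estimate(w)        # W estimate before training (0.0)
for _ in range(10):
    grad = -(np.mean(real) - np.mean(fake))  # d(loss)/dw
    w -= 0.1 * grad                          # gradient descent on -W
    w = np.clip(w, -1.0, 1.0)                # crude Lipschitz-style clip
after = -neg_w_estimate(w)         # W estimate after training

# after > before: each descent step on -W increased W
```

With weight clipping standing in for the Lipschitz constraint, the critic's score gap between real and fake samples grows as the loss decreases, which is exactly the maximization the thread describes.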

@adaxidedakaonang

Thank you~
