
About gan loss #32

Closed
zhangqianhui opened this issue Sep 15, 2020 · 1 comment

Comments

@zhangqianhui

Hello @taesungp @junyanz

self.loss = nn.BCEWithLogitsLoss()

I see you use both the vanilla GAN loss and the non-saturating loss with softplus.
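To make the comparison concrete, here is a minimal sketch of the two modes as I read them (the function and argument names are illustrative, not necessarily your exact code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def gan_loss(prediction, target_is_real, gan_mode='vanilla'):
    """Illustrative sketch of the two GAN loss modes being compared."""
    if gan_mode == 'vanilla':
        # BCEWithLogitsLoss against an all-ones (real) / all-zeros (fake) target
        target = torch.ones_like(prediction) if target_is_real else torch.zeros_like(prediction)
        return nn.BCEWithLogitsLoss()(prediction, target)
    elif gan_mode == 'nonsaturating':
        # softplus form: softplus(-x) = -log(sigmoid(x)), softplus(x) = -log(1 - sigmoid(x))
        return F.softplus(-prediction).mean() if target_is_real else F.softplus(prediction).mean()
    raise ValueError(gan_mode)
```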

When you use BCEWithLogitsLoss:

For the discriminator (logits1 on real samples, logits2 on fake samples):

L_{r} = BCE(logits1, 1) + BCE(logits2, 0)
= -(1 * log(sigmoid(logits1)) + 0 * log(1 - sigmoid(logits1)) + 0 * log(sigmoid(logits2)) + 1 * log(1 - sigmoid(logits2)))
= -(log(sigmoid(logits1)) + log(1 - sigmoid(logits2)))

For the generator (fake samples, labeled as real):

L_{f} = BCE(logits2, 1)
= -log(sigmoid(logits2))

Obviously, this is exactly the non-saturating loss.
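A quick numerical check of this equivalence (my own snippet, using the identities BCE(x, 1) = softplus(-x) and BCE(x, 0) = softplus(x)):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8)

# 'vanilla' mode: BCEWithLogits against constant targets
bce_real = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
bce_fake = F.binary_cross_entropy_with_logits(logits, torch.zeros_like(logits))

# 'nonsaturating' mode: softplus form of the same terms
assert torch.allclose(bce_real, F.softplus(-logits).mean())  # -log(sigmoid(x))
assert torch.allclose(bce_fake, F.softplus(logits).mean())   # -log(1 - sigmoid(x))
```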

Thus, I don't understand your definition of the vanilla GAN loss. Why do you implement the non-saturating GAN loss with two different functions?
Maybe I misunderstand.

@taesungp
Owner

taesungp commented Oct 4, 2020

You are absolutely right about that. It was merely an artifact of combining the old CycleGAN code with the new pieces of code. Sorry for the confusion.

taesungp closed this as completed Oct 4, 2020