G is always much smaller than D #47

Closed
kristosh opened this issue Sep 10, 2018 · 0 comments

I am training a generative adversarial network to perform style transfer between two image domains (source and target). Since I have class information available, I use an extra Q network (in addition to G and D) that measures the classification results of the generated target-domain images against their labels. Watching the system converge, I have noticed that the error of D starts at 8 and drops slowly to 4.5, while the error of G starts at 1 and quickly drops to 0.2. Is that behaviour an example of mode collapse? What exactly is the relationship between the errors of D and G? The loss functions of D and G I am using can be found here: https://github.com/r0nn13/conditional-dcgan-keras, while the loss function of the Q network is categorical cross-entropy. The loss curves can be seen here: https://imgur.com/a/bDrTcpm
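
For context, below is a minimal sketch of how such a three-loss setup is commonly wired in Keras. It is not the code from the linked repository: it assumes standard DCGAN binary cross-entropy for the D head and categorical cross-entropy for the Q head, and all layer sizes and names (`d_out`, `q_out`, the 28×28×1 input) are illustrative placeholders.

```python
from tensorflow.keras import layers, models, optimizers

latent_dim, num_classes, img_shape = 100, 10, (28, 28, 1)

# Generator: latent vector -> image in [-1, 1] (tanh output).
g_in = layers.Input(shape=(latent_dim,))
x = layers.Dense(7 * 7 * 64, activation="relu")(g_in)
x = layers.Reshape((7, 7, 64))(x)
x = layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu")(x)
g_out = layers.Conv2DTranspose(1, 4, strides=2, padding="same", activation="tanh")(x)
G = models.Model(g_in, g_out, name="G")

# Shared discriminator trunk with two heads:
#   d_out -> real/fake probability (binary cross-entropy)
#   q_out -> class probabilities   (categorical cross-entropy, the Q loss)
d_in = layers.Input(shape=img_shape)
x = layers.Conv2D(32, 4, strides=2, padding="same", activation="relu")(d_in)
x = layers.Conv2D(64, 4, strides=2, padding="same", activation="relu")(x)
x = layers.Flatten()(x)
d_head = layers.Dense(1, activation="sigmoid", name="d_out")(x)
q_head = layers.Dense(num_classes, activation="softmax", name="q_out")(x)
DQ = models.Model(d_in, [d_head, q_head], name="DQ")
# Keras reports this model's total loss as the sum of both head losses.
DQ.compile(optimizer=optimizers.Adam(2e-4),
           loss={"d_out": "binary_crossentropy",
                 "q_out": "categorical_crossentropy"})

# Combined model used to train G: the D/Q weights are frozen in this pass.
DQ.trainable = False
combined = models.Model(g_in, DQ(G(g_in)), name="combined")
combined.compile(optimizer=optimizers.Adam(2e-4),
                 loss=["binary_crossentropy", "categorical_crossentropy"])
```

One thing worth noting with this kind of wiring: if the D-side number being tracked is the summed multi-head loss while the G-side number comes from the generator pass alone, the two are on different scales, so their raw magnitudes are not directly comparable.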
