
How to solve the issue that the discriminator loss drops too slowly? #14

Closed
wangxiao5791509 opened this issue Mar 31, 2017 · 2 comments

Comments

@wangxiao5791509

Dear authors:
The following screenshot is from my training log. I think the discriminator loss is decreasing too slowly; how can I speed it up? I first trained the network with the BCE loss, then loaded the pretrained model and continued training with the adversarial loss. Did I make any mistakes? Looking forward to your reply. Thank you very much!

(screenshot of training log, 2017-03-31 22:17)

@junting
Collaborator

junting commented Mar 31, 2017

Hello @wangxiao5791509 ,
You can try setting the weighting hyperparameter alpha to a smaller value. However, it is normal for the adversarial loss to drop slowly; this keeps the adversarial training stable.
```python
class ModelSALGAN(Model):
    def __init__(self, w, h, batch_size=32, G_lr=3e-4, D_lr=3e-4, alpha=1/20.):
```
Best,
Junting
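To illustrate how alpha trades off the two terms, here is a minimal sketch of a SalGAN-style generator loss, assuming the generator loss is the alpha-weighted pixel-wise BCE content term plus the adversarial term (the function names and NumPy implementation are illustrative, not the repository's actual code):

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    # Pixel-wise binary cross-entropy between predicted and ground-truth saliency maps.
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def generator_loss(pred_map, gt_map, d_prob_on_fake, alpha=1/20.):
    # Content term (BCE) weighted by alpha, plus the adversarial term that
    # rewards fooling the discriminator (d_prob_on_fake = D's probability
    # that the generated map is real).
    adv = -np.log(np.clip(d_prob_on_fake, 1e-7, 1.0))
    return alpha * bce(pred_map, gt_map) + float(np.mean(adv))
```

With this form, lowering alpha shrinks the content term's contribution, so the gradient that reaches the discriminator game is relatively stronger; it does not change the adversarial term itself.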

@wangxiao5791509
Author

@junting Thanks for your suggestions.

@junting junting closed this as completed Oct 22, 2017