
Learning rate #6

Closed · mengjingyouling opened this issue Sep 28, 2022 · 2 comments
@mengjingyouling

Hi, very solid work. I have a question:

Can the learning rate of the discriminator actually be updated? In class DomainDiscriminator(nn.Sequential), the base learning rate is fixed to "lr": 1.:

def get_parameters(self) -> List[Dict]:
    return [{"params": self.parameters(), "lr": 1.}]

and the scheduler is created as:

lr_scheduler_ad = LambdaLR(
    ad_optimizer, lambda x: args.lr * (1. + args.lr_gamma * float(x)) ** (-args.lr_decay))

Does the scheduler actually update the discriminator's learning rate?

We ask because, in our experiments, the discriminator loss stayed flat after only a few epochs, even though the task loss was still decreasing.

We look forward to your reply. Thank you!

@rangwani-harsh (Contributor)

Yes, the lr can be changed. However, we experimented only a little with the effect of the lr on the discriminator. Also, we find that the discriminator loss should oscillate for adversarial learning to achieve good performance.
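As a minimal sketch of why the learning rate does get updated (this is not the repository's exact training script; the Linear stand-in and the hyperparameter values below are illustrative): PyTorch's LambdaLR multiplies each param group's base lr by the lambda's output at every step, so with the base lr fixed to 1. in get_parameters(), the effective lr is exactly args.lr * (1 + args.lr_gamma * x) ** (-args.lr_decay), which decays as training proceeds:

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import LambdaLR

# Stand-in for DomainDiscriminator: any module with parameters works here.
discriminator = torch.nn.Linear(256, 1)

# As in get_parameters(), the param group's base lr is 1.0.
ad_optimizer = SGD(discriminator.parameters(), lr=1., momentum=0.9)

# Illustrative values, not the paper's settings.
lr, lr_gamma, lr_decay = 0.01, 0.001, 0.75

# LambdaLR sets each group's lr to base_lr * lambda(step); with base_lr = 1.0
# the effective lr is exactly lr * (1 + lr_gamma * step) ** (-lr_decay).
lr_scheduler_ad = LambdaLR(
    ad_optimizer, lambda x: lr * (1. + lr_gamma * float(x)) ** (-lr_decay))

for step in range(3):
    ad_optimizer.step()       # parameter update uses the current effective lr
    lr_scheduler_ad.step()    # advances x, shrinking the lr
    print(step, ad_optimizer.param_groups[0]["lr"])  # strictly decreasing

Printing param_groups[0]["lr"] shows a value starting at lr (0.01 here) and shrinking every step, confirming the scheduler does update the discriminator's learning rate despite the fixed base of 1.0.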

I'm closing this issue for now; please re-open if you have further queries.

Thanks
Harsh
