
About the optimizer and scheduler #5

Closed

Pay20Y opened this issue Jan 21, 2022 · 4 comments

Comments

Pay20Y commented Jan 21, 2022

There seems to be no code that calls the function "CustomCosineAnnealingWarmupRestarts". I wonder, are the optimizer and scheduler you used AdamW and Cosine Annealing?
Thanks!

YongWookHa (Owner) commented

The CustomCosineAnnealingWarmupRestarts code is in utils.py:

class CustomCosineAnnealingWarmupRestarts(_LRScheduler):

Have a look at it, plz :)
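
For reference, a scheduler like this typically subclasses PyTorch's _LRScheduler and computes the learning rate from the current step count. Below is a minimal, simplified sketch of the general idea (linear warmup followed by a single cosine decay, without the restart cycles); the class name, constructor arguments, and formula are illustrative assumptions, not the actual code from utils.py.

```python
import math
from torch.optim.lr_scheduler import _LRScheduler

class CosineAnnealingWarmupSketch(_LRScheduler):
    """Illustrative only: linear warmup, then one cosine decay.

    The real CustomCosineAnnealingWarmupRestarts in utils.py also
    restarts the cosine cycle; this sketch omits that for brevity.
    """

    def __init__(self, optimizer, warmup_steps, total_steps,
                 min_lr=0.0, last_epoch=-1):
        self.warmup_steps = warmup_steps  # steps spent ramping up
        self.total_steps = total_steps    # total steps in the schedule
        self.min_lr = min_lr              # floor for the learning rate
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        step = self.last_epoch
        if step < self.warmup_steps:
            # Linear warmup from min_lr up to the base learning rate.
            scale = step / max(1, self.warmup_steps)
        else:
            # Cosine decay from the base learning rate back to min_lr.
            progress = (step - self.warmup_steps) / max(
                1, self.total_steps - self.warmup_steps)
            scale = 0.5 * (1.0 + math.cos(math.pi * min(progress, 1.0)))
        return [self.min_lr + (base_lr - self.min_lr) * scale
                for base_lr in self.base_lrs]
```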

Pay20Y commented Jan 21, 2022

Thanks for your quick reply. My question is that there seems to be no code that uses the function "CustomCosineAnnealingWarmupRestarts". Is there some special usage in PyTorch Lightning?

YongWookHa (Owner) commented

Oh, I guess I misunderstood your question.

scheduler = getattr(utils, self.cfg.scheduler)

Here you can find the usage of CustomCosineAnnealingWarmupRestarts.

Which scheduler is actually used depends on what you write in the YAML settings file:

scheduler: "CustomCosineAnnealingWarmupRestarts"

As you can check, default.yaml says it will use CustomCosineAnnealingWarmupRestarts.
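
For context, a PyTorch Lightning module typically wires this up in configure_optimizers. The sketch below shows how the getattr line above can be used there; the cfg fields (cfg.lr) and the scheduler's constructor arguments are assumptions for illustration, not the repository's exact code.

```python
import torch
import pytorch_lightning as pl

import utils  # repository module defining CustomCosineAnnealingWarmupRestarts

class LitModel(pl.LightningModule):
    def __init__(self, cfg):
        super().__init__()
        self.cfg = cfg

    def configure_optimizers(self):
        # cfg.lr is an assumed config field for illustration.
        optimizer = torch.optim.AdamW(self.parameters(), lr=self.cfg.lr)

        # Resolve the scheduler class by the name given in the YAML file,
        # e.g. scheduler: "CustomCosineAnnealingWarmupRestarts".
        scheduler_cls = getattr(utils, self.cfg.scheduler)
        scheduler = scheduler_cls(optimizer)  # real keyword args come from cfg

        # Lightning steps the scheduler once per optimizer step.
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```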

[Image: plot of the learning-rate curve over training steps]

This image shows how the learning rate moves with CustomCosineAnnealingWarmupRestarts.
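
If you want to reproduce such a curve yourself, you can step a dummy optimizer and record the learning rate at each step. This uses the hypothetical sketch class from the earlier comment, not the repository's actual scheduler.

```python
import torch
import matplotlib.pyplot as plt

# Dummy parameter/optimizer pair just to drive the scheduler.
params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = torch.optim.AdamW(params, lr=1e-3)
scheduler = CosineAnnealingWarmupSketch(optimizer, warmup_steps=100,
                                        total_steps=1000)

lrs = []
for _ in range(1000):
    lrs.append(optimizer.param_groups[0]["lr"])  # record current lr
    optimizer.step()   # no-op here (no gradients), kept for correct ordering
    scheduler.step()   # advance the schedule by one step

plt.plot(lrs)
plt.xlabel("step")
plt.ylabel("learning rate")
plt.title("warmup then cosine decay (sketch)")
plt.show()
```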

I wrote a tech post about it on my blog. It's in Korean, but you can use browser translation in Chrome or Edge :)

Hope this helps you.

Pay20Y commented Jan 21, 2022

I got it, thanks again!

Pay20Y closed this as completed Jan 21, 2022