
About Learning Rate Scheduler #19

Closed
1049451037 opened this issue Jan 17, 2022 · 2 comments
Labels
question Further information is requested

Comments

@1049451037

❔Question

Why is the learning rate scheduler stepped after each epoch instead of after each batch in main.py?

Won't the lr then change too slowly? (It also seems unstable across datasets of different sizes.)

@1049451037 added the question label on Jan 17, 2022
@Yuxin-CV
Member

Hi, Qingsong. Thanks for this issue.

To my knowledge, in image recognition the lr is usually stepped per epoch when using a cosine lr scheduler, e.g., in the widely used timm library.

In NLP, it seems the lr scheduler is stepped after each iteration/batch, e.g., in the BEiT repo. The same is true for semantic segmentation in vision.

I agree with you that stepping by iteration is more reasonable than stepping by epoch, and it should yield results no worse than stepping by epoch.
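To make the contrast concrete, here is a minimal sketch in plain PyTorch (not this repo's main.py; the model, optimizer, and data names are assumptions). With T_max counted in total iterations and scheduler.step() called per batch, the cosine decay is smooth and independent of dataset size; with T_max counted in epochs and a per-epoch step, the lr only changes once per epoch.

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative model, optimizer, and data (assumptions, not the repo's setup).
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

epochs, iters_per_epoch = 10, 100
data = [torch.randn(4, 10) for _ in range(iters_per_epoch)]

# Per-epoch stepping: T_max counts epochs; the lr stays constant within an epoch.
# scheduler = CosineAnnealingLR(optimizer, T_max=epochs)

# Per-iteration stepping: T_max counts total iterations, so the lr decays
# smoothly and the schedule does not depend on the dataset size.
scheduler = CosineAnnealingLR(optimizer, T_max=epochs * iters_per_epoch)

for epoch in range(epochs):
    for x in data:
        loss = model(x).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        scheduler.step()  # step per batch; move outside the inner loop for per-epoch stepping
```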

@1049451037
Author

Got it. Thank you for your reply!
