What is the intended learning rate schedule? #16

Closed
ChanghwaPark opened this issue Apr 2, 2021 · 1 comment

@ChanghwaPark

```python
def adjust_learning_rate(optimizer, epoch, args):
    epoch = epoch + 1
    if epoch <= 5:
        lr = args.lr * epoch / 5
    elif epoch > 160:
        lr = args.lr * 0.01
    elif epoch > 180:
        lr = args.lr * 0.0001
    else:
        lr = args.lr
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
```

Hi, thanks for sharing your code!

I have a question about the code quoted above.
In the `adjust_learning_rate` function, the `elif epoch > 180:` branch can never be reached, because any epoch greater than 180 already satisfies `epoch > 160` and is caught by the earlier branch, so `args.lr * 0.0001` is never applied.
Could you tell me which learning rate schedule you used for the experiments in the paper?

As written, the `adjust_learning_rate` function produces the following schedule:

| epoch | lr |
| --- | --- |
| 0 | args.lr * 1/5 |
| 1 | args.lr * 2/5 |
| 2 | args.lr * 3/5 |
| 3 | args.lr * 4/5 |
| 4 | args.lr * 5/5 |
| 5–160 | args.lr |
| 161+ | args.lr * 0.01 |

@YyzHarry
Owner

YyzHarry commented Apr 3, 2021

Hi - thanks for your interest! This was a typo introduced when I cleaned up the code (it has been updated). For the experiments on CIFAR/SVHN, the lr is decayed by a factor of 0.01 at both the 160th and the 180th epoch.
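
Concretely, the schedule described in this reply can be obtained by checking the later milestone first, so that both decay steps are reachable. A minimal sketch of the corrected function, assuming the update simply reorders the branches:

```python
def adjust_learning_rate(optimizer, epoch, args):
    """Linear warmup for 5 epochs, then decay lr by 100x at epoch 160 and again at epoch 180."""
    epoch = epoch + 1
    if epoch <= 5:
        # linear warmup over the first 5 epochs
        lr = args.lr * epoch / 5
    elif epoch > 180:
        # check the later milestone first so this branch is reachable
        lr = args.lr * 0.0001
    elif epoch > 160:
        lr = args.lr * 0.01
    else:
        lr = args.lr
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
```

With this ordering, epochs 161–180 use `args.lr * 0.01` and epochs 181 onward use `args.lr * 0.0001`, matching the reply above.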
