
Possible small bug: the printed learning rate lr is not the live value #25

Closed
dcsaf opened this issue Dec 16, 2019 · 1 comment

Comments

@dcsaf

dcsaf commented Dec 16, 2019

print('learning rate:', optimizer.param_groups[0]['lr']) is executed before
lr = adjust_learning_rate(optimizer, 0.1, epoch, ni, nb), so the printed lr is not the current value but the learning rate left over from the previous update.

@tanluren
Owner

Good catch. That line was written back when a scheduler was used; after switching to warmup I forgot to update it. Fixed now, thanks!
