
How do I adjust the learning rate? Setting it in the yml file has no effect; it stays at 0.0005 #72

Closed
gekie opened this issue May 20, 2020 · 3 comments

Comments

@gekie

gekie commented May 20, 2020

If the learning rate is supposed to be adjusted dynamically, I have been training for a full day and night and the LR is still 0.0005; I don't see any change in the log.

@dyning
Collaborator

dyning commented May 20, 2020

Adam is an adaptive optimization algorithm: the learning rate is adjusted internally per parameter, and what the log shows is only the base learning rate. To switch to SGD with momentum and a stepped schedule instead, you can do the following:
(1) Add the optimizer definition to https://github.com/PaddlePaddle/PaddleOCR/blob/develop/ppocr/optimizer.py:

import paddle.fluid as fluid


def PiecewiseDecay(params):
    base_lr = params['base_lr']
    gamma = params['gamma']
    steps = params['steps']
    momentum_rate = params['momentum_rate']
    L2_decay_weight = params['L2_decay_weight']

    # Decay the learning rate by a factor of `gamma` at each boundary step.
    bd = steps
    lr = [base_lr * (gamma ** i) for i in range(len(steps) + 1)]
    learning_rate = fluid.layers.piecewise_decay(boundaries=bd, values=lr)

    optimizer = fluid.optimizer.Momentum(
        learning_rate=learning_rate,
        momentum=momentum_rate,
        regularization=fluid.regularizer.L2Decay(L2_decay_weight))
    return optimizer
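
As a quick illustration (this snippet is only an example, not part of PaddleOCR itself), calling this factory with the values used in step (2) below gives the following schedule:

# Illustration only: build the optimizer with the yml values from step (2).
params = {
    'base_lr': 0.001,
    'gamma': 0.1,
    'steps': [300000],
    'momentum_rate': 0.9,
    'L2_decay_weight': 0.0004,
}
optimizer = PiecewiseDecay(params)
# The learning rate stays at 0.001 for the first 300000 iterations,
# then drops to 0.001 * 0.1 = 0.0001 for the rest of training.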

(2) Then modify the optimizer configuration in the yml file, changing

Optimizer:
  function: ppocr.optimizer,AdamDecay
  base_lr: 0.001
  beta1: 0.9
  beta2: 0.999

to:

Optimizer:
  function: ppocr.optimizer,PiecewiseDecay
  base_lr: 0.001
  gamma: 0.1
  steps: [300000]
  momentum_rate: 0.9
  L2_decay_weight: 0.0004
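
For context, the `function` field is a comma-separated "module,function" pair. A minimal sketch of how such a field could be resolved into a call to the optimizer factory (the helper `resolve_optimizer` is hypothetical and not PaddleOCR's actual loader):

import importlib

def resolve_optimizer(function_field, params):
    # Hypothetical helper: "ppocr.optimizer,PiecewiseDecay" -> module path + function name.
    module_path, func_name = function_field.split(',')
    module = importlib.import_module(module_path)
    # Call the factory (e.g. PiecewiseDecay or AdamDecay) with the yml parameters.
    return getattr(module, func_name)(params)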

@gekie gekie closed this as completed May 20, 2020
@yyr6661

yyr6661 commented May 21, 2020

Is there no way to adjust Adam's base learning rate? No matter how I change it, the log still shows 0.0005.

@ZhaoyangLi-nju

Is parameter_list passed in as None at initialization?

BillDior pushed a commit to BillDior/PaddleOCR that referenced this issue Aug 13, 2021