
【Hackathon No.13】 #33

Merged 3 commits into PaddlePaddle:master on Mar 15, 2022

Conversation

Asthestarsfalll (Contributor):

Add the CyclicLR learning-rate scheduler to Paddle.

@dingjiaweiww (Contributor):

Your PR has been submitted successfully. Thank you for contributing to this open-source project. Please check that the PR format and content are complete; see the sample template for details.

@dingjiaweiww (Contributor):

The PR format check passed. Your PR will now be reviewed by Paddle experts and the open-source community; please keep an eye on its progress.

@zhiboniu (Contributor) previously approved these changes Mar 15, 2022:

lgtm


Removed the `momentum`-related parameters present in the PyTorch version.

Also, to stay consistent with Paddle's other LRScheduler APIs, `base_lr` was renamed to `learning_rate` and `max_lr` to `max_learning_rate`; this renaming is open to discussion.
Contributor:

Here I think that if the policy only uses a single lr, `learning_rate` is enough. If it uses two related lrs, `base_learning_rate` and `max_learning_rate` would probably be clearer.

Contributor Author:

Thanks for the suggestion.


The test cases considered are as follows:

- dynamic-graph, static-graph, and numpy results stay consistent (a numpy reference sketch follows the reviewer comment below);
Contributor:

Besides verifying against a numpy implementation in the API test, you can also verify locally against the PyTorch implementation. Also look at how the verification code for the other optimizers is written.
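To make the numpy check above concrete, here is a minimal reference sketch such a test could compare against. The formula follows the triangular cyclic policy from Smith (2017); the function name, parameter defaults, and the exact handling of the `triangular2`/`exp_range` scaling are assumptions and should be matched to the final design doc:

```python
import numpy as np

def cyclic_lr_reference(step, learning_rate, max_learning_rate,
                        step_size_up, step_size_down,
                        mode='triangular', gamma=1.0):
    # Hypothetical numpy ground truth for one (or many) CyclicLR steps.
    step = np.asarray(step, dtype=np.float64)
    total = step_size_up + step_size_down        # steps in one full cycle
    cycle = np.floor(step / total)               # 0-based cycle index
    pos = step - cycle * total                   # position inside the cycle
    pct = np.where(pos < step_size_up,
                   pos / step_size_up,                           # rising phase
                   1.0 - (pos - step_size_up) / step_size_down)  # falling phase
    if mode == 'triangular':
        scale = 1.0                              # constant amplitude
    elif mode == 'triangular2':
        scale = 0.5 ** cycle                     # halve amplitude each cycle
    else:                                        # 'exp_range'
        scale = gamma ** step                    # exponential decay per step
    return learning_rate + (max_learning_rate - learning_rate) * pct * scale

# e.g. expected lrs over the first two cycles, step_size_up = step_size_down = 4
lrs = cyclic_lr_reference(np.arange(16), 0.001, 0.1, 4, 4)
```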


## Naming and parameter design

The API is designed as `paddle.optim.lr.CyclicLR(learning_rate, max_learning_rate, step_size_up, step_size_down, mode='triangular', gamma=1., scale_fn=None, scale_mode='cycle', last_epoch=-1, verbose=False)`.

Contributor:

Everything optimizer-related in Paddle lives under `paddle.optimizer`. Is calling it `optim` here a deliberate choice, or a typo?

Contributor Author:

Sorry, I'm used to writing the abbreviation; I'll update it shortly.
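For illustration, a usage sketch under the corrected `paddle.optimizer.lr` namespace. It assumes the scheduler is driven the same way as Paddle's other LRScheduler subclasses; the parameter names follow the design doc as it stood in this PR, and the values are placeholders:

```python
import paddle

model = paddle.nn.Linear(10, 10)
scheduler = paddle.optimizer.lr.CyclicLR(
    learning_rate=0.001,        # lower lr bound (renamed from base_lr)
    max_learning_rate=0.1,      # upper lr bound (renamed from max_lr)
    step_size_up=2000,          # steps spent rising from base to max
    step_size_down=2000,        # steps spent falling back to base
    mode='triangular')
optimizer = paddle.optimizer.SGD(learning_rate=scheduler,
                                 parameters=model.parameters())

for step in range(10):
    # ... forward, backward, optimizer.step(), optimizer.clear_grad() ...
    scheduler.step()            # advance the cyclic schedule each iteration
```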

@DDDivano left a comment:

LGTM

@dingjiaweiww merged commit f797aaf into PaddlePaddle:master Mar 15, 2022
@dingjiaweiww (Contributor):

Your PR has been merged into the community repo. Please proceed with the follow-up code development and submit the code to the Paddle repo.
