
fix bug: gradient_clipping_threshold should be allowed to be set per parameter #1010

Merged
merged 1 commit into PaddlePaddle:develop on Dec 26, 2016

Conversation

backyes
Contributor

@backyes backyes commented Dec 25, 2016

  • gradient_clipping_threshold should be allowed to be set per parameter

The new interface only allows setting a global gradient_clipping_threshold via settings(); compared with the paddle config_parser.py interface this loses flexibility, and a single global threshold also does not seem very reasonable.

 * gradient_clipping_threshold should be allowed to set with parameter-grain
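
For context, here is a minimal sketch of the two configuration styles under discussion, assuming the v1 trainer_config_helpers Python API; the per-parameter keyword on ParameterAttribute/ParamAttr follows this PR's intent and its exact name may differ from the merged code.

```python
# Sketch only: assumes the v1 trainer_config_helpers API; the
# per-parameter keyword below mirrors this PR's intent and may not
# match the merged interface exactly.
from paddle.trainer_config_helpers import *

# Before this fix: the threshold can only be set globally via settings(),
# so every parameter in the network shares the same clipping bound.
settings(
    batch_size=128,
    learning_rate=1e-3,
    gradient_clipping_threshold=10.0,  # global, applies to all parameters
)

# With parameter-level granularity: attach the threshold to an individual
# parameter through ParamAttr, as the old config_parser.py interface allowed.
data = data_layer(name="input", size=100)
hidden = fc_layer(
    input=data,
    size=64,
    param_attr=ParamAttr(name="w_hidden",
                         gradient_clipping_threshold=5.0),  # this parameter only
)
```

Presumably the global settings() value would still act as a default, with the per-parameter attribute overriding it where given.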
@backyes backyes self-assigned this Dec 25, 2016
@backyes
Contributor Author

backyes commented Dec 25, 2016

This issue was found while reviewing the sparse design and implementation. @reyoung @pengli09, please take a look and see whether it needs improvement.

@pengli09
Contributor

@backyes @reyoung It is indeed not very reasonable that only a global gradient_clipping_threshold can be set; the old version should support setting it for each parameter separately.

Contributor

@pengli09 pengli09 left a comment


LGTM

Collaborator

@reyoung reyoung left a comment


LGTM. Excellent Job!

@reyoung reyoung merged commit 9ae7a10 into PaddlePaddle:develop Dec 26, 2016
zhhsplendid pushed a commit to zhhsplendid/Paddle that referenced this pull request Sep 25, 2019
* add 1.5.1 whl and fix some bugs (PaddlePaddle#1010)

* add windows install whl

* whl and bug

* fix some bugs

* update 1.5.1 cn API

* add url