
Should we change the epsilon in batch_norm_layer to a variable instead of a fixed value 1e-5? #5548

Closed
peterzhang2029 opened this issue Nov 10, 2017 · 4 comments · Fixed by #5692

Comments

@peterzhang2029
Contributor

In other DL platforms (such as TensorFlow and MXNet), the default value of epsilon is 0.001.
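For context, epsilon in batch normalization is the small constant added to the variance before taking the square root, so the division stays well-behaved when a feature's variance is near zero. A minimal NumPy sketch (not Paddle's actual implementation; the function name and signature here are illustrative) shows why exposing it as a parameter rather than hard-coding 1e-5 matters:

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, epsilon=1e-5):
    """Normalize each feature of x over the batch dimension.

    epsilon guards the division when a feature's variance is near zero;
    frameworks disagree on the default (1e-5 here, 1e-3 in some others),
    which is why it is useful to make it configurable.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + epsilon)
    return gamma * x_hat + beta
```

With a configurable epsilon, a user porting a model from a framework that defaults to 0.001 can reproduce its numerics instead of silently getting slightly different normalized activations.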

@jacquesqiao
Member

jacquesqiao commented Nov 16, 2017

What do you mean by "variable"?

@jacquesqiao
Member

jacquesqiao commented Nov 16, 2017

Now the epsilon in batch_norm_layer is also an attribute that can be configured: https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/v2/fluid/layers.py#L769

@peterzhang2029
Contributor Author

I mean the previous v2 version, where the implementation is:
https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/trainer_config_helpers/layers.py#L3030

@jacquesqiao
Member

Got it! ~
