
batch normalization question #292

Closed · ghost opened this issue Aug 23, 2016 · 3 comments
@ghost commented Aug 23, 2016

Why doesn't the batch normalization layer apply scaling and shifting like in the original paper?

thanks,
Marijke

@edgarriba (Member)

/cc @nyanp

@nyanp (Member) commented Aug 28, 2016

@HotMarijke
tiny-dnn's batch normalization is ported from Caffe's, which doesn't include the scale and bias operations. We already have a scaling-and-shifting layer named linear_layer; if we could train its scale and bias, we could emulate the original paper by combining the two layers.
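
For illustration, here is a minimal sketch of that combination using tiny-dnn's sequential `<<` API. The convolutional layer, the 32×32 input shape, and the channel counts are assumptions for the example, and exact constructor signatures may differ across tiny-dnn versions. Note that linear_layer applies a fixed element-wise scale and bias, so as described above it only stands in for the paper's learnable γ/β with constants:

```cpp
#include "tiny_dnn/tiny_dnn.h"
using namespace tiny_dnn;

int main() {
  // Assumed shapes for this sketch: 32x32 single-channel input,
  // 5x5 convolution -> 28x28 feature maps with 6 channels.
  const size_t spatial  = 28 * 28;
  const size_t channels = 6;

  network<sequential> net;
  net << convolutional_layer(32, 32, 5, 1, channels)   // example layer (assumed)
      // normalizes activations per channel; no gamma/beta of its own
      << batch_normalization_layer(spatial, channels)
      // element-wise y = scale * x + bias; scale and bias are fixed
      // hyperparameters here, not trained like the paper's gamma/beta
      << linear_layer(spatial * channels, /*scale=*/1.0, /*bias=*/0.0)
      << relu_layer();
  return 0;
}
```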

@ghost (Author) commented Aug 28, 2016

Okay, now I understand!
Thanks for this great project!

Marijke

ghost closed this as completed Aug 28, 2016