
Batch normalization layer has 4 parameters. #1523

Closed
hafssol opened this issue Jan 21, 2016 · 9 comments

@hafssol

hafssol commented Jan 21, 2016

When I execute model.get_weights(), it seems each BN layer has 4 parameters stored serially.
I believe gamma, inverse stdev, gamma, and beta are included,
but I am not sure of the order of the parameters.
What is the first parameter?
What is the second parameter?
What is the third parameter?
What is the fourth parameter?
I would be grateful to know.

@fchollet
Member

But I cannot sure the order of the parameter.

Well, read the source code! It's straightforward.

@hafssol
Author

hafssol commented Jan 22, 2016

Well, I read it for hours, but I could not understand it (I'm not good at code).
Please give me a hint.

@AvantiShri

Looks like gamma, beta, running mean, and running std is the order (super.get_weights() returns the weights of self.params, and then the running mean and std are tacked on): https://github.com/fchollet/keras/blob/master/keras/layers/normalization.py#L61-L72

@bluebirdlboro

bluebirdlboro commented Jan 26, 2018

AvantiShri is right. And you can find the answer in this Keras document:
https://faroit.github.io/keras-docs/1.2.2/layers/normalization/
It explains that the weight order is [gamma, beta, mean, std].
Somehow, another Keras document, https://keras.io/layers/normalization/, lacks the note for the 'weights' parameter, which is quite confusing.

@dddaga

dddaga commented Apr 12, 2019

Looking at the code: https://github.com/keras-team/keras/blob/master/keras/layers/normalization.py#L61-L72

I think it should be [gamma, beta, mean, variance].

@rsnayaksd

rsnayaksd commented Aug 3, 2019

In the case of batch normalization, get_weights() will provide four additional parameter values for each hidden layer in the neural network.

Please refer to https://stackoverflow.com/questions/57087273 for more details.

@inspirepassion

Looks like gamma, beta, running mean and running std is the order (super.get_weights() returns the weights of self.params and then running mean and std are tacked on): https://github.com/fchollet/keras/blob/master/keras/layers/normalization.py#L61-L72

I'm wondering where this line is in the source code: super.get_weights()
I know several people mentioned that the order is stated in the documentation, but am I the only one who couldn't find it? Or did they remove the part you once saw?

@jennakwon06

But I cannot sure the order of the parameter.

Well, read the source code! It's straightforward.

"Straightforward" is subjective, and "read the source code" is not a great reply. Well-written, clear documentation is expected for any usable codebase, and what you are saying is that it's OK for many people to waste hours sifting through the codebase.

@Yanagar1

The last one is variance!
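To tie the thread together: here is a minimal NumPy sketch (not the actual Keras implementation) of how the four arrays returned by get_weights() are used at inference time, assuming the order the thread converges on, [gamma, beta, moving mean, moving variance]. The eps value here is an illustrative assumption; Keras adds a small epsilon for numerical stability.

```python
import numpy as np

def batchnorm_inference(x, weights, eps=1e-3):
    """Apply batch normalization at inference time, given the four
    arrays in the order this thread describes for get_weights():
    [gamma (scale), beta (offset), moving mean, moving variance]."""
    gamma, beta, moving_mean, moving_var = weights
    x_hat = (x - moving_mean) / np.sqrt(moving_var + eps)
    return gamma * x_hat + beta

# Toy input: 2 samples, 3 features.
x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
weights = [
    np.ones(3),       # gamma: identity scale
    np.zeros(3),      # beta: zero offset
    x.mean(axis=0),   # moving mean
    x.var(axis=0),    # moving variance
]
y = batchnorm_inference(x, weights)
# With gamma=1 and beta=0, each feature of the output is mean-centered.
print(y.mean(axis=0))
```

Since all four arrays have the same shape (one value per feature), shape inspection alone cannot tell them apart, which is why the order matters.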
