Local Response Normalization backpropagation #10
Comments
I am interested in the answer to this question too.
I'm not convinced that the negative term really makes a difference. In Hinton's paper he sets k=2, beta=0.75, and alpha=1e-4. Using those values, I believe the negative term is almost certainly negligible, because the positive term is likely to be so much larger. Math isn't my specialty though, so I'd be interested in your opinion, Urko. Did you happen to come to a similar conclusion?
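A quick numeric check supports this intuition. The following is a minimal sketch (not code from this repository) that compares the magnitudes of the two gradient terms using Hinton's constants; the activation value, output gradient, and window size n=5 are made-up, order-one assumptions:

```cpp
#include <cmath>
#include <cstdio>

int main() {
  const double k = 2.0, beta = 0.75, alpha = 1e-4;
  const int n = 5;                  // window size, as in the paper
  const double x = 1.0, z = 1.0;    // hypothetical activation and output gradient

  // rho = k + alpha * (sum of squared activations over the window)
  const double rho = k + alpha * n * x * x;

  // positive term: z(q) * rho^(-beta)
  const double positive = z * std::pow(rho, -beta);
  // negative term: 2*alpha*beta * x(q) * sum over the window of z*x * rho^(-beta-1)
  const double negative = 2.0 * alpha * beta * x * (n * z * x) * std::pow(rho, -beta - 1.0);

  // prints roughly: positive ~0.59, negative ~2.2e-4, ratio ~4e-4
  std::printf("positive: %g  negative: %g  ratio: %g\n",
              positive, negative, negative / positive);
  return 0;
}
```

With these constants the negative term is three to four orders of magnitude smaller than the positive one, though larger activations or a larger alpha would change the picture.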
I think the result is the same; it is just assigned to a different variable. Lines 271-275: `yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;` Is that the line of code you meant?
Hi!
I'm checking the C++ implementation of local response normalization here:
https://github.com/vlfeat/matconvnet/blob/master/matlab/src/bits/normalize.cpp
Based on Hinton's paper http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf, we have that the response-normalized activity is given by the formula (notation adapted for readability)

$$y(t) = x(t)\left(\kappa + \alpha \sum_{q \in G(t)} x(q)^2\right)^{-\beta}$$

where $G(t)$ is the window of channels centered at $t$. This totally fits with the implementation mentioned above.
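For concreteness, here is how that forward pass might look as standalone C++ over the channel dimension at one spatial location (a minimal sketch with a hypothetical name `lrnForward`, not the actual normalize.cpp code):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal sketch of the LRN forward pass over the channel dimension at one
// spatial location; x holds one activation per channel. Hypothetical code,
// not taken from normalize.cpp.
std::vector<double> lrnForward(const std::vector<double>& x, int n,
                               double kappa, double alpha, double beta) {
  const int N = static_cast<int>(x.size());
  std::vector<double> y(N);
  for (int t = 0; t < N; ++t) {
    // G(t): channels within n/2 of t, clipped to the valid range
    const int q1 = std::max(0, t - n / 2);
    const int q2 = std::min(N - 1, t + n / 2);
    double rho = kappa;
    for (int q = q1; q <= q2; ++q) rho += alpha * x[q] * x[q];
    y[t] = x[t] * std::pow(rho, -beta);  // y(t) = x(t) * rho(t)^(-beta)
  }
  return y;
}
```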
To get the back-propagation formulas, we have that if

$$z(t) = \frac{\partial E}{\partial y(t)}$$

is the derivative of the loss $E$ with respect to the output, then, writing $\rho(t) = \kappa + \alpha \sum_{s \in G(t)} x(s)^2$,

$$\frac{\partial E}{\partial x(q)} = z(q)\,\rho(q)^{-\beta} - 2\alpha\beta\, z(q)\, x(q) \sum_{t\,:\,q \in G(t)} x(t)\,\rho(t)^{-\beta-1}$$
If I'm not wrong, this maps to the C++ implementation through the correspondences zat ~ z, xat ~ x, ab2 ~ 2*alpha*beta, and Lbeta1 ~ $\rho(t)^{-\beta-1}$, which should lead to the formula

`yat(q) -= zat(q) * xat(t) * xat(q) * ab2 * Lbeta1 ;`
However, the implementation is (lines 276-280)

`yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;`
Note the change zat(q) -> zat(t). Is there anything wrong there that I didn't notice?
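To make the two variants concrete, here is a minimal standalone backward sketch in the same spirit as the forward sketch above (hypothetical code, not the actual lines from normalize.cpp), written directly from the chain rule $\partial E/\partial x(q) = \sum_t z(t)\,\partial y(t)/\partial x(q)$:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal sketch of the LRN backward pass at one spatial location.
// Expanding dy(t)/dx(q) for y(t) = x(t) * rho(t)^(-beta) gives
//   dE/dx(q) = z(q) * rho(q)^(-beta)
//            - 2*alpha*beta * x(q) * sum_{t : q in G(t)} z(t) * x(t) * rho(t)^(-beta-1)
// which is the z(t) form. Hypothetical code, not taken from normalize.cpp.
std::vector<double> lrnBackward(const std::vector<double>& x,
                                const std::vector<double>& z, int n,
                                double kappa, double alpha, double beta) {
  const int N = static_cast<int>(x.size());
  std::vector<double> dx(N, 0.0);
  const double ab2 = 2.0 * alpha * beta;
  for (int t = 0; t < N; ++t) {
    const int q1 = std::max(0, t - n / 2);
    const int q2 = std::min(N - 1, t + n / 2);
    double rho = kappa;
    for (int q = q1; q <= q2; ++q) rho += alpha * x[q] * x[q];
    const double Lbeta = std::pow(rho, -beta);
    const double Lbeta1 = Lbeta / rho;   // rho(t)^(-beta-1)
    dx[t] += z[t] * Lbeta;               // diagonal (positive) term
    for (int q = q1; q <= q2; ++q) {
      dx[q] -= z[t] * x[t] * x[q] * ab2 * Lbeta1;  // note z(t), not z(q)
    }
  }
  return dx;
}
```

Written this way, the outer loop runs over outputs t, and each output contributes z(t) to every x(q) in its window; substituting z(q) would instead sum transposed Jacobian entries, which coincides with the true gradient only in special cases (for instance, when z is constant across the window).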
Thank you!
Urko