
Local Response Normalization backpropagation #10

Open · 4fur4 opened this issue Oct 13, 2014 · 3 comments

@4fur4 commented Oct 13, 2014

Hi!

I'm checking the C++ implementation of the local response normalization here:

https://github.com/vlfeat/matconvnet/blob/master/matlab/src/bits/normalize.cpp

Based on Hinton's paper http://www.cs.toronto.edu/~fritz/absps/imagenet.pdf, the response-normalized activity is given by the formula (notation adapted for readability):

$$ y_t \;=\; x_t \left( \kappa + \alpha \sum_{q \in G(t)} x_q^2 \right)^{-\beta} $$

which matches the implementation mentioned above. To get the back-propagation formulas, we have that if

$$ L_t \;=\; \kappa + \alpha \sum_{q \in G(t)} x_q^2, \qquad \text{so that} \quad y_t = x_t\, L_t^{-\beta}, $$

then

$$ \frac{dz}{dx_t} \;=\; \frac{dz}{dy_t}\, L_t^{-\beta} \;-\; 2\alpha\beta\, x_t \sum_{q \,:\, t \in G(q)} \frac{dz}{dy_q}\, x_q\, L_q^{-\beta-1} $$
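
For completeness, this is just the chain rule applied to the per-element derivative (a sketch in the notation above, with $[\cdot]$ an indicator):

$$ \frac{\partial y_q}{\partial x_t} \;=\; \delta_{qt}\, L_q^{-\beta} \;-\; 2\alpha\beta\, x_q\, x_t\, L_q^{-\beta-1}\, [\, t \in G(q) \,], \qquad \frac{dz}{dx_t} \;=\; \sum_q \frac{dz}{dy_q}\, \frac{\partial y_q}{\partial x_t} $$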

If I'm not wrong, this maps to the C++ implementation as

    x_t            ->  xat(t)
    dz/dy_t        ->  zat(t)
    dz/dx_t        ->  yat(t)
    2*alpha*beta   ->  ab2
    L_t^(-beta)    ->  Lbeta
    L_t^(-beta-1)  ->  Lbeta1

which should lead to the formula

    yat(t) -= zat(q) * xat(q) * xat(t) * ab2 * Lbeta1 ;

however, the implementation is (lines 276-280)

    yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;

Note the change zat(q) -> zat(t). Is there anything wrong there that I didn't notice?

Thank you!

Urko

@johnny5550822

I am interested in the answer to this question too.

@jroose commented Mar 15, 2015

I'm not convinced that the negative term really makes a difference. In Hinton's paper he sets K = 2, beta = 0.75, and alpha = 1e-4. Using those values, I believe the negative term is almost certainly negligible, because the positive term is likely to be so much larger.

Math isn't my specialty though, so I'd be interested in your opinion, Urko. Did you happen to come to a similar conclusion?
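
For what it's worth, here is a back-of-the-envelope check of the negligibility claim (a minimal sketch using the paper's constants; the unit-scale activation and unit upstream gradient are assumptions, not values from the thread):

```cpp
#include <cmath>
#include <cstdio>

// Rough magnitude comparison of the two terms of dz/dx_t, using the
// paper's constants kappa = 2, alpha = 1e-4, beta = 0.75, n = 5, and
// an assumed unit activation x = 1 with unit upstream gradient.
int main() {
  const double kappa = 2.0, alpha = 1e-4, beta = 0.75;
  const int n = 5;                                    // window size |G(t)|
  const double x = 1.0, dzdy = 1.0;                   // assumptions
  const double L = kappa + alpha * n * x * x;         // L_t = 2.0005
  const double positive = dzdy * std::pow(L, -beta);  // dz/dy_t * L_t^-beta
  const double negative = 2.0 * alpha * beta * x * n * dzdy * x
                        * std::pow(L, -beta - 1.0);   // magnitude of the subtracted sum
  std::printf("positive term: %g\n", positive);       // ~0.594
  std::printf("negative term: %g\n", negative);       // ~2.2e-4
  return 0;
}
```

With those constants the negative term comes out roughly three orders of magnitude smaller than the positive one, which supports the intuition above; note, though, that its relative size grows with the activation scale, since it is proportional to x^2.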

@easten20

I think the result is the same; it is just assigned to a different variable: instead of assigning it to yat(t), they assign it to yat(q).

https://github.com/vlfeat/matconvnet/blob/b7dd9c963541582faa04572f510e0cc20545e086/matlab/src/bits/impl/normalize_cpu.cpp

lines 271-275:

    yat(q) -= zat(t) * xat(t) * xat(q) * ab2 * Lbeta1 ;

Is that the line of code you meant?
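
To make the equivalence concrete, here is a small self-contained sketch (a simplified 1-D channel vector; the loop structure and variable names are illustrative, not taken from normalize.cpp). It accumulates the gradient both ways, the "gather" form yat(t) -= zat(q) * ... and the "scatter" form yat(q) -= zat(t) * ..., and checks that they agree; they do because the window is symmetric, i.e. q is in G(t) exactly when t is in G(q):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <cstdlib>

// LRN backward on a 1-D channel vector, written two ways.
// G(t) = {q : |q - t| <= half, 0 <= q < N} is symmetric, so the
// "gather" and "scatter" updates accumulate the same terms.
int main() {
  const int N = 16, half = 2;                        // 16 channels, window n = 5
  const double kappa = 2.0, alpha = 1e-4, beta = 0.75;
  double x[N], z[N], L[N], gather[N], scatter[N];
  for (int i = 0; i < N; ++i) {
    x[i] = std::rand() / (double)RAND_MAX;           // forward input x
    z[i] = std::rand() / (double)RAND_MAX;           // upstream gradient dz/dy
  }
  for (int t = 0; t < N; ++t) {                      // L_t = kappa + alpha * sum_{q in G(t)} x_q^2
    L[t] = kappa;
    for (int q = std::max(0, t - half); q <= std::min(N - 1, t + half); ++q)
      L[t] += alpha * x[q] * x[q];
  }
  const double ab2 = 2.0 * alpha * beta;
  for (int t = 0; t < N; ++t) {                      // common first term: dz/dy_t * L_t^-beta
    gather[t]  = z[t] * std::pow(L[t], -beta);
    scatter[t] = z[t] * std::pow(L[t], -beta);
  }
  for (int t = 0; t < N; ++t)
    for (int q = std::max(0, t - half); q <= std::min(N - 1, t + half); ++q) {
      gather[t]  -= z[q] * x[q] * x[t] * ab2 * std::pow(L[q], -beta - 1.0);
      scatter[q] -= z[t] * x[t] * x[q] * ab2 * std::pow(L[t], -beta - 1.0);
    }
  double maxdiff = 0.0;
  for (int i = 0; i < N; ++i)
    maxdiff = std::max(maxdiff, std::fabs(gather[i] - scatter[i]));
  std::printf("max |gather - scatter| = %g\n", maxdiff); // prints 0: same terms, same order
  return 0;
}
```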

lenck added a commit that referenced this issue Apr 13, 2016
Added support to print the DagNN with MATLAB's digraph plotting commands