
Quantizing the Fully-connected Layer ---> look-up table #18

Closed
zzqiuzz opened this issue Sep 20, 2017 · 2 comments

Comments

zzqiuzz commented Sep 20, 2017

Hi! In your paper, regarding quantization of the FC layers, you divide the weight matrix into M subspaces, each represented by a product of D and B. During the test phase, you store the inner products between S(m) and every sub-codeword in D(m) in a look-up table. But different input images give different inputs S(m), so how can a look-up table work? Thank you!

jiaxiang-wu (Owner) commented Sep 21, 2017

The look-up table is computed on-the-fly; that is, for each input image we generate a new look-up table, determined by its input S(m) and the pre-trained sub-codebook D(m).
The test-phase time complexity consists of two parts: 1) computing the look-up tables and 2) computing the layer response from these look-up tables. The overall time complexity is still lower than that of the standard computation routine.
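For illustration, here is a minimal NumPy sketch of that two-step test-phase routine. The names and shapes (M, K, codebooks, assignments, quantized_fc_forward) are my own assumptions for the example, not the actual code in this repository.

```python
import numpy as np

def quantized_fc_forward(x, codebooks, assignments):
    """Sketch of a product-quantized FC layer at test time.

    x           : input vector of length M * sub_dim
    codebooks   : (M, K, sub_dim) array, sub-codebook D(m) per subspace
    assignments : (M, C_out) array of codeword indices (the B(m) assignments)
    """
    M, K, sub_dim = codebooks.shape
    C_out = assignments.shape[1]

    # Split the input into M sub-vectors S(m).
    x_sub = x.reshape(M, sub_dim)

    # Step 1: build the look-up table on-the-fly for this input:
    #   lut[m, k] = <S(m), k-th sub-codeword of D(m)>
    lut = np.einsum('md,mkd->mk', x_sub, codebooks)  # shape (M, K)

    # Step 2: layer response. For each output unit, sum the table entries
    # selected by its codeword assignments across the M subspaces.
    response = np.zeros(C_out)
    for m in range(M):
        response += lut[m, assignments[m]]
    return response

# Tiny usage example with random data.
M, K, sub_dim, C_out = 4, 16, 8, 10
rng = np.random.default_rng(0)
x = rng.standard_normal(M * sub_dim)
codebooks = rng.standard_normal((M, K, sub_dim))
assignments = rng.integers(0, K, size=(M, C_out))
print(quantized_fc_forward(x, codebooks, assignments).shape)  # (10,)
```

Building the table costs M x K inner products, after which each output unit only needs M table look-ups and additions, which is why the overall cost can stay below the standard dense computation.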

I suspect the term "pre-computed look-up tables" may have caused the misunderstanding. Here, "pre-computed" means that the look-up tables are computed before the layer response is computed, not that they are computed during the training phase. Sorry for the confusion.

zzqiuzz (Author) commented Sep 21, 2017

Quite clear, thanks.
