Questions about weights update #8

Open
yananliusdu opened this issue Jun 3, 2020 · 1 comment

Comments

@yananliusdu

Hi Matthieu,

Thanks for your BinaryConnect paper and the implementation here; both are really inspiring and helpful. I have a concern about how the trained weights are updated. Empirically, the parameter change from each gradient-descent step is tiny, as your paper also illustrates. With such tiny changes, the binarized weights may remain unchanged after binarization in each epoch; for example, it is hard for a weight to flip from 1 to -1. As a result, the forward pass produces similar results from epoch to epoch, which in turn yields more tiny weight changes. Hence, after several epochs of training, the weights are hardly updated and the network is still far from optimal. How do you solve this issue?
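
To make my understanding concrete, here is a rough NumPy sketch of the update scheme as I read it (a toy example with made-up names, not your actual Theano code): the real-valued weights W accumulate every small step, and the binarization is recomputed each forward pass, so signs can only flip once enough steps have accumulated.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.uniform(-0.1, 0.1, size=3)      # real-valued weights, kept at full precision
target_W = np.array([1.0, -1.0, 1.0])   # toy target sign pattern to recover
lr = 0.01

def binarize(w):
    # deterministic BinaryConnect binarization: sign(w) in {-1, +1}
    return np.where(w >= 0.0, 1.0, -1.0)

for step in range(1000):
    x = rng.normal(size=3)
    Wb = binarize(W)                       # binarize only for the forward/backward pass
    err = Wb @ x - target_W @ x            # toy linear-regression error
    grad = err * x                         # gradient w.r.t. Wb, but applied to W
    W = np.clip(W - lr * grad, -1.0, 1.0)  # tiny steps accumulate in real-valued W

print(binarize(W))  # -> [ 1. -1.  1.]; signs flip only after many small steps
```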

My other question:

[screenshot: Annotation 2020-06-02 184121]

In your figure, the weights after training are distributed around -1 and 1. My trained weights look somewhat random instead; do you know why?

Thanks.

@zhaoxiangshun

Hi yananliusdu!
For the first question, the author uses a variable W_LR_scale to scale the learning rate per layer, so the weight updates will not be too small to flip the binarized weights. For details, see the clipping_scaling function in binary_connect.py.
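
For readers without the repo open, a rough paraphrase of that function (assuming Courbariaux's Theano/Lasagne implementation; the exact code in binary_connect.py may differ slightly):

```python
import theano.tensor as T
import lasagne

def clipping_scaling(updates, network):
    # Sketch of the idea behind binary_connect.py's clipping_scaling;
    # not a verbatim copy of the repo's code.
    for layer in lasagne.layers.get_all_layers(network):
        for param in layer.get_params(binary=True):
            # Rescale the raw update delta by W_LR_scale, a per-layer
            # factor, so the step is large enough to flip binarized weights.
            updates[param] = param + layer.W_LR_scale * (updates[param] - param)
            # Clip the real-valued weights to [-H, H] so they stay close
            # to the binarization levels.
            updates[param] = T.clip(updates[param], -layer.H, layer.H)
    return updates
```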
