No learning rate annealing during training. #3

Closed
ylfzr opened this issue May 26, 2018 · 2 comments

@ylfzr

ylfzr commented May 26, 2018

Hi Chunyuan,

Thanks for sharing. I found that in the 2-D simulation experiment the learning rate (and hence the injected Gaussian noise level) is kept constant, which doesn't satisfy Assumption 1 in your AAAI '16 paper. In previous work, e.g. Welling 2011, a polynomial decay scheme is applied instead. Will this be a problem?
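(For reference, the schedule I mean is the polynomial decay from Welling & Teh 2011, roughly as in the sketch below; the constants `a`, `b`, `gamma` are just illustrative, not values from this repo.)

```python
def polynomial_decay(t, a=1e-2, b=1.0, gamma=0.55):
    """Welling & Teh (2011) step-size schedule: eps_t = a * (b + t)^(-gamma),
    with gamma in (0.5, 1]; the same eps_t also scales the injected noise."""
    return a * (b + t) ** (-gamma)
```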

@ChunyuanLI
Owner

In theory, learning rate decay guarantees asymptotic convergence. In practice, its implementation varies case by case.

Depending on your purpose, the answer can be different.
(1) To validate the theory, or pursue better performance, the decay scheme may help.
(2) To see the performance difference between SGLD and pSGLD, I don't think the decay scheme matters much: the point is that pSGLD gives better samples by adapting the learning rate per dimension. That said, a more rigorous comparison would run the two algorithms with the same learning rate decay, as sketched below; I wish I had done it that way in the first place.
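A minimal NumPy sketch of what such an apples-to-apples comparison could look like, with both samplers driven by the same step-size schedule. `grad_log_post` and all constants are placeholders, and the small correction term Gamma(theta) in pSGLD is dropped, as is common in practice:

```python
import numpy as np

def sgld_step(theta, grad_log_post, eps_t, rng):
    """One SGLD update (Welling & Teh 2011): theta += eps/2 * grad + N(0, eps)."""
    g = grad_log_post(theta)
    return theta + 0.5 * eps_t * g + rng.normal(0.0, np.sqrt(eps_t), theta.shape)

def psgld_step(theta, grad_log_post, eps_t, V, rng, alpha=0.99, lam=1e-5):
    """One pSGLD update with an RMSprop-style diagonal preconditioner."""
    g = grad_log_post(theta)
    V = alpha * V + (1.0 - alpha) * g * g        # running 2nd-moment estimate
    G = 1.0 / (lam + np.sqrt(V))                 # diagonal preconditioner G(theta)
    noise = rng.normal(0.0, np.sqrt(eps_t * G), theta.shape)
    return theta + 0.5 * eps_t * G * g + noise, V

# Drive both with the same decay, e.g. eps_t = a * (b + t) ** (-gamma),
# so that only the preconditioning differs between the two chains.
```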

@ylfzr
Author

ylfzr commented May 27, 2018

I tried pSGLD on the first experiment in [Welling 2011], whose posterior has two modes and two strongly negatively correlated variables. I got proper posterior samples with a similar learning rate annealing scheme.
Thanks.
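For reference, the target in that experiment looks roughly like the sketch below (the constants are as I recall them from the paper, so please double-check); `grad_log_post` can be plugged into the update steps above:

```python
import numpy as np

# Welling & Teh (2011), Sec. 5.1 (from memory): theta1 ~ N(0, 10), theta2 ~ N(0, 1),
# x_i ~ 0.5*N(theta1, 2) + 0.5*N(theta1 + theta2, 2), 100 points drawn at (0, 1).
sigma1_sq, sigma2_sq, sigmax_sq = 10.0, 1.0, 2.0
rng = np.random.default_rng(0)
comp = rng.random(100) < 0.5                      # which mixture component each x_i uses
x = rng.normal(np.where(comp, 0.0, 1.0), np.sqrt(sigmax_sq))

def grad_log_post(theta):
    """Gradient of the log posterior of the two-mode mixture example."""
    t1, t2 = theta
    d1, d2 = x - t1, x - t1 - t2
    w = 1.0 / (1.0 + np.exp((d2**2 - d1**2) / (2.0 * sigmax_sq)))  # resp. of 2nd comp.
    g1 = -t1 / sigma1_sq + np.sum(((1.0 - w) * d1 + w * d2) / sigmax_sq)
    g2 = -t2 / sigma2_sq + np.sum(w * d2 / sigmax_sq)
    return np.array([g1, g2])
```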

ylfzr closed this as completed Nov 27, 2018