Feature request: low-rank automatic differentiation variational inference #2750

Open
wjn0 opened this issue Apr 20, 2019 · 0 comments · May be fixed by #3022
wjn0 commented Apr 20, 2019

Summary:

Continues a discussion with @avehtari from here. Distilled down: full-rank ADVI is constrained by memory, while the mean-field approximation can be problematic for certain models. A sensible intermediate, a low-rank implementation, would be very helpful for such models. Ong et al. (2017) describe one possible approach.

Description:

I'll briefly outline the mathematical approach of Ong et al. and leave the Stan-specific implementation details (most of which were kindly outlined in the preceding discussion) for the pull request.

To generate the model parameters: let n be the dimension of the parameters and r the desired rank of the approximation. We draw eta = (z, eps) from the (r + n)-dimensional standard Gaussian (identity covariance). Then zeta is distributed as N(mu, BB^T + diag(d^2)), where mu and d are n-dimensional and B is n x r and constrained to be lower-triangular. zeta can be obtained from eta via the reparameterization trick as zeta = mu + Bz + d ∘ eps, where ∘ denotes the elementwise product. Finally, zeta is transformed to the constrained model parameters as in standard ADVI.
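
For concreteness, here is a minimal NumPy sketch of the reparameterized draw described above. This is illustrative only, not the proposed Stan/C++ implementation; the function name `sample_zeta` and the toy dimensions are placeholders I'm assuming for the example.

```python
import numpy as np

def sample_zeta(mu, B, d, rng):
    """Draw one zeta from N(mu, B B^T + diag(d^2)) via the reparameterization trick."""
    n, r = B.shape
    z = rng.standard_normal(r)    # low-rank factor noise
    eps = rng.standard_normal(n)  # per-dimension diagonal noise
    return mu + B @ z + d * eps   # zeta = mu + B z + d ∘ eps

# Toy example (illustrative dimensions only)
rng = np.random.default_rng(0)
n, r = 5, 2
mu = np.zeros(n)
B = np.tril(rng.standard_normal((n, r)))  # n x r factor, lower-triangular constraint
d = np.abs(rng.standard_normal(n))        # positive diagonal scales

draws = np.array([sample_zeta(mu, B, d, rng) for _ in range(100_000)])
# The empirical covariance of `draws` should approach B @ B.T + np.diag(d**2).
```

The memory savings relative to full-rank ADVI come from storing only the n x r factor B and the n-vector d instead of a full n x n covariance factor.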

Additional info:

I've started working on an implementation and will open a PR now.

Current Version:

v2.19.1

wjn0 linked a pull request Mar 17, 2021 that will close this issue
rok-cesnovar linked a pull request Mar 23, 2021 that will close this issue