Relation to mean-field variational inference. #335

Open
trivialfis opened this issue Oct 16, 2023 · 0 comments
trivialfis commented Oct 16, 2023

Hi, this is a question about the relationship between natural gradient boosting and variational inference. In the most general sense, any optimization method that approximates a density can be considered variational inference (rather than strictly the approximation of a posterior). In practice, most VI methods minimize the KL divergence by maximizing a proxy, the evidence lower bound (ELBO). NGBoost looks quite similar to mean-field VI, which assumes the latent variables are mutually independent, but I'm struggling to link the two methods formally. It would be great if anyone here has looked into a similar question before and could share some insights. Thank you in advance!
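
For concreteness, here is the standard mean-field VI setup I have in mind (usual notation, not taken from the NGBoost paper: $x$ is the observed data, $z$ the latent variables, $q$ the variational approximation):

$$
\log p(x) = \mathrm{ELBO}(q) + \mathrm{KL}\big(q(z)\,\|\,p(z \mid x)\big),
\qquad
\mathrm{ELBO}(q) = \mathbb{E}_{q(z)}\big[\log p(x, z) - \log q(z)\big]
$$

with the mean-field factorization

$$
q(z) = \prod_{i=1}^{m} q_i(z_i).
$$

Since $\log p(x)$ is constant in $q$, maximizing the ELBO is equivalent to minimizing the KL divergence, which is why the ELBO serves as the proxy objective.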
