Add Variational Inference Interface #280
Conversation
Codecov Report
@@ Coverage Diff @@
## master #280 +/- ##
==========================================
- Coverage 90.62% 89.97% -0.65%
==========================================
Files 29 32 +3
Lines 1919 2035 +116
==========================================
+ Hits 1739 1831 +92
- Misses 180 204 +24
You should subclass

Thanks @junpenglao. I will check how to use
ferrine
left a comment
Looks good, I have mostly API comments and suggestions
@Sayam753 I would plot the ELBO on a log scale.

Yes @twiecki. I am adding the changes. 😃
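Since the negative-ELBO loss typically falls over several orders of magnitude during optimization, a log-scaled y-axis makes both early and late convergence visible. A minimal matplotlib sketch, assuming a `losses` array (simulated here; in practice it would be the per-step loss trace collected by the fit loop):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Simulated loss trace standing in for the per-step negative ELBO
losses = 1000.0 * np.exp(-np.linspace(0.0, 5.0, 200)) + 10.0

fig, ax = plt.subplots()
ax.plot(losses)
ax.set_yscale("log")  # log scale exposes progress at every stage
ax.set_xlabel("iteration")
ax.set_ylabel("negative ELBO")
fig.savefig("elbo.png")
```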
Here are some comparisons drawn for Mean Field ADVI in PyMC4, TFP and PyMC3 (gist). If this is the expected behaviour, shall I go ahead and write tests for the interface implemented so far? I plan to add Full Rank ADVI in a separate PR so this one does not become too large to review.
Pretty weird that the results are so different; it might be worth digging a bit deeper.
Plots look much better now!
On Sat, Jun 20, 2020, Sayam Kumar commented on this pull request, in pymc4/variational/approximations.py:
> + return tf.vectorized_map(lambda samples: logpfn(*samples), q_samples)
+
+ return vectorized_logpfn
+
+ return vectorize_logp_function(logpfn)
+
+ def _build_posterior(self):
+ raise NotImplementedError
+
+ def flatten_view(self):
+ """Flattened view of the variational parameters."""
+ pass
+
+ def sample(self, n):
+ """Generate samples from posterior distribution."""
+ q_samples = dict(zip(self.unobserved_keys, self.approx.sample(n, seed=self._seed)))
Resolved in a1d9679.
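The `vectorize_logp_function` helper quoted above maps a single-draw log-probability function over a batch of posterior draws with `tf.vectorized_map`. A self-contained sketch of the same pattern, with a toy standard-normal log density standing in for the model's real `logpfn` (the toy density is an illustration, not the PR's model log-prob):

```python
import tensorflow as tf

def vectorize_logp_function(logpfn):
    # Map logpfn over the leading (sample) dimension of each input tensor
    def vectorized_logpfn(*q_samples):
        return tf.vectorized_map(lambda samples: logpfn(*samples), q_samples)
    return vectorized_logpfn

def logpfn(x):
    # Toy unnormalized standard-normal log density for a single draw
    return -0.5 * tf.reduce_sum(x * x)

batched_logp = vectorize_logp_function(logpfn)
draws = tf.ones([10, 3])     # 10 posterior draws of a 3-dimensional latent
logps = batched_logp(draws)  # one log-probability per draw, shape [10]
```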
ferrine
left a comment
Polishing docs and tests is all that is left before shipping. Some work remains beyond this PR (deterministic variables and flattening the model for full-rank ADVI).
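For reference, the flattening needed for full-rank ADVI amounts to concatenating each latent's draws into one vector per sample. A sketch under assumed shapes; `flatten_view` here is a hypothetical standalone helper, not the PR's method of the same name:

```python
import tensorflow as tf

def flatten_view(samples):
    """Concatenate a dict of draws (name -> tensor with a leading
    sample dimension) into one matrix of shape [n_samples, total_dim]."""
    parts = [tf.reshape(t, [tf.shape(t)[0], -1]) for t in samples.values()]
    return tf.concat(parts, axis=-1)

# Two latents: a 2-vector and a 3x2 matrix, 4 draws each -> [4, 8]
samples = {"a": tf.zeros([4, 2]), "b": tf.zeros([4, 3, 2])}
flat = flatten_view(samples)
```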
Added `sample` method for posterior distribution. Updated quickstart notebook.
Added a new axis to handle ArviZ shape issues. Changed initialization of std to 1. Created updates module to account for optimizers. Added test_variational.py. Updated quick_start notebook.
Looks good, I'll merge once tests pass.
Added Mean Field ADVI along with a basic interface. I see two different ways of adding VI approximations: one via tfd.JointDistributionSequential and the other via tfd.MultivariateNormalDiag.
This interface uses the Sequential method. It may be possible to use
tfd.MultivariateNormalDiag after having a flattened view of the parameters. Any feedback on this implementation is welcome.