initialize chains from estimated posterior samples #1655
Conversation
In pymc3/sampling.py:

-   start = pm.variational.sample_vp(v_params, 1, progressbar=False, hide_transformed=False)[0]
+   if njobs > 1:
+       start = pm.variational.sample_vp(v_params, njobs, progressbar=False, hide_transformed=False)
+   else:
+       start = v_params.means
I think we always want a sample from the posterior to start in the "typical set". The mean can be far away from that, which is counter-intuitive, but true for high-dimensional models: https://www.youtube.com/watch?v=pHsuIaPbNbY
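A minimal NumPy sketch (illustrative, not from this thread) of the concentration-of-measure effect behind this point: draws from a d-dimensional standard normal concentrate on a shell of radius about sqrt(d), so the mean, which sits at radius 0, is far from every typical draw once d is large.

```python
import numpy as np

np.random.seed(0)
for d in (1, 10, 100, 1000):
    samples = np.random.randn(1000, d)       # 1000 draws from N(0, I_d)
    radii = np.linalg.norm(samples, axis=1)  # distance of each draw from the mean
    print("d=%4d  typical radius ~ %7.2f  (sqrt(d) = %7.2f)"
          % (d, radii.mean(), np.sqrt(d)))
```

Starting a chain at the mean therefore places it in a region the sampler almost never visits, which is the argument for drawing starting points from the estimated posterior instead.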
OK, I will revert that. I guess nuts (when njobs = 1) should also start from a posterior sample and not from the mean of the posterior, right? I will check the video later (very bad Internet connection right now).
Exactly.
That's right, but the discarded values will only affect the starting points. The whole array is then used for the actual sampling.
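A minimal sketch of that behavior (the toy model and draw counts here are illustrative, not from the PR): the slice only feeds the computation of `start`, and the main `pm.sample` call still produces its full number of draws.

```python
import numpy as np
import pymc3 as pm

with pm.Model():
    mu = pm.Normal('mu', mu=0, sd=1)
    n_init = 1000
    # Keep only the second half of the initialization run; the discarded
    # first half only affects the starting point computed below.
    init_trace = pm.sample(n_init, step=pm.NUTS())[n_init // 2:]
    start = {v: np.mean(init_trace[v]) for v in init_trace.varnames}
    # The main run uses all of its own draws, regardless of the slice above.
    trace = pm.sample(2000, start=start)
```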
@twiecki @ColCarroll any (new) thoughts on this?
Ah, just checked. If you want to land this now, we'll have to remember to change this when the random_seed fix lands.
(so, this all looks good, but I think there's a subtle problem elsewhere!)
Oh, heh, #1656 is the problem!
init_trace = pm.sample(step=pm.NUTS(), draws=n_init,
                       random_seed=random_seed)[n_init//2:]
start = {varname: np.mean(init_trace[varname]) for varname in init_trace.varnames}
wrong indent.
@aloctavodia once these minor issues are fixed, I think we can merge this.
    cov = np.power(model.dict_to_array(v_params.stds), 2)
elif init == 'advi_map':
    start = pm.find_MAP()
-   v_params = pm.variational.advi(n=n_init, start=start)
+   v_params = pm.variational.advi(n=n_init, start=start,
+                                  random_seed=random_seed)
wrong indent
Thanks @aloctavodia!
For `advi` and `nuts`, when `njobs > 1` it will sample the starting points from the estimated posterior; otherwise it will use the estimated posterior mean.
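Condensed, the initialization logic under discussion looks roughly like this (a paraphrase of the hunks above wrapped in a hypothetical `init_start` helper, not the exact merged code):

```python
import numpy as np
import pymc3 as pm

def init_start(init, n_init, njobs, model):
    # Hypothetical wrapper paraphrasing the PR's init_nuts logic.
    if init == 'advi':
        v_params = pm.variational.advi(n=n_init)
        if njobs > 1:
            # One starting point per chain, drawn from the estimated posterior.
            start = pm.variational.sample_vp(v_params, njobs, progressbar=False,
                                             hide_transformed=False)
        else:
            # Single chain: fall back to the estimated posterior mean.
            start = v_params.means
        cov = np.power(model.dict_to_array(v_params.stds), 2)
        return start, cov
```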