
[FR] SVI: avoid initializing model params when already specified in guide #1552

Closed
msegado opened this issue Mar 8, 2023 · 1 comment · Fixed by #1553

Comments

@msegado (Contributor) commented Mar 8, 2023

(Context: https://forum.pyro.ai/t/sharing-params-between-model-guide-i-e-tied-weights/5033/)

Sharing parameter values between a model and its guide is helpful in some applications, e.g. symmetric autoencoders. This can almost be accomplished by defining a param as usual in the guide and calling numpyro.param("shared_site_name") in the model, but since SVI only replays sample values, not parameter values, before tracing the model, the return value of that param() call defaults to None, which breaks any downstream code that uses it.
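For concreteness, here is a minimal sketch of the pattern I have in mind (the site names and the toy model are made up for illustration): the guide owns the parameter and provides its initial value, while the model refers to the same site by name only.

```python
import numpy as np
import numpyro
import numpyro.distributions as dist
from numpyro.distributions import constraints


def guide(data):
    # The guide owns the shared parameter and provides its initial value.
    w = numpyro.param("shared_w", np.ones(data.shape[-1]))
    loc = numpyro.param("z_loc", 0.0)
    scale = numpyro.param("z_scale", 1.0, constraint=constraints.positive)
    numpyro.sample("z", dist.Normal(loc, scale))


def model(data):
    # The model refers to the same param site by name only. Because SVI
    # replays sample values but not param values before tracing the model,
    # this call currently returns None during the model trace.
    w = numpyro.param("shared_w")
    z = numpyro.sample("z", dist.Normal(0.0, 1.0))
    numpyro.sample("obs", dist.Normal(z * w, 1.0), obs=data)
```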

The desired behavior in SVI is to avoid initializing any model parameters already specified in the guide. I can see a couple of ways of accomplishing this:

  1. Explicitly substitute initialized parameter values from the guide before tracing the model. Here's a strawman implementation that uses substitute() to avoid changing the semantics of replay(): master...msegado:numpyro:feat-shared-param-init. (A rough sketch of the mechanism follows this list.)
  2. Change the replay() handler to support replaying param values. This should probably be opt-in via an include_params kwarg or similar to avoid breaking existing user code, and would also require a change to the docs.
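To make option 1 concrete, here is a rough sketch of the substitution idea using the public handlers API, reusing the toy `model`/`guide` above. This only illustrates the mechanism; it is not the actual SVI internals or the code on the linked branch.

```python
import numpy as np
from jax import random
from numpyro import handlers

rng_key = random.PRNGKey(0)
data = np.ones(3)

# Trace the (seeded) guide once to collect its initialized param values.
guide_trace = handlers.trace(handlers.seed(guide, rng_key)).get_trace(data)
guide_params = {
    name: site["value"]
    for name, site in guide_trace.items()
    if site["type"] == "param"
}

# Substitute those values into the model before tracing it, so that
# numpyro.param("shared_w") in the model resolves to the guide's value
# instead of None. replay() still handles the sample sites as before.
seeded_model = handlers.seed(model, rng_key)
model_trace = handlers.trace(
    handlers.replay(
        handlers.substitute(seeded_model, data=guide_params), guide_trace
    )
).get_trace(data)
```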

The first seems like the more natural approach to me, but I figured it would be good to ask before opening a PR 🙂 Thoughts?

@fehiepsi (Member) commented Mar 8, 2023

Oh, the first solution sounds great to me! Thanks!
