Add ability to save and load param store #22

Closed
ndgAtUber opened this issue Jun 27, 2017 · 6 comments

Comments

@ndgAtUber
Collaborator

We need the ability to easily (and reasonably quickly) save the param store to disk, and re-load it.

This is useful in a long training run both for saving state for future analyses and for checkpointing in case of a crash.

@eb8680
Member

eb8680 commented Jun 27, 2017

We should also think about an interface and implementation to save and load models themselves. With cloudpickle we can serialize pretty much anything in Python except for running/suspended generators and coroutines.
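
A minimal sketch of what cloudpickle round-tripping could look like; make_model and scale are made-up names for illustration only, not anything in Pyro:

```python
import pickle
import cloudpickle

def make_model(scale):
    # returns a closure; the standard pickle module cannot serialize lambdas
    return lambda x: scale * x

model = make_model(2.0)
blob = cloudpickle.dumps(model)   # serialize the function object to bytes
restored = pickle.loads(blob)     # cloudpickle output is ordinary pickle data
assert restored(3.0) == 6.0
```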

@karalets
Collaborator

karalets commented Jul 6, 2017

For the parameter store I envision something to the effect of pyro.save(model_params, "model_params.pr") and pyro.load("model_params.pr"), which would save and load the namespace and the affiliated parameters.

We may also need to think about selecting which names we want to load and mapping other names onto those parameters, for instance when partially initializing another model with pre-trained parameters or some such hacky business.
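
A minimal sketch of what such an interface could look like, assuming the param store behaves like a dict from names to tensors; pyro.save/pyro.load are only the names proposed above, and save_params/load_params/name_map here are hypothetical:

```python
import torch

def save_params(param_store, path):
    # persist the whole name -> tensor mapping in one file
    torch.save({name: value for name, value in param_store.items()}, path)

def load_params(param_store, path, names=None, name_map=None):
    # names: optional subset of saved names to load
    # name_map: optional {name_on_disk: name_in_store} for partial re-use
    saved = torch.load(path)
    for disk_name, value in saved.items():
        if names is not None and disk_name not in names:
            continue
        target = (name_map or {}).get(disk_name, disk_name)
        param_store[target] = value

# usage sketch
# save_params(param_store, "model_params.pr")
# load_params(param_store, "model_params.pr",
#             name_map={"encoder_weight": "pretrained_encoder_weight"})
```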

@ngoodman
Collaborator

ngoodman commented Jul 6, 2017

this seems like the right first pass.

i think (from conversations) the main tricky thing is going to be inserting params into neural net modules when we re-load. presumably pytorch provides (or can be coerced to provide) some way to set the params of a module. we may need to pass into pyro.load a handle to modules or something like that....

the alternative is pickling the entire module and re-loading it. that seems like a less good idea to me.

@eb8680
Member

eb8680 commented Jul 6, 2017

> some way to set the params of a module

See here; it should be pretty straightforward: serializing models in PyTorch
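
For reference, the linked docs describe two options: saving only the module's state_dict (recommended) or pickling the whole module object. Roughly:

```python
import torch
import torch.nn as nn

net = nn.Linear(10, 2)  # stand-in module for illustration

# option 1 (recommended): persist only the parameters/buffers
torch.save(net.state_dict(), "net_params.pt")
net.load_state_dict(torch.load("net_params.pt"))

# option 2: pickle the entire module; ties the file to the exact class definition
torch.save(net, "net.pt")
net = torch.load("net.pt")
```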

@ngoodman
Collaborator

ngoodman commented Jul 6, 2017

ah, cool. so as long as the param store keeps around a handle to the models from pyro.module calls we should be good to use the first method.
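
A sketch of that idea, assuming the param store keeps a registry of modules registered through pyro.module; the class and method names here are hypothetical:

```python
import torch

class ParamStoreSketch:
    def __init__(self):
        self._modules = {}  # name -> nn.Module handle kept from pyro.module calls

    def register_module(self, name, module):
        self._modules[name] = module

    def save_modules(self, path):
        # persist each registered module's state_dict under its name
        torch.save({name: m.state_dict() for name, m in self._modules.items()}, path)

    def load_modules(self, path):
        # push saved parameters back into the live module objects (the first method)
        saved = torch.load(path)
        for name, state in saved.items():
            self._modules[name].load_state_dict(state)
```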

@eb8680
Member

eb8680 commented Jul 16, 2017

Addressed by #47

eb8680 closed this as completed Jul 16, 2017