Make inference object serializable #617

Merged: 2 commits merged into main on Jan 28, 2022
Conversation

@michaeldeistler (Contributor) commented Jan 27, 2022

Resolves #476

Saving the inference object has always been a bit messy, but dill used to work.

I just tried again and could no longer save the inference object with dill. The workaround in this PR is to set the non-serializable attributes to None before saving, which makes it possible to save the inference object with plain pickle (see the sketch after the list below). These modifications lead to two changes in behaviour:

  1. Retraining from scratch is no longer supported, i.e. .train(..., retrain_from_scratch=True) does not work after loading.
  2. When the loaded object calls the .train() method, it creates a new tensorboard summary writer instead of appending to the existing one.
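
For illustration, here is a minimal sketch of the "set non-serializable objects to None" pattern, implemented via the standard __getstate__/__setstate__ pickle hooks. Whether the PR uses exactly these hooks is not shown here, and the attribute names (_summary_writer, _build_neural_net, _round) are hypothetical placeholders, not necessarily the actual fields in sbi/inference/base.py:

```python
import pickle
import threading


class NeuralInference:
    """Minimal sketch of an inference object holding non-picklable members."""

    def __init__(self):
        # Stand-ins for members that plain pickle cannot handle: a real
        # tensorboard SummaryWriter owns threads and file handles (modeled
        # here by a lock), and lambdas/closures are not picklable either.
        self._summary_writer = threading.Lock()
        self._build_neural_net = lambda theta, x: None
        self._round = 0  # ordinary, picklable state survives unchanged

    def __getstate__(self):
        # Drop the non-serializable members before pickling.
        state = self.__dict__.copy()
        state["_summary_writer"] = None
        state["_build_neural_net"] = None
        return state

    def __setstate__(self, state):
        self.__dict__ = state
        # A fresh summary writer would be created on the next .train() call;
        # without the stored network builder, retrain_from_scratch=True can
        # no longer work after loading (behaviour change 1 above).


inference = NeuralInference()
restored = pickle.loads(pickle.dumps(inference))  # succeeds with plain pickle
assert restored._summary_writer is None
assert restored._round == 0
```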

@codecov-commenter

Codecov Report

Merging #617 (23f3225) into main (4b3d260) will decrease coverage by 0.14%.
The diff coverage is 16.66%.


@@            Coverage Diff             @@
##             main     #617      +/-   ##
==========================================
- Coverage   66.77%   66.63%   -0.14%     
==========================================
  Files          66       66              
  Lines        4307     4319      +12     
==========================================
+ Hits         2876     2878       +2     
- Misses       1431     1441      +10     
Flag Coverage Δ
unittests 66.63% <16.66%> (-0.14%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
sbi/inference/base.py 72.66% <16.66%> (-4.87%) ⬇️


Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@janfb (Contributor) left a comment

Thanks! This looks good in itself, but I am not sure it is a long-term solution. Isn't there a way to serialise the full object without this workaround? We can merge this for now, but I think it would be good to keep an issue open so that we find a sustainable solution later.

@janfb (Contributor) commented Jan 28, 2022

Could this be relevant?
uqfoundation/dill#348

@michaeldeistler (Contributor, Author) commented

I don't think the tensorboard writer can be serialized. See here

uqfoundation/dill#348 seems relevant, but they do not provide a fix either, no?
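
For context, a quick check illustrates the failure mode described above. This is a sketch assuming torch is installed with tensorboard support; the log directory is an arbitrary placeholder and the exact exception message depends on the installed versions:

```python
import pickle

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter(log_dir="/tmp/tb_pickle_demo")
try:
    pickle.dumps(writer)
except TypeError as err:
    # The writer owns background threads and open event files; pickling
    # typically fails with something like
    # "cannot pickle '_thread.lock' object".
    print(f"SummaryWriter is not picklable: {err}")
finally:
    writer.close()
```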

@michaeldeistler michaeldeistler merged commit 0448382 into main Jan 28, 2022
@michaeldeistler michaeldeistler deleted the pickle branch January 28, 2022 11:15
Development

Successfully merging this pull request may close these issues.

Using dill to save the inference object giving an error.