Saving torchscript model in C++ #35464

Closed
oplatek opened this issue Mar 26, 2020 · 12 comments
Labels
oncall: jit (Add this issue/PR to JIT oncall triage queue)

Comments

@oplatek

oplatek commented Mar 26, 2020

🚀 Feature

Allow torch::jit::script::Module in C++ to serialize the model back to disk.

I submitted this issue based on the thread https://discuss.pytorch.org/t/how-to-save-torchscript-model-using-cpp/59743/10

Motivation

General motivation:

  • serializing the model to disk allows two-way interchangeability of environments between Python and C++ (loading models in C++ is already supported)
  • two-way interchangeability between C++ and Python significantly speeds up debugging of production C++ code

My particular use case:

  1. I train a seq2seq model in PyTorch.
  2. I save the JIT-traced model for a single step of seq2seq decoding to disk.
  3. I load the model in C++ LibTorch and run Viterbi/beam-search decoding implemented in C++, using the traced forward method as a single step of decoding (see the sketch after this list).
  4. I fine-tune the C++ hyperparameters/production setup.
  5. I want to serialize the TorchScript model in C++ from production (together with my hyperparameters).
  6. The bundle of the saved TorchScript model and the hyperparameters is shipped to production.
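
A minimal sketch of steps 2–3; everything here (the file name decoder_step.pt, the input/state shapes, and the (logits, hidden) output tuple) is hypothetical:

#include <torch/script.h>

int main() {
  // Hypothetical: a JIT-traced single decoding step exported from Python
  // with torch.jit.save as "decoder_step.pt".
  torch::jit::Module step = torch::jit::load("decoder_step.pt");

  torch::Tensor token = torch::zeros({1}, torch::kLong); // hypothetical <sos> id
  torch::Tensor hidden = torch::zeros({1, 1, 512});      // hypothetical decoder state

  for (int t = 0; t < 50; ++t) {
    auto out = step.forward({token, hidden}).toTuple();
    torch::Tensor logits = out->elements()[0].toTensor();
    hidden = out->elements()[1].toTensor();
    token = logits.argmax(-1); // greedy step; beam search would branch here
  }
}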

Pitch

Please implement a C++ version of https://pytorch.org/docs/stable/jit.html#torch.jit.save

Alternatives

  • A workaround for my particular use case is to keep in memory the binary data loaded from the model that was "serialized" directly from Python PyTorch code.
    • Note: this workaround fails if we need to adapt the deployed model using a little enrolment data in our C++ application (which is our plan). For that, we would need to switch back to a Python environment, which is not possible (imagine mobile phones/servers).
  • I am interested in other workarounds ... feel free to suggest them

cc @suo

@facebook-github-bot added the oncall: jit label Mar 26, 2020
@oplatek
Author

oplatek commented Mar 26, 2020

Related - serialization of tensors between C++ and Python: #20356 (comment)

@eellison
Contributor

@driazati this should be possible, right?

@driazati
Contributor

Yeah, we already have the ability to save to a std::ostream in C++:

https://github.com/pytorch/pytorch/blob/master/torch/csrc/jit/api/module.h#L208

@oplatek does this API do what you want or are you looking for something else?
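
For reference, a minimal round-trip sketch with that API (the file name traced.pt is a placeholder, assumed to have been produced by torch.jit.save in Python):

#include <torch/script.h>
#include <sstream>

int main() {
  torch::jit::Module module = torch::jit::load("traced.pt");

  // Serialize into any std::ostream, e.g. an in-memory buffer ...
  std::ostringstream buf;
  module.save(buf);

  // ... and read the module back from a std::istream.
  std::istringstream in(buf.str());
  torch::jit::Module again = torch::jit::load(in);
}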

@oplatek
Author

oplatek commented Mar 27, 2020

@driazati this is it! I do not know how I missed it.

@driazati
Contributor

Cool, closing this issue then. Feel free to reopen if you run into more problems.

@fkxie

fkxie commented Oct 5, 2020

@oplatek Hi, would you please tell me how you solved the problem?

Thanks in advance!

@oplatek
Author

oplatek commented Oct 5, 2020 via email

@fkxie

fkxie commented Oct 6, 2020 via email

@maxfiedler

I am a bit confused:
in Dec 2019 yf225 wrote that saving a model written with the C++ frontend as a TorchScript module is not possible: https://discuss.pytorch.org/t/how-to-save-torchscript-model-using-cpp/59743/10?u=mxf

Is this a way to do this now? Or am I misunderstanding the use case that you are pursuing here?

@maxfiedler

And what is the purpose of the _save_for_mobile functions in https://github.com/pytorch/pytorch/blob/master/torch/csrc/jit/api/module.h?

@Stefan-1313

Stefan-1313 commented Jul 21, 2021

Yeah, we already have the ability to save to a std::ostream in C++:

https://github.com/pytorch/pytorch/blob/master/torch/csrc/jit/api/module.h#L208

@oplatek does this API do what you want or are you looking for something else?

I'm trying to do the same thing, but saving via std::ostream does not work for me. I get a corrupted file which I cannot load (loading via std::istream even throws an exception).

Can you provide me with an example? Thanks, I would be very grateful!
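
One guess, not verified against this setup: the stream overloads need streams opened in binary mode; a text-mode std::ofstream on Windows rewrites line-ending bytes and corrupts the zip archive that TorchScript uses. A sketch under that assumption:

#include <torch/script.h>
#include <fstream>

int main() {
  torch::jit::Module module = torch::jit::load("traced.pt");

  {
    // std::ios::binary is essential: text-mode streams translate '\n'
    // bytes on Windows and corrupt the saved archive.
    std::ofstream out("resaved.pt", std::ios::binary);
    module.save(out);
  } // the destructor flushes and closes the stream before we reload

  std::ifstream in("resaved.pt", std::ios::binary);
  torch::jit::Module again = torch::jit::load(in);
}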

@tom-huntington

tom-huntington commented Oct 15, 2021

I am a bit confused: in Dec 2019 yf225 wrote that saving a model written with the C++ frontend as a TorchScript module is not possible: https://discuss.pytorch.org/t/how-to-save-torchscript-model-using-cpp/59743/10?u=mxf

Is this a way to do this now? Or am I misunderstanding the use case that you are pursuing here?

There is no way to take a model in C++ that inherits from torch::nn::Module and trace it to create a torch::jit::Module.

But you can load a torch::jit::Module and then save it again in C++:

torch::jit::Module module = torch::jit::load("mlp.pt");  // load a TorchScript module
module.save("cppSavedJitModule.pt");                     // re-serialize it from C++
module = torch::jit::load("cppSavedJitModule.pt");       // reload the re-saved copy

Edit: If you really need to export your C++ model to ONNX, you can actually create a torch::jit::Module by hand, using torch::jit::tracer::trace().
See https://github.com/hasktorch/hasktorch/blob/a752f4c868caf386aeefafec9ab97e5810728f50/libtorch-ffi/src/Torch/Internal/Unmanaged/Type/Module.hs#L271
and https://github.com/pytorch/pytorch/blob/master/test/cpp/jit/test_autodiff.cpp#L152
You have to:

for (auto& p : model->parameters()) {
    p.set_requires_grad(false);  // disable gradients on every parameter before tracing
}

so you can't train the resulting JIT module, but inference works in C++ (I didn't actually try converting to ONNX, though). Also, I wasn't able to vary the batch size in the resulting JIT module, and control flow probably won't work either.
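
Following up on step 5 of the original use case (bundling hyperparameters with the model): the save/load APIs also accept an ExtraFilesMap, so arbitrary strings can travel inside the same archive. A minimal sketch; the key name hparams.json and its JSON payload are made up:

#include <torch/script.h>

int main() {
  torch::jit::Module module = torch::jit::load("mlp.pt");

  // Stash arbitrary strings (e.g. hyperparameters) inside the archive.
  torch::jit::ExtraFilesMap extra_out{{"hparams.json", "{\"beam_size\": 8}"}};
  module.save("bundle.pt", extra_out);

  // Request the same keys back at load time; values are filled in place.
  torch::jit::ExtraFilesMap extra_in{{"hparams.json", ""}};
  torch::jit::Module loaded = torch::jit::load("bundle.pt", c10::nullopt, extra_in);
  // extra_in["hparams.json"] now holds the JSON string saved above.
}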
