
[JIT] Compile group of functions/modules #32390

Open
edgarriba opened this issue Jan 18, 2020 · 4 comments
Labels
oncall: jit Add this issue/PR to JIT oncall triage queue triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@edgarriba
Contributor

edgarriba commented Jan 18, 2020

🚀 Feature

I would like to be able to JIT-compile a group of functions or modules and serialize the resulting library into a single file, so that it can later be loaded in C++.

Motivation

The main motivation for this feature is to port Kornia to C++ while avoiding the massive effort of re-implementing the whole thing from scratch. We expect to do that at some point in the future, but as a first iteration I think we could bootstrap the process by compiling from Python to C++ and producing something functional out of the box.

In addition, this could possibly be extended to port to other frameworks by serializing to the ONNX format.

Any thoughts about how to proceed with this would be very welcome :)

cc @suo

@facebook-github-bot facebook-github-bot added the oncall: jit Add this issue/PR to JIT oncall triage queue label Jan 18, 2020
@edgarriba
Contributor Author

edgarriba commented Jan 18, 2020

/cc @suo @fmassa @soumith

@zdevito
Contributor

zdevito commented Jan 19, 2020

If you put your functions and modules into a top-level module and annotate them with @torch.jit.export, then all of them will get saved:

class Top(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # mark the functions in these modules you want to save with @torch.jit.export
        self.submodule1 = ...
        self.submodule2 = ...

    @torch.jit.export
    def func1(self):
        ...

    @torch.jit.export
    def func2(self):
        ...

If this is not sufficient for what you want to save, can you elaborate more on your use case?
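For concreteness, a minimal sketch of this pattern end to end. The method names and bodies here are illustrative placeholders (a real version would wrap Kornia functions), but `torch.jit.script`, `@torch.jit.export`, and `save` are the standard TorchScript API:

```python
import torch

class Top(torch.nn.Module):
    # Illustrative exported functions; real code would wrap actual library ops.
    @torch.jit.export
    def func1(self, x: torch.Tensor) -> torch.Tensor:
        return x + 1.0

    @torch.jit.export
    def func2(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2.0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.func2(self.func1(x))

# Compile the whole module, including every @torch.jit.export method,
# and serialize it into a single archive.
scripted = torch.jit.script(Top())
scripted.save("top.pt")
```

The saved `top.pt` archive contains all exported methods, so a single file carries the whole group of functions.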

@zdevito zdevito added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Jan 19, 2020
@edgarriba
Contributor Author

Sounds like it could work. Then from the C++ side, I guess I will have to provide an interface for each of the exported functions, right? Which brings me to the question of how the saved binary with the functions works: is it loaded at run time, or can it be cross-compiled and linked into my C++ code like a normal library?

@suo
Member

suo commented Feb 15, 2020

It will be loaded at runtime.
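For context, both languages go through the same deserialization path: `torch.jit.load` in Python and `torch::jit::load` (from `<torch/script.h>`) in C++, with each exported method becoming callable on the loaded module. A minimal Python sketch of the runtime-loading behavior, using a hypothetical module with one exported method and an in-memory buffer in place of a file:

```python
import io
import torch

class Lib(torch.nn.Module):
    # Hypothetical exported function, for illustration only.
    @torch.jit.export
    def plus_one(self, x: torch.Tensor) -> torch.Tensor:
        return x + 1.0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.plus_one(x)

# Serialize, then deserialize "at runtime": nothing is linked into the
# binary ahead of time; the archive is parsed when load() is called.
buf = io.BytesIO()
torch.jit.save(torch.jit.script(Lib()), buf)
buf.seek(0)
loaded = torch.jit.load(buf)
out = loaded.plus_one(torch.tensor(1.0))
```

The C++ side is analogous: `torch::jit::load("lib.pt")` returns a module whose exported methods can be invoked by name, but the archive is read at run time rather than cross-compiled into the application.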
