This repository was archived by the owner on Aug 28, 2024. It is now read-only.

Conversation

@jeffxtang (Contributor) commented on Sep 10, 2021

No description provided.


### 3. Change the iOS code

In `InferenceModule.mm`, first change `#import <LibTorch/LibTorch.h>` to `#import <Libtorch-Lite/Libtorch-Lite.h>`, then change `@protected torch::jit::script::Module _impl;` to `@protected torch::jit::mobile::Module _impl;` and `_impl = torch::jit::load(filePath.UTF8String);` to `_impl = torch::jit::_load_for_mobile(filePath.UTF8String);`.
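
For reference, a minimal sketch of what the relevant parts of `InferenceModule.mm` look like after these edits; the class name, initializer, and error handling are assumed from the HelloWorld `TorchModule.mm` example rather than copied from this repo's file:

```objc
// Sketch of InferenceModule.mm after switching to the Lite interpreter.
// The initializer and error handling are illustrative assumptions.
#import "InferenceModule.h"
#import <Libtorch-Lite/Libtorch-Lite.h>   // was: #import <LibTorch/LibTorch.h>

@implementation InferenceModule {
    @protected
    // was: torch::jit::script::Module _impl;
    torch::jit::mobile::Module _impl;
}

- (nullable instancetype)initWithFileAtPath:(NSString*)filePath {
    self = [super init];
    if (self) {
        try {
            // was: _impl = torch::jit::load(filePath.UTF8String);
            _impl = torch::jit::_load_for_mobile(filePath.UTF8String);
        } catch (const std::exception& exception) {
            NSLog(@"%s", exception.what());
            return nil;
        }
    }
    return self;
}

@end
```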
@xta0 (Contributor) commented on Sep 13, 2021

Why `@protected`? What would be the problem if we made it a private ivar?

@jeffxtang (Contributor, Author) replied on Sep 13, 2021

I got it from the HelloWorld example https://github.com/pytorch/ios-demo-app/blob/master/HelloWorld/HelloWorld/HelloWorld/TorchBridge/TorchModule.mm#L5 last year, I think, and never really thought about it. Even tutorials like https://pytorch.org/tutorials/recipes/mobile_interpreter.html refer to `@protected`. Should we change it, or leave it for consistency?
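
For context, the alternative being discussed would drop the `@protected` specifier and declare `_impl` with private visibility, roughly like this (a sketch only; the class name is illustrative and this is not the change that was made in the PR):

```objc
// Private-ivar alternative discussed above (sketch only).
@implementation InferenceModule {
    @private
    torch::jit::mobile::Module _impl;  // not exposed to subclasses
}
@end
```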

@xta0 (Contributor) left a review comment

LGTM

@xta0 xta0 merged commit 294fa7b into pytorch:master Sep 27, 2021
