
Conversation

iseeyuan
Contributor

@iseeyuan iseeyuan commented Jan 25, 2020

Stack from ghstack:

Export the "_save_for_mobile" method to Python so that the bytecode format for the lite interpreter can be added to, or updated in, the original script model.

It's the first step of the Python binding for the lite interpreter, as discussed in this [internal post](https://fb.workplace.com/groups/1144215345733672/permalink/1478900738931796/) and offline.

The next step is to export the load_for_mobile and run methods of the mobile module, so that users can verify the mobile model from Python.
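For context, a minimal sketch of the intended Python-side flow, assuming a scripted module and a hypothetical output path (the exact name exposed to users was adjusted after review; see the thread below):

```
import torch

class AddOne(torch.nn.Module):
    def forward(self, x):
        return x + 1

# Script the eager module as usual.
scripted = torch.jit.script(AddOne())

# Write a model archive that also carries the bytecode section consumed by
# the lite interpreter. The method name and file name here are illustrative.
scripted._save_for_mobile("add_one_mobile.pt")
```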

Test: use the following Python script to display the bytecode part of the updated model file.

```
#!/usr/bin/env python3
# Dump the bytecode.pkl entry of a TorchScript model archive without
# importing any real classes: every global and every persistent id is
# replaced by a printable fake object.
import sys
import pickle
import pprint
import zipfile


class FakeObject(object):
    def __init__(self, module, name, args):
        self.module = module
        self.name = name
        self.args = args
        self.state = None

    def __repr__(self):
        state_str = "" if self.state is None else f"(state={self.state!r})"
        return f"{self.module}.{self.name}{self.args!r}{state_str}"

    def __setstate__(self, state):
        self.state = state


class FakeClass(object):
    def __init__(self, module, name):
        self.module = module
        self.name = name
        self.__new__ = self.fake_new

    def __repr__(self):
        return f"{self.module}.{self.name}"

    def __call__(self, *args):
        return FakeObject(self.module, self.name, args)

    def fake_new(self, *args):
        return FakeObject(self.module, self.name, args)


class DumpUnpickler(pickle._Unpickler):
    # Resolve every global reference to a FakeClass instead of importing it.
    def find_class(self, module, name):
        return FakeClass(module, name)

    # Represent persistent ids (e.g. tensor storages) as fake objects too.
    def persistent_load(self, pid):
        return FakeObject("pers", "obj", (pid,))


def main(argv):
    zfile = zipfile.ZipFile(argv[1])
    names = [i for i in zfile.namelist() if "bytecode.pkl" in i]
    if not names:
        print("bytecode.pkl not found.")
        return
    with zfile.open(names[0], "r") as handle:
        value = DumpUnpickler(handle).load()
    pprint.pprint(value)


if __name__ == "__main__":
    sys.exit(main(sys.argv))
```
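To inspect a model written this way, point the script at the archive, e.g. `python3 dump_bytecode.py add_one_mobile.pt` (both file names are hypothetical); it pretty-prints the unpickled bytecode.pkl entry.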

Differential Revision: D19596359

@iseeyuan iseeyuan requested a review from apaszke as a code owner January 25, 2020 17:26
iseeyuan pushed a commit that referenced this pull request Jan 25, 2020
ghstack-source-id: 89a215d
Pull Request resolved: #32621
@facebook-github-bot facebook-github-bot added the oncall: jit label Jan 25, 2020
@kostmo
Member

kostmo commented Jan 25, 2020

💊 CircleCI build failures summary and remediations

As of commit 18b2b37:

None of the build failures appear to be your fault.

  • 1/1 recognized as flaky ❄️

Detailed failure analysis

One may explore the probable reasons each build failed interactively on the Dr. CI website.

❄️ 1 failure recognized as flaky

The following build failures have been detected as flaky and may not be your fault:

See CircleCI build pytorch_linux_xenial_cuda10_1_cudnn7_py3_NO_AVX_NO_AVX2_test (1/1)

Step: "Test" (full log | pattern match details) ❄️

```
Jan 27 19:01:07 RuntimeError: quantization scale should be > 0
Jan 27 19:01:07   File "test_quantization.py", line 722, in test_quantized_rnn
Jan 27 19:01:07     y, (h, c) = cell_dq(x, (h, c))
Jan 27 19:01:07   File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 540, in __call__
Jan 27 19:01:07     result = self.forward(*input, **kwargs)
Jan 27 19:01:07   File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 373, in forward
Jan 27 19:01:07     return self.forward_tensor(input, hx)
Jan 27 19:01:07   File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 334, in forward_tensor
Jan 27 19:01:07     input, hx, batch_sizes, max_batch_size, sorted_indices)
Jan 27 19:01:07   File "/opt/conda/lib/python3.6/site-packages/torch/nn/quantized/dynamic/modules/rnn.py", line 315, in forward_impl
Jan 27 19:01:07     self.batch_first, dtype=self.dtype, use_dynamic=True)
Jan 27 19:01:07 RuntimeError: quantization scale should be > 0
Jan 27 19:01:07
Jan 27 19:01:07 ----------------------------------------------------------------------
Jan 27 19:01:07 Ran 36 tests in 54.070s
Jan 27 19:01:07
Jan 27 19:01:07 FAILED (errors=1, skipped=1)
Jan 27 19:01:07
Jan 27 19:01:07 Generating XML reports...
Jan 27 19:01:07 Traceback (most recent call last):
Jan 27 19:01:07   File "test/run_test.py", line 456, in <module>
Jan 27 19:01:07     main()
```

This comment was automatically generated by Dr. CI. Follow this link to opt out of these comments for your Pull Requests.

Please report bugs/suggestions on the GitHub issue tracker.

This comment has been revised 5 times.

iseeyuan pushed a commit that referenced this pull request Jan 27, 2020
ghstack-source-id: d43158c
Pull Request resolved: #32621
@iseeyuan iseeyuan changed the title from "Python binding for lite interpreter" to "Python binding to export bytecode format for lite interpreter" Jan 27, 2020
iseeyuan pushed a commit that referenced this pull request Jan 27, 2020
ghstack-source-id: 323e2e0
Pull Request resolved: #32621
@facebook-github-bot
Contributor

@iseeyuan merged this pull request in c64dec1.

"""
return self._c.save(*args, **kwargs)

def save_for_mobile(self, *args, **kwargs):
Member

Should this be part of the public API for ScriptModule? I thought we were not exposing the lite interpreter externally, so this may be confusing for external users.

Contributor Author

I thought the "_" was needed in C++ only; I might have missed it here. Just to confirm: should we rename it to "_save_for_mobile"? I could put up a quick patch for it.
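For reference, a minimal sketch of what the renamed wrapper could look like, assuming it forwards to the underlying C++ module the same way `save` does in the hunk above; the `self._c._save_for_mobile` call is an assumption for illustration, not necessarily the merged code:

```
    def _save_for_mobile(self, *args, **kwargs):
        # The leading underscore keeps the lite-interpreter export out of the
        # public ScriptModule API; forward to the C++ module as `save` does.
        return self._c._save_for_mobile(*args, **kwargs)
```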

wuhuikx pushed a commit to wuhuikx/pytorch that referenced this pull request Jan 30, 2020
Summary: Pull Request resolved: pytorch#32621

Test Plan: Imported from OSS

Differential Revision: D19596359

Pulled By: iseeyuan

fbshipit-source-id: 19a4a771320f95217f5b0f031c2c04db7b4079a8
iseeyuan pushed a commit that referenced this pull request Jan 30, 2020
It's a patch to #32621, making the API private.
facebook-github-bot pushed a commit that referenced this pull request Jan 31, 2020
Summary:
Pull Request resolved: #32771

It's a patch to #32621, making the API private.

Test Plan: Imported from OSS

Differential Revision: D19657307

Pulled By: iseeyuan

fbshipit-source-id: e604a0cbed6a1e61413daaafc65bea92b90f1f5d
@facebook-github-bot facebook-github-bot deleted the gh/iseeyuan/44/head branch February 1, 2020 15:16
ttumiel pushed a commit to ttumiel/pytorch that referenced this pull request Mar 4, 2020
Summary: Pull Request resolved: pytorch#32621

Test Plan: Imported from OSS

Differential Revision: D19596359

Pulled By: iseeyuan

fbshipit-source-id: 19a4a771320f95217f5b0f031c2c04db7b4079a8
ttumiel pushed a commit to ttumiel/pytorch that referenced this pull request Mar 4, 2020
Summary:
Pull Request resolved: pytorch#32771

It's a patch to pytorch#32621, making the API private.

Test Plan: Imported from OSS

Differential Revision: D19657307

Pulled By: iseeyuan

fbshipit-source-id: e604a0cbed6a1e61413daaafc65bea92b90f1f5d

Labels

Merged, oncall: jit


6 participants