
@shunting314
Contributor

@shunting314 shunting314 commented Oct 12, 2021

Stack from ghstack:

TL;DR: we are able to use the interactive_embedded_interpreter (basically the torch::deploy interpreter with an interactive shell) to dynamically load various third-party libraries. We use the popular libraries numpy, scipy, regex, and pandas for illustration.

A few changes are needed for the interactive_embedded_interpreter:
1. Link with :embedded_interpreter_all rather than :embedded_interpreter so we can enable DEEPBIND and use our custom loader.
2. Provide a pylibRoot path when constructing the InterpreterManager; the path is added to the embedded interpreter's sys.path. Typically we pass in the Python library root of a conda environment so the torch::deploy interpreter can find all installed packages.
3. Allow interactive_embedded_interpreter to execute a script, which eases recording the exploration of various Python libraries.

Differential Revision: D31587278

NOTE FOR REVIEWERS: This PR has internal Facebook specific changes or comments, please review them on Phabricator!

@pytorch-probot

pytorch-probot bot commented Oct 12, 2021

CI Flow Status

⚛️ CI Flow

Ruleset - Version: v1
Ruleset - File: https://github.com/pytorch/pytorch/blob/ed9f31168b515c1461cc398a97c81588a9dd8f76/.github/generated-ciflow-ruleset.json
PR ciflow labels: ciflow/default

| Workflow | Labels (bold enabled) | Status |
| --- | --- | --- |
| **Triggered Workflows** | | |
| linux-bionic-py3.6-clang9 | ciflow/all, ciflow/cpu, ciflow/default, ciflow/linux, ciflow/noarch, ciflow/xla | ✅ triggered |
| linux-vulkan-bionic-py3.6-clang9 | ciflow/all, ciflow/cpu, ciflow/default, ciflow/linux, ciflow/vulkan | ✅ triggered |
| linux-xenial-cuda11.3-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/default, ciflow/linux | ✅ triggered |
| linux-xenial-py3.6-clang7-asan | ciflow/all, ciflow/cpu, ciflow/default, ciflow/linux, ciflow/sanitizers | ✅ triggered |
| linux-xenial-py3.6-clang7-onnx | ciflow/all, ciflow/cpu, ciflow/default, ciflow/linux, ciflow/onnx | ✅ triggered |
| linux-xenial-py3.6-gcc5.4 | ciflow/all, ciflow/cpu, ciflow/default, ciflow/linux | ✅ triggered |
| linux-xenial-py3.6-gcc7-bazel-test | ciflow/all, ciflow/bazel, ciflow/cpu, ciflow/default, ciflow/linux | ✅ triggered |
| win-vs2019-cpu-py3 | ciflow/all, ciflow/cpu, ciflow/default, ciflow/win | ✅ triggered |
| win-vs2019-cuda11.3-py3 | ciflow/all, ciflow/cuda, ciflow/default, ciflow/win | ✅ triggered |
| **Skipped Workflows** | | |
| libtorch-linux-xenial-cuda10.2-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/libtorch, ciflow/linux | 🚫 skipped |
| libtorch-linux-xenial-cuda11.3-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/libtorch, ciflow/linux | 🚫 skipped |
| linux-bionic-cuda10.2-py3.9-gcc7 | ciflow/all, ciflow/cuda, ciflow/linux, ciflow/slow | 🚫 skipped |
| linux-xenial-cuda10.2-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/linux, ciflow/slow | 🚫 skipped |
| parallelnative-linux-xenial-py3.6-gcc5.4 | ciflow/all, ciflow/cpu, ciflow/linux | 🚫 skipped |
| periodic-libtorch-linux-xenial-cuda11.1-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/libtorch, ciflow/linux, ciflow/scheduled | 🚫 skipped |
| periodic-linux-xenial-cuda10.2-py3-gcc7-slow-gradcheck | ciflow/all, ciflow/cuda, ciflow/linux, ciflow/scheduled, ciflow/slow, ciflow/slow-gradcheck | 🚫 skipped |
| periodic-linux-xenial-cuda11.1-py3.6-gcc7 | ciflow/all, ciflow/cuda, ciflow/linux, ciflow/scheduled | 🚫 skipped |
| periodic-win-vs2019-cuda11.1-py3 | ciflow/all, ciflow/cuda, ciflow/scheduled, ciflow/win | 🚫 skipped |
| puretorch-linux-xenial-py3.6-gcc5.4 | ciflow/all, ciflow/cpu, ciflow/linux | 🚫 skipped |

You can add a comment to the PR and tag @pytorchbot with the following commands:
# ciflow rerun, "ciflow/default" will always be added automatically
@pytorchbot ciflow rerun

# ciflow rerun with additional labels "-l <ciflow/label_name>", which is equivalent to adding these labels manually and triggering the rerun
@pytorchbot ciflow rerun -l ciflow/scheduled -l ciflow/slow

For more information, please take a look at the CI Flow Wiki.

@facebook-github-bot
Contributor

facebook-github-bot commented Oct 12, 2021

🔗 Helpful links

💊 CI failures summary and remediations

As of commit ed9f311 (more details on the Dr. CI page):



🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See GitHub Actions build Lint / clang-tidy (1/1)

Step: "Check for warnings" (full log | diagnosis details | 🔁 rerun)

2021-10-13T07:31:58.5976774Z Processing 2 clang-tidy jobs
2021-10-13T07:31:58.5978497Z /__w/pytorch/pytorch/torch/csrc/deploy/interactive_embedded_interpreter.cpp:6:10: error: 'glog/logging.h' file not found [clang-diagnostic-error]
2021-10-13T07:31:58.5979602Z #include <glog/logging.h>
2021-10-13T07:31:58.5980252Z          ^
2021-10-13T07:31:58.5980734Z Warnings detected!
2021-10-13T07:31:58.5981247Z Summary:
2021-10-13T07:31:58.5982268Z [clang-diagnostic-error] occurred 1 times
2021-10-13T07:31:58.5983149Z     /__w/pytorch/pytorch/torch/csrc/deploy/interactive_embedded_interpreter.cpp:6
2021-10-13T07:31:58.5983855Z 
2021-10-13T07:31:58.5984604Z Processing 38 clang-tidy jobs
2021-10-13T07:31:58.5987131Z /__w/pytorch/pytorch/torch/csrc/deploy/interactive_embedded_interpreter.cpp:6:10: error: 'glog/logging.h' file not found [clang-diagnostic-error]
2021-10-13T07:31:58.5988165Z #include <glog/logging.h>
2021-10-13T07:31:58.5989819Z          ^
2021-10-13T07:31:58.5990413Z Warnings detected!
2021-10-13T07:31:58.5990956Z Summary:
2021-10-13T07:31:58.5992073Z [clang-diagnostic-error] occurred 1 times
2021-10-13T07:31:58.5993178Z     /__w/pytorch/pytorch/torch/csrc/deploy/interactive_embedded_interpreter.cpp:6
2021-10-13T07:31:58.5993900Z 
2021-10-13T07:31:58.5994643Z Please fix the above clang-tidy warnings.
2021-10-13T07:31:58.6025078Z ##[error]Process completed with exit code 1.
2021-10-13T07:31:58.6099299Z ##[group]Run pytorch/add-annotations-github-action@master

1 failure not recognized by patterns:

| Job | Step | Action |
| --- | --- | --- |
| CircleCI pytorch_linux_xenial_py3_clang5_mobile_build | Build | 🔁 rerun |

❄️ 1 failure tentatively classified as flaky

but reruns have not yet been triggered to confirm:

See CircleCI build pytorch_ios_12_5_1_x86_64_build (1/1)

Step: "Build" (full log | diagnosis details | 🔁 rerun) ❄️

Failed to recurse into submodule path 'third_party/onnx-tensorrt'
remote: Total 0 (delta 0), reused 0 (delta 0), pack-reused 0
remote: Enumerating objects: 610, done.
remote: Counting objects: 100% (610/610), done.
remote: Total 370 (delta 199), reused 107 (delta 23), pack-reused 0
Receiving objects: 100% (370/370), 1.19 MiB | 13.05 MiB/s, done.
Resolving deltas: 100% (199/199), completed with 151 local objects.
From ssh://github.com/facebook/zstd
 * branch            aec56a52fbab207fc639a1937d1e708a282edca8 -> FETCH_HEAD
Submodule path 'third_party/zstd': checked out 'aec56a52fbab207fc639a1937d1e708a282edca8'
Failed to recurse into submodule path 'third_party/onnx-tensorrt'


Exited with code exit status 1


This comment was automatically generated by Dr. CI (expand for details). Follow this link to opt out of these comments for your pull requests.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

Click here to manually regenerate this comment.

Member

@suo suo left a comment


nice! Some small nits but otherwise looks good

- InterpreterManager::InterpreterManager(size_t nInterp) : resources_(nInterp) {
+ InterpreterManager::InterpreterManager(
+     size_t nInterp,
+     const std::string& pylibRoot)
Member


This should probably be called pythonPath or something similar, which is a more conventional way of describing what this string is for.

- explicit InterpreterManager(size_t nInterp = 2);
+ explicit InterpreterManager(
+     size_t nInterp = 2,
+     const std::string& pylibRoot = "");
Member


c10::optional<std::string> is, I think, more explicit than using an empty string as the null value.
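The reviewer's point can be illustrated in plain Python, with `typing.Optional` standing in for `c10::optional` (the function name here is hypothetical):

```python
from typing import List, Optional


def initial_sys_path_entries(python_path: Optional[str] = None) -> List[str]:
    """With Optional, "no path supplied" (None) is distinct from a
    deliberately empty path (""), two cases an empty-string sentinel
    would silently conflate."""
    return [python_path] if python_path is not None else []
```

The same distinction holds for c10::optional in C++: `nullopt` means "not provided", while `""` remains a legal, explicit value.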

shunting314 added a commit that referenced this pull request Oct 12, 2021
…rd-party libraries

Pull Request resolved: #66512
ghstack-source-id: 140415316

Differential Revision: [D31587278](https://our.internmc.facebook.com/intern/diff/D31587278/)
shunting314 added a commit that referenced this pull request Oct 12, 2021
…rd-party libraries

Pull Request resolved: #66512
ghstack-source-id: 140417244

Differential Revision: [D31587278](https://our.internmc.facebook.com/intern/diff/D31587278/)
@shunting314
Contributor Author

The clang-tidy job fails on torch/csrc/deploy/interactive_embedded_interpreter.cpp, complaining that <glog/logging.h> is not found. I guess we need to add interactive_embedded_interpreter.cpp to the OSS build to fix that; I have a separate PR for that.

shunting314 added a commit that referenced this pull request Oct 13, 2021
…rd-party libraries
suo mentioned this pull request Oct 13, 2021
facebook-github-bot deleted the gh/shunting314/5/head branch October 17, 2021 14:27
wconstab pushed a commit that referenced this pull request Oct 20, 2021
…rd-party libraries (#66512)

Summary:
Pull Request resolved: #66512


Test Plan:
Install numpy, scipy, regex, and pandas in the conda environment or directly on the machine. Suppose /home/shunting/.local/lib/python3.8/site-packages/ is the root path for the installed libraries.

- buck run mode/opt :interactive_embedded_interpreter -- --pylib_root=/home/shunting/.local/lib/python3.8/site-packages/ --pyscript=~/p7/iei_examples/try_regex.py
Content of try_regex.py:
```python
import regex

print(regex)
pat = r'(.+)\1'
print(regex.match(pat, "abcabc"))
print(regex.match(pat, "abcba"))

print("bye")
```

- buck run mode/opt :interactive_embedded_interpreter -- --pylib_root=/home/shunting/.local/lib/python3.8/site-packages/ --pyscript=~/p7/iei_examples/try_numpy.py
Content of try_numpy.py:
```python
import numpy as np
print(f"numpy at {np}")
a = np.random.rand(2, 3)
b = np.random.rand(3, 2)
print(np.matmul(a, b))
```

- buck run mode/opt :interactive_embedded_interpreter -- --pylib_root=/home/shunting/.local/lib/python3.8/site-packages/ --pyscript=~/p7/iei_examples/try_scipy.py
Content of try_scipy.py:
```python
import numpy as np
from scipy import linalg

mat_a = np.array([[1, 0, 0, 0], [1, 1, 0, 0], [1, 2, 1, 0], [1, 3, 3, 1]])
mat_b = linalg.inv(mat_a)
print(mat_b)
```

- buck run mode/opt :interactive_embedded_interpreter -- --pylib_root=/home/shunting/.local/lib/python3.8/site-packages/ --pyscript=~/p7/iei_examples/try_pandas.py
Content of try_pandas.py:
```python
import pandas as pd
print(f"pandas at {pd}")
df = pd.DataFrame({
  "col1": [1, 2, 3, 4],
  "col2": [2, 4, 8, 16],
})
print(df)
```

Reviewed By: suo

Differential Revision: D31587278

fbshipit-source-id: c0b031c1fa71a77cdfeba1d04514f83127f79012