Conversation

linbinyu
Contributor

Summary:
This diff moves the XNNPACK Buck build to a shared build file in xplat/caffe2/third_party so that it can be reused by OSS Buck CI in the future. There is no functionality change.

Background: as we move to a GitHub-first workflow, we want the community to receive more signals from our internal build. XNNPACK is part of the PyTorch mobile build, so we want to add it to OSS Buck CI.

How it works: all XNNPACK targets are defined in xplat/caffe2/third_party/xnnpack_defs.bzl. When we build internally, the XNNPACK source is still at xplat/third-party/XNNPACK and we load that bzl file from xplat/third-party/XNNPACK/BUCK, so everything works as before. In the OSS build, XNNPACK is a submodule in xplat/caffe2/third_party and we load the same bzl file from pytorch/third_party/BUILD.buck.
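To illustrate the shared-definitions pattern described above, the two BUCK files could each load the same bzl file roughly like this. This is only a sketch: the macro name `define_xnnpack_targets` and its parameters are assumptions, not the actual contents of xnnpack_defs.bzl.

```starlark
# Sketch only -- the macro name and load labels are hypothetical.

# xplat/third-party/XNNPACK/BUCK (internal build):
load("//xplat/caffe2/third_party:xnnpack_defs.bzl", "define_xnnpack_targets")
define_xnnpack_targets()

# pytorch/third_party/BUILD.buck (OSS build) would load the same file,
# just via a different label, so both builds share one set of target
# definitions:
#   load(":xnnpack_defs.bzl", "define_xnnpack_targets")
#   define_xnnpack_targets()
```

The point of the design is that the target definitions live in exactly one file; only the thin BUCK entry points differ between the internal tree and the OSS checkout.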

Wrapper Generation: the wrapper generation script is moved to xplat/caffe2/third_party/generate-xnnpack-wrappers.py. It takes an optional argument for the path to XNNPACK (the paths differ between the internal and OSS builds). The wrapper files are always generated in the parent folder of the XNNPACK source, but src_defs.bzl and wrapper_defs.bzl always live in xplat/caffe2/third_party/ (they are now called xnnpack_src_defs.bzl and xnnpack_wrapper_defs.bzl). For the OSS build, this script is only used in CI, and the generated files are not committed.
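The optional-path behavior described above could be sketched as follows. The argument name, default, and helper function here are assumptions for illustration; the real generate-xnnpack-wrappers.py may differ.

```python
# Hypothetical sketch of the optional XNNPACK-path argument; not the real script.
import argparse
import os


def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Generate XNNPACK wrapper files (sketch)")
    # Optional path to the XNNPACK source; the default differs between the
    # internal tree (xplat/third-party/XNNPACK) and the OSS submodule.
    parser.add_argument("xnnpack_path", nargs="?",
                        default="xplat/third-party/XNNPACK")
    return parser.parse_args(argv)


def wrapper_output_dir(xnnpack_path):
    # Wrapper files are always generated in the parent folder of the
    # XNNPACK source, regardless of where that source lives.
    return os.path.dirname(os.path.normpath(xnnpack_path))
```

Under these assumptions, an OSS CI invocation could pass the submodule path explicitly, and `wrapper_output_dir("third_party/XNNPACK")` would place the wrappers in `third_party/`, matching the layout described above.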

Next Steps: Once landed, I will try to build XNNPACK in OSS Buck using xnnpack_defs.bzl. Meta-specific symbols need to be resolved, so some refactoring of the build file will follow.

Test Plan: buck build xplat/third-party/XNNPACK:XNNPACK

Differential Revision: D36529332

@facebook-github-bot
Contributor

facebook-github-bot commented May 20, 2022

❌ 1 New Failure

As of commit 31aac6f (more details on the Dr. CI page):

  • 1/1 failures introduced in this PR

🕵️ 1 new failure recognized by patterns

The following CI failures do not appear to be due to upstream breakages

See GitHub Actions build pull / linux-xenial-cuda11.3-py3.7-gcc7 / test (default, 2, 4, linux.4xlarge.nvidia.gpu) (1/1)

Step: "Test"

2022-05-28T00:40:24.3304981Z FAIL [3.007s]: test_mem_leak (__main__.TestProfilerCUDA)
2022-05-28T00:40:24.2454094Z STAGE:2022-05-28 00:40:24 5176:5176 ActivityProfilerController.cpp:300] Completed Stage: Collection
2022-05-28T00:40:24.2629656Z ok (0.069s)
2022-05-28T00:40:24.2657814Z   test_datapipe_with_record_function_fork (__main__.TestRecordFunction) ... STAGE:2022-05-28 00:40:24 5176:5176 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
2022-05-28T00:40:24.3086616Z STAGE:2022-05-28 00:40:24 5176:5176 ActivityProfilerController.cpp:300] Completed Stage: Collection
2022-05-28T00:40:24.3242722Z ok (0.061s)
2022-05-28T00:40:24.3274350Z   test_record_function (__main__.TestRecordFunction) ... STAGE:2022-05-28 00:40:24 5176:5176 ActivityProfilerController.cpp:294] Completed Stage: Warm Up
2022-05-28T00:40:24.3290426Z STAGE:2022-05-28 00:40:24 5176:5176 ActivityProfilerController.cpp:300] Completed Stage: Collection
2022-05-28T00:40:24.3303194Z ok (0.006s)
2022-05-28T00:40:24.3303967Z 
2022-05-28T00:40:24.3304358Z ======================================================================
2022-05-28T00:40:24.3304981Z FAIL [3.007s]: test_mem_leak (__main__.TestProfilerCUDA)
2022-05-28T00:40:24.3305772Z Checks that there's no memory leak when using profiler with CUDA
2022-05-28T00:40:24.3306507Z ----------------------------------------------------------------------
2022-05-28T00:40:24.3307319Z Traceback (most recent call last):
2022-05-28T00:40:24.3308316Z   File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 1287, in wrap_fn
2022-05-28T00:40:24.3308734Z     return fn(self, *args, **kwargs)
2022-05-28T00:40:24.3309084Z   File "test_profiler.py", line 69, in test_mem_leak
2022-05-28T00:40:24.3309943Z     msg='memory usage is increasing, {}'.format(str(last_rss)))
2022-05-28T00:40:24.3310898Z AssertionError: False is not true : memory usage is increasing, deque([2066194432, 2066272256, 2066276352, 2066280448, 2067730432], maxlen=5)
2022-05-28T00:40:24.3311481Z 
2022-05-28T00:40:24.3311827Z ----------------------------------------------------------------------

This comment was automatically generated by Dr. CI.

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D36529332


@linbinyu force-pushed the export-D36529332 branch from d90b630 to ba11db3 on May 20, 2022 at 19:45

@linbinyu force-pushed the export-D36529332 branch from 8839f42 to 8db4e79 on May 26, 2022 at 23:57

@linbinyu force-pushed the export-D36529332 branch from 49f913a to 1e40065 on May 27, 2022 at 22:41
…rch#77941)

Summary:
Pull Request resolved: pytorch#77941

**Why**: We want to expose more internal build signals to OSS. XNNPACK is part of the PyTorch mobile build, so it needs to be included in OSS Buck CI. By moving these BUCK targets to a file under the caffe2 folder, we can use them in the OSS Buck build (see pytorch#76480).

Test Plan:
- buck build xplat/third-party/XNNPACK:XNNPACK
- Tested wrapper generation script for internal build

Reviewed By: kimishpatel, malfet, seemethere

Differential Revision: D36529332

fbshipit-source-id: dfcc723d55ab335a77f5c914513daba60388ad44

@linbinyu force-pushed the export-D36529332 branch from 1e40065 to 31aac6f on May 27, 2022 at 22:47
@facebook-github-bot
Contributor

@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@github-actions
Contributor

Hey @linbinyu.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

facebook-github-bot pushed a commit that referenced this pull request May 28, 2022
Summary:
Pull Request resolved: #77941

Differential Revision: D36529332

fbshipit-source-id: bfe77bd753336f788e11754fddd3886b0e308650
@suo
Member

suo commented May 28, 2022

@pytorchbot revert -m "broke buck builds on master, see https://hud.pytorch.org/pytorch/pytorch/commit/b8b46f932b41b5b5969d49fbe5ea8d2d5b45a1e3" -c nosignal

@pytorchmergebot
Collaborator

Reverting PR 77941 failed due to This PR has internal changes and must be landed via Phabricator
Raised by https://github.com/pytorch/pytorch/actions/runs/2401708089

@facebook-github-bot
Contributor

@pytorchbot revert -m="Diff Reverted" -c="ghfirst"

This Pull Request has been reverted by a revert inside Meta. To re-land this change, please open another pull request, assign the same reviewers, fix the CI failures that caused the revert, and make sure that the failing CI runs on the PR by applying the proper ciflow label (e.g., ciflow/trunk).

@pytorch-bot

pytorch-bot bot commented May 29, 2022

Revert unsuccessful: please retry the command and provide a revert reason, e.g. @pytorchbot revert -m="this breaks mac tests on trunk" -c="ignoredsignal". See the wiki for more details on the commands.

@suo
Member

suo commented May 29, 2022

@pytorchbot revert -m="Diff reverted due to breaking buck internal builds" -c="ghfirst"

@pytorchmergebot
Collaborator

Reverting PR 77941 failed due to This PR has internal changes and must be landed via Phabricator
Raised by https://github.com/pytorch/pytorch/actions/runs/2403011256

@malfet
Contributor

malfet commented May 30, 2022

This change was reverted 2 days ago, but the revert has not made it to the fbsync branch yet.
cc: @bigfootjon

malfet added a commit that referenced this pull request May 30, 2022
This reverts commit b8b46f9.

This change was reverted internally but has not been propagated to
either the `master` or the `fbsync` branch yet.
@malfet
Contributor

malfet commented May 30, 2022

Manually reverted as 472d67a
