Skip buffer in dense update #148533
Conversation
🔗 Helpful Links: see artifacts and rendered test results at hud.pytorch.org/pr/148533
Note: links to docs will display an error until the docs builds have completed.
❌ 1 New Failure, 4 Unrelated Failures as of commit bd38921 with merge base 65dbc3b:
NEW FAILURE - the following job has failed.
BROKEN TRUNK - the following jobs failed but were present on the merge base; 👉 rebase onto the `viable/strict` branch to avoid these failures.
UNSTABLE - the following job is marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D69553929
Force-pushed from 1c09477 to 4fdcf4d.
@pytorchbot label "topic: not user facing"
Force-pushed from 4fdcf4d to 640ad27.
Summary: As titled. PyTorch module buffers will not be published in delta publishing. In Quinn's previous diff, constant type annotations were introduced. In addition to skipping constants, we also need to skip a buffer if it is not found in the user-provided delta weights list.
Test Plan: https://docs.google.com/document/d/1wiqUo0PyZ4g6YJIJlL_LE084ZEuE74iu74gZjqGGjWY/edit?tab=t.0#heading=h.dby6cwiw1xrn
Reviewed By: 22quinn
Differential Revision: D69553929
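The skip-buffer rule described in the summary can be sketched in plain Python. This is a minimal, hypothetical sketch: the names `select_delta_state` and `delta_weight_names` are illustrative, and plain dicts stand in for the module's `named_parameters()` / `named_buffers()` views; the actual change lives inside the internal delta-publishing path, not in any public PyTorch API.

```python
def select_delta_state(named_parameters, named_buffers, delta_weight_names):
    """Return the state entries to publish in a delta update.

    Parameters are always candidates; a buffer is included only when the
    user-provided delta weights list names it, mirroring the skip-buffer
    behavior this PR adds (constants were already skipped in a prior diff).
    """
    selected = {}
    for name, tensor in named_parameters.items():
        selected[name] = tensor
    for name, tensor in named_buffers.items():
        # Skip buffers absent from the user-provided delta weights list.
        if name in delta_weight_names:
            selected[name] = tensor
    return selected

# Hypothetical module state: two parameters and one buffer.
params = {"linear.weight": [[1.0]], "linear.bias": [0.0]}
buffers = {"running_mean": [0.0]}

out = select_delta_state(params, buffers, {"linear.weight", "linear.bias"})
# "running_mean" is a buffer not named in the delta weights list, so it is
# skipped; only the two parameters are published.
```

Under this sketch, a stateful buffer such as a running statistic stays out of the published delta unless the caller explicitly lists it.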
Force-pushed from 640ad27 to 157705f.
Force-pushed from 157705f to c84db7a.
Force-pushed from c84db7a to 896abd9.
Force-pushed from 896abd9 to f295549.
Force-pushed from f295549 to 0df6d23.
Force-pushed from 0df6d23 to be0a9c6.
Force-pushed from d5ac2f4 to 61ec87c.
Force-pushed from 61ec87c to 6c1ff6b.
Force-pushed from 6c1ff6b to a2f6516.
Force-pushed from 5d84204 to 2d1c30e.
Force-pushed from 2d1c30e to 2bcf342.
Force-pushed from 433a06a to 97689b1.
Force-pushed from 97689b1 to bd38921.
@pytorchbot merge -i (Initiating merge automatically since Phabricator Diff has merged, merging with -i because oss signals were bypassed internally)
Merge started. Your change will be merged while ignoring the following 5 checks: pull / win-vs2022-cpu-py3 / build; trunk / win-vs2022-cuda12.1-py3 / build; trunk / win-vs2022-cpu-py3 / build; trunk / libtorch-linux-focal-cuda12.4-py3.10-gcc9-debug / build; inductor / cuda12.4-py3.10-gcc9-sm86 / test (inductor_timm, 2, 2, linux.g5.4xlarge.nvidia.gpu). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.