Conversation

pytorch-bot bot commented Apr 18, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/124398

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 945073e with merge base e16f1ee:

BROKEN TRUNK - The following job failed but was already failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@bdhirsh bdhirsh requested a review from anijain2305 April 18, 2024 16:58
@ezyang ezyang requested a review from wanchaol April 20, 2024 00:09
return False

from torch.distributed._tensor.placement_types import Placement


@wanchaol Can you remind me why we needed this variable?


This boils down to the "custom fx type" that we've been discussing. Fundamentally, we need to inline the Placement and DeviceMesh as constants in the closure function (i.e. from_local), and put that closure function into the fx graph as a call_function node.

But without this PlacementClassVariable, dynamo would trace the DTensor's metadata construction (i.e. Shard(1)) as a UserDefinedClass/Object. UDTs are not constant variables (it's hard to tell whether a UDT is a ConstantVariable unless the user explicitly tells dynamo that the objects are constant). So PlacementClassVariable allows us to turn Shard(1) into a PlacementVariable (which is a constant variable), and then the sharding metadata can be inlined as a closure constant.
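A minimal, hypothetical sketch of the closure-inlining idea (not the actual dynamo implementation; `Placement`/`Shard`/`make_from_local` here are simplified stand-ins for the real `torch.distributed._tensor` types):

```python
# Hypothetical sketch of inlining placements/mesh as closure constants.
# NOT the actual dynamo implementation; Placement/Shard are stand-ins
# for torch.distributed._tensor.placement_types.

class Placement:
    pass

class Shard(Placement):
    def __init__(self, dim: int):
        self.dim = dim

def make_from_local(placements, device_mesh):
    # placements and device_mesh are captured as constants in the
    # closure. A tracer that knows they are constants can emit a single
    # call_function node for `from_local_fn`, instead of tracing
    # Shard(1)'s construction as a generic user-defined object.
    def from_local_fn(local_tensor):
        return (local_tensor, device_mesh, tuple(placements))
    return from_local_fn

from_local_fn = make_from_local([Shard(1)], device_mesh="mesh0")
result = from_local_fn("local_tensor")
```

The key point the comment makes is that `Shard(1)` must be recognized as a constant for this capture to be sound; that is what PlacementClassVariable provides.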

@wanchaol wanchaol left a comment

lgtm!

@bdhirsh
Contributor Author

bdhirsh commented Apr 22, 2024

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 22, 2024
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@pytorchmergebot
Collaborator

Merge failed

Reason: 1 mandatory check(s) failed. The first few are:

Dig deeper by viewing the failures on hud

Details for Dev Infra team Raised by workflow job

Failing merge rule: Core Maintainers

Fixes an error for torchtitan + internal




cc voznesenskym penguinwu EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng wenzhe-nrv jiayisunx chenyang78 kadeng chauhang

[ghstack-poisoned]
pytorchmergebot pushed a commit that referenced this pull request May 1, 2024
Fixes #125287

Fixes #124090, context on the issue

Pull Request resolved: #124399
Approved by: https://github.com/soulitzer
ghstack dependencies: #124398
pytorchmergebot pushed a commit that referenced this pull request May 1, 2024
…spatch__ (#123347)" (#125288)

Re-land of #123347.

The original PR broke internal because of a circular import caused by importing dynamo in the DTensor code. The new version uses `torch._dynamo_disable` to work around it.
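The workaround pattern can be sketched roughly as follows (a hypothetical illustration of a "disable tracing" decorator; names and mechanics are illustrative, not torch's actual API):

```python
import functools

# Illustrative sketch of a "disable tracing" decorator, similar in
# spirit to the workaround described above. Not torch's actual
# implementation: here we only tag the function so a tracer could
# skip compiling it and run it eagerly.

def dynamo_disable(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        return fn(*args, **kwargs)
    wrapper._dynamo_disabled = True
    return wrapper

@dynamo_disable
def dtensor_metadata_op(x):
    # Body runs eagerly; any heavy imports it needs can happen at call
    # time, avoiding a module-level circular import.
    return x * 2
```

Applying the decorator at the use site avoids importing dynamo at the top of the DTensor module, which is what caused the circular import.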

This reverts commit 9d88339.

Pull Request resolved: #125288
Approved by: https://github.com/ezyang, https://github.com/yanboliang, https://github.com/yoyoyocmu, https://github.com/anijain2305, https://github.com/fegin
ghstack dependencies: #124398, #124399, #124400
pytorch-bot bot pushed a commit that referenced this pull request May 3, 2024
petrex pushed a commit to petrex/pytorch that referenced this pull request May 3, 2024
@github-actions github-actions bot deleted the gh/bdhirsh/554/head branch June 4, 2024 02:01

5 participants