Conversation

miladm (Collaborator) commented Aug 15, 2022

Downstream #pytorch/xla#3888

miladm requested a review from a team as a code owner August 15, 2022 08:49

facebook-github-bot (Contributor) commented Aug 15, 2022

✅ No Failures (0 Pending)

As of commit 24f11a3 (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

miladm self-assigned this Aug 15, 2022
samdow added the triaged label ("This issue has been looked at by a team member, and triaged and prioritized into an appropriate module") Aug 15, 2022
miladm requested a review from Krovatkin August 15, 2022 17:43

miladm (Collaborator, Author) commented Aug 15, 2022

@Krovatkin, can I get a stamp here? I can remove the pin once you've approved the change. Thanks. I will do the same for the downstream PR in PT/XLA.

miladm added the lazy (Lazy Tensor work items) label Aug 15, 2022

Krovatkin (Contributor) left a comment

:shipit:

Krovatkin (Contributor) commented Aug 16, 2022

@miladm, we will need to keep the pin to make sure we don't break pytorch/master. Looks like you need to rebase the PR to keep your pin. cc @JackCaoG

JackCaoG (Collaborator) commented Aug 16, 2022

I think we can leave the pin in this PR as it is and merge this PR. Then remove the pin from the PT/XLA PR and merge that PR. After midnight, the bot should update PyTorch's XLA pin to the new master commit, and then we are good.
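
An illustrative sketch of the nightly pin bump described above, assuming the bot simply rewrites the pin file to the latest pytorch/xla master commit (the real bot's implementation may differ; the file path is the one shown in the CI logs below):

# Hypothetical nightly job: point the CI pin at the latest pytorch/xla master commit.
NEW_SHA=$(git ls-remote https://github.com/pytorch/xla.git refs/heads/master | cut -f1)
echo "$NEW_SHA" > .github/ci_commit_pins/xla.txt
git add .github/ci_commit_pins/xla.txt
git commit -m "Update XLA commit pin to $NEW_SHA"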

miladm (Collaborator, Author) commented Aug 17, 2022

@pytorchbot rebase

miladm (Collaborator, Author) commented Aug 17, 2022

Thanks for the comments. Will merge after rebase is done.

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a rebase job. Check the current status here.

pytorchmergebot (Collaborator) commented:

Rebase failed due to Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/master pull/83415/head returned non-zero exit code 1

Rebasing (1/2)
Rebasing (2/2)
Auto-merging .github/ci_commit_pins/xla.txt
CONFLICT (content): Merge conflict in .github/ci_commit_pins/xla.txt
error: could not apply d1e4f2c3fd... temp change to downstream torch_xla branch
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
Could not apply d1e4f2c3fd... temp change to downstream torch_xla branch

Raised by https://github.com/pytorch/pytorch/actions/runs/2872404324
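
For illustration, one way to resolve this particular conflict locally before continuing the rebase (note: during a rebase, --theirs refers to the commit being replayed, here the PR's temporary pin change):

git checkout --theirs .github/ci_commit_pins/xla.txt  # keep the pin from the PR's commit
git add .github/ci_commit_pins/xla.txt
git rebase --continue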

miladm (Collaborator, Author) commented Aug 18, 2022

@pytorchbot merge

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a merge job. Check the current status here.
The merge job was triggered without a flag. This means that your change will be merged once all checks on your PR have passed (ETA: 0-4 Hours). If this is not the intended behavior, feel free to use some of the other merge options in the wiki.
Please reach out to the PyTorch DevX Team with feedback or questions!

pytorchmergebot (Collaborator) commented:

Merge failed
Reason: Refusing to merge as mandatory check(s) Lint failed for rule superuser
Raised by https://github.com/pytorch/pytorch/actions/runs/2880271056

miladm (Collaborator, Author) commented Aug 18, 2022

@pytorchbot merge

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a merge job. Check the current status here.
The merge job was triggered without a flag. This means that your change will be merged once all checks on your PR have passed (ETA: 0-4 Hours). If this is not the intended behavior, feel free to use some of the other merge options in the wiki.
Please reach out to the PyTorch DevX Team with feedback or questions!

pytorchmergebot (Collaborator) commented:

Merge failed
Reason: Command git -C /home/runner/work/pytorch/pytorch merge --squash __pull-request-83415__init__ returned non-zero exit code 1

Auto-merging .github/ci_commit_pins/xla.txt
CONFLICT (content): Merge conflict in .github/ci_commit_pins/xla.txt
Squash commit -- not updating HEAD
Automatic merge failed; fix conflicts and then commit the result.

Raised by https://github.com/pytorch/pytorch/actions/runs/2883916542

miladm (Collaborator, Author) commented Aug 18, 2022

@pytorchbot rebase

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a rebase job. Check the current status here.

pytorchmergebot (Collaborator) commented:

Rebase failed due to Command git -C /home/runner/work/pytorch/pytorch rebase refs/remotes/origin/master pull/83415/head returned non-zero exit code 1

Rebasing (1/3)
Rebasing (2/3)
Auto-merging .github/ci_commit_pins/xla.txt
CONFLICT (content): Merge conflict in .github/ci_commit_pins/xla.txt
error: could not apply d1e4f2c3fd... temp change to downstream torch_xla branch
hint: Resolve all conflicts manually, mark them as resolved with
hint: "git add/rm <conflicted_files>", then run "git rebase --continue".
hint: You can instead skip this commit: run "git rebase --skip".
hint: To abort and get back to the state before "git rebase", run "git rebase --abort".
Could not apply d1e4f2c3fd... temp change to downstream torch_xla branch

Raised by https://github.com/pytorch/pytorch/actions/runs/2883996473

miladm force-pushed the update_is_dynamic_api branch from a4a4d5d to 24f11a3 August 18, 2022 17:09

miladm (Collaborator, Author) commented Aug 18, 2022

@pytorchbot merge

pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a merge job. Check the current status here.
The merge job was triggered without a flag. This means that your change will be merged once all checks on your PR have passed (ETA: 0-4 Hours). If this is not the intended behavior, feel free to use some of the other merge options in the wiki.
Please reach out to the PyTorch DevX Team with feedback or questions!

github-actions (Contributor) commented:

Hey @miladm.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

facebook-github-bot pushed a commit that referenced this pull request Aug 19, 2022
Summary:
Downstream #pytorch/xla#3888

Pull Request resolved: #83415
Approved by: https://github.com/Krovatkin

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/72963bbae9b7f2a4f2e7c5fc84abdaa2f3552e73

Reviewed By: atalman

Differential Revision: D38852542

fbshipit-source-id: 14ddaf06b2cafa716403355e19521cd689e6cd0d

ezyang (Contributor) commented Aug 22, 2022

It doesn't seem like pytorchbot has successfully updated the pin since this landed.

janeyx99 (Contributor) commented:

@clee2000, looks like the hash update can't handle branch names yet / the git show does not parse the date correctly: https://github.com/pytorch/pytorch/runs/7946853704?check_suite_focus=true
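
For reference, a locale-stable way to read a commit's date, assuming the workflow shells out to git (the actual fix in #83865 may have taken a different route):

git show -s --format=%cI origin/master  # committer date in strict ISO 8601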

clee2000 (Contributor) commented:

Should be fixed with #83865, and the hash update from last night also went through: #83899.

Labels: cla signed, lazy, Merged, open source, triaged