Back out "Temporary fix for remote gpu execution issue" #63983
Conversation
Summary: Test for fixes in D30545351. It should resolve the issue of the remote execution flag being populated incorrectly.
Test Plan: CI
Differential Revision: D30549443
fbshipit-source-id: 95abad314691851cddf8d2c1092579e8bb658091
⚛️ CI Flow Status
Ruleset - Version:
You can add a comment to the PR and tag @pytorchbot with the following commands:

# ciflow rerun; "ciflow/default" will always be added automatically
@pytorchbot ciflow rerun

# ciflow rerun with additional labels "-l <ciflow/label_name>", which is equivalent to adding these labels manually and triggering the rerun
@pytorchbot ciflow rerun -l ciflow/scheduled -l ciflow/slow

For more information, please take a look at the CI Flow Wiki.
💊 CI failures summary and remediations

As of commit 570cddb (more details on the Dr. CI page): ✅ None of the CI failures appear to be your fault 💚

❄️ 1 failure tentatively classified as flaky, but reruns have not yet been triggered to confirm:
This pull request was exported from Phabricator. Differential Revision: D30549443
This pull request has been merged in 52ebe7e.