re-export torch.optim._multi_tensor in torch/__init__.py #129095
base: main
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/129095. Note: links to docs will display an error until the docs builds have completed.

❌ 1 New Failure, 2 Unrelated Failures as of commit 4b6c259 with merge base d8db074:

- NEW FAILURE - the following job has failed:
- FLAKY - the following job failed but was likely due to flakiness present on trunk:
- UNSTABLE - the following job failed but was likely due to flakiness present on trunk and has been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D58792889
Summary: Pull Request resolved: pytorch#129095

- PR pytorch#127703 introduced a circular dependency: `torch/optim/__init__.py` imports `torch.optim._multi_tensor`, and `torch/optim/_multi_tensor/__init__.py` imports `torch.optim`. This seemed to work fine (green signals everywhere) but caused some internal test failures after it landed: an infinite recursion during import.
- PR pytorch#128875 attempted to fix this by removing the import from `torch/optim/__init__.py`. This also seemed to work fine (green signals everywhere, and the failing tests started passing), but a smaller number of tests started failing: unable to import `torch.optim._multi_tensor`.
- This diff re-introduces the import, but after `torch.optim` is fully initialized.

Test Plan: CI signals

Differential Revision: D58792889
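The fix described above, importing the submodule only after the parent package is fully initialized, can be sketched with toy modules. All names here are hypothetical stand-ins: `opt` plays the role of `torch.optim` and `opt.mt` the role of `torch.optim._multi_tensor`.

```python
# Hypothetical sketch: the submodule imports its parent, so the parent
# must finish defining its names before the submodule is imported.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "opt", "mt"))

# opt/__init__.py: define the package's names first, THEN import the
# submodule -- mirroring "re-introduce the import after torch.optim is
# fully initialized".
with open(os.path.join(root, "opt", "__init__.py"), "w") as f:
    f.write("Adam = 'adam-impl'\n")
    f.write("import opt.mt  # safe: opt's namespace is complete here\n")

# opt/mt/__init__.py: closes the cycle by importing the parent package.
with open(os.path.join(root, "opt", "mt", "__init__.py"), "w") as f:
    f.write("import opt\n")
    f.write("Adam = opt.Adam + '-multi'\n")

sys.path.insert(0, root)
import opt  # succeeds: the cycle is resolved by import order

print(opt.mt.Adam)  # -> adam-impl-multi
```

When `opt.mt` runs, `import opt` finds the partially initialized parent in `sys.modules`, and because `Adam` was defined before the submodule import, the lookup succeeds.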
Force-pushed from d4fe3fc to aaaff5f.
Force-pushed from aaaff5f to fbcd85c.
Force-pushed from fbcd85c to 4b6c259.
```diff
@@ -1987,6 +1987,7 @@ def _assert(condition, message):
     utils as utils,
     xpu as xpu,
 )
+import torch.optim._multi_tensor  # usort: skip
```
May need to add a comment here:

```suggestion
# needs to be before import torch.optim as optim to avoid circular dependencies
import torch.optim._multi_tensor  # usort: skip
```
Summary:

- PR "[BE][Easy] export explicitly imported public submodules" #127703 introduced a circular dependency: `torch/optim/__init__.py` imports `torch.optim._multi_tensor`, and `torch/optim/_multi_tensor/__init__.py` imports `torch.optim`. This seemed to work fine (green signals everywhere) but caused some internal test failures after it landed: an infinite recursion during import.
- PR "Remove circular import" #128875 attempted to fix this by removing the import from `torch/optim/__init__.py`. This also seemed to work fine: green signals everywhere, and the failing tests started passing, but a smaller number of tests started failing: unable to import `torch.optim._multi_tensor`.
- This diff re-introduces the import, but after `torch.optim` is fully initialized.

Test Plan: CI signals

Differential Revision: D58792889
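For contrast, here is a hypothetical sketch of the failure mode being fixed: if the parent package imports the submodule *before* its own names are defined, the circular import fails at import time. (In plain CPython this typically surfaces as an `AttributeError` on the partially initialized module; the infinite recursion mentioned above was specific to the internal environment. Module names `optb`/`optb.mt` are invented for this sketch.)

```python
# Hypothetical sketch of the broken ordering: "optb" imports its submodule
# before defining its own names, so the submodule sees a partially
# initialized parent.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "optb", "mt"))

# optb/__init__.py: imports the submodule FIRST -- too early.
with open(os.path.join(root, "optb", "__init__.py"), "w") as f:
    f.write("import optb.mt\n")
    f.write("Adam = 'adam-impl'\n")

# optb/mt/__init__.py: reads a name from the parent that does not exist yet.
with open(os.path.join(root, "optb", "mt", "__init__.py"), "w") as f:
    f.write("import optb\n")
    f.write("Adam = optb.Adam + '-multi'\n")

sys.path.insert(0, root)
cycle_failed = False
try:
    import optb
except AttributeError:  # optb.Adam is not defined when optb.mt executes
    cycle_failed = True

print("circular import failed:", cycle_failed)  # -> True
```

Moving `import optb.mt` below the definition of `Adam` (or, as in this PR, deferring the import until the parent is fully initialized) makes the cycle harmless.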