Refactor make_fx to better support hop subgraph tracing #125267
Conversation
CI status as of commit 2f6e5d7 (merge base b522e65): 8 new failures; 2 failures unrelated, likely flakiness already present on trunk.
This is to replace #122972. We extract the state of make_fx into a _MakefxTracer class. This allows us to create a new make_fx tracer when tracing subgraphs. The entry point for tracing subgraphs is reenter_make_fx(fn). Test Plan: Existing tests; also removes a related workaround on torch function.
# All context managers and their states should be initialized before tracing based on the inputs
# and configurations.
# After tracing, thier states should be cleaned except for shape_env.
shape_env is leaked out because of t.shape_env = self.fake_tensor_mode.shape_env
Can we get some tests of this? Or are you relying purely on the existing test suite?
Yeah, this PR is supposed to be a code-movement-only change. Existing tests should suffice. Any suspicious places you want to test?
Ok, looks fine to me.
# All context managers and their states should be initialized before tracing based on the inputs
# and configurations.
# After tracing, thier states should be cleaned except for shape_env.
Typo: "their", not "thier".
Adds trace_subgraph to _MakefxTracer; the motivation is in #122972. Also migrates all existing usage of reenter_make_fx to the new sub-tracer. Previously, the torch function mode for creating torch_fn metadata wouldn't be re-entered when we're in ProxyTensorMode (since it's inside of __torch_function__). This PR reconstructs the torch function mode based on the parent tracer's config and re-enters it so the metadata is shown in the graph. **Test Plan:** Existing tests. We have a bunch of make_fx tests for cond, map and while_loop. Also removes the expected failure for torch_fn, since reenter_make_fx is able to re-construct torch function modes. Also fixes #124643. Pull Request resolved: #125363 Approved by: https://github.com/Chillee ghstack dependencies: #125267
Stack from ghstack (oldest at bottom):
Code movement + minor rewrites. We extract the state of make_fx out and encapsulate it into a _MakefxTracer class. This allows us to create a new make_fx tracer when tracing subgraphs; the actual logic for tracing subgraphs is in the next diff.
Test Plan:
Existing tests.