[fx] make fx.wrap idempotent #104838

Closed
wants to merge 1 commit

Conversation

@suo (Member) commented Jul 9, 2023

Previously, if you called `torch.fx.wrap()` on the same thing twice, it would add two entries to `_wrapped_fns_to_patch`. Then, when tracing, the patcher would process them both, and on the second entry it would double-wrap the fn (e.g. `wrap(wrap(orig_fn))`).

This makes the wrapping observable after the trace: a `Patcher` instance normally reverts the wrapping once tracing finishes, but the double-wrapped function only goes from `wrap(wrap(orig_fn))` back to `wrap(orig_fn)`, leaving the name bound to a wrapper rather than the original function.

This happens to work for ordinary fx use (after all, the wrapper function behaves exactly like the original function), but it breaks `torch.package`, which does not expect to find a wrapper function in the graph.
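
A minimal sketch of the behavior described above (the helper `my_helper` and module `M` are illustrative names, not from this PR):

```python
import torch
import torch.fx

def my_helper(x):
    return x + 1

# Register the same function twice; before this PR, each call appended a
# separate entry to _wrapped_fns_to_patch.
torch.fx.wrap("my_helper")
torch.fx.wrap("my_helper")

class M(torch.nn.Module):
    def forward(self, x):
        return my_helper(x)

original = my_helper
torch.fx.symbolic_trace(M())

# Pre-fix: the tracer wrapped my_helper twice, but reverting only peeled off
# one layer, so this module-level name was left bound to a wrapper function.
# Post-fix: the duplicate wrap() registration is deduplicated and my_helper
# is restored to the original function after tracing.
print(my_helper is original)
```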

This PR adds a dictionary to deduplicate `wrap()` calls, ensuring that the patcher only patches each function once per (frame, fn) pair.
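
The deduplication can be pictured roughly like the sketch below; the helper name and the exact key are illustrative assumptions, not the code in this PR:

```python
import sys

# Illustrative sketch (not the actual torch.fx internals): key registrations
# by (id of the caller's globals, function name) so a repeated wrap() call
# from the same frame is a no-op and the patcher never double-wraps.
_wrapped_fns_to_patch = {}  # (id(globals), fn_name) -> globals dict to patch

def register_wrapped_fn(fn_name):
    frame_globals = sys._getframe(1).f_globals  # caller's frame, like fx.wrap
    key = (id(frame_globals), fn_name)
    if key not in _wrapped_fns_to_patch:
        _wrapped_fns_to_patch[key] = frame_globals
    # Calling this again with the same (frame, fn_name) pair changes nothing,
    # which is what makes wrap() idempotent.
```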

@pytorch-bot bot commented Jul 9, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/104838

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 9ffe96b:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

pytorch-bot bot added the release notes: package/deploy label Jul 9, 2023
suo requested review from albanD and bradleyhd, Jul 9, 2023 17:12
@suo (Member, Author) commented Jul 9, 2023

@pytorchbot merge

pytorch-bot bot added the ciflow/trunk label Jul 9, 2023
@pytorchmergebot (Collaborator) commented

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging
Check the merge workflow status here

@houseroad (Member) left a comment

Wondering when this will be synced to the internal codebase?

@suo (Member, Author) commented Jul 10, 2023

cc @osalpekar, can you comment on the time to sync this to internal? I'm not sure of the current diff-train SLA.

@osalpekar (Member) commented
> cc @osalpekar, can you comment on the time to sync this to internal? I'm not sure of the current diff-train SLA.

Since this was merged during the weekend, it will be imported in tonight's (07/10) diff-train import and landed tomorrow (07/11).

Labels
ciflow/trunk, Merged, release notes: package/deploy

5 participants