
Conversation

@wconstab (Contributor) commented Nov 22, 2022

- Avoid fx graph rewrite that replaces certain ops with ones using
  triton random
- Keep track of replacement ops using triton random, so it is possible
  to not disable all replacements when using fallback_random

[ghstack-poisoned]
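The mechanism described above can be sketched as follows. This is a hedged illustration, not the actual inductor source: the names `replacements`, `random_replacements`, and `register_replacement` are hypothetical, while `fallback_random` is the config flag the PR refers to.

```python
# Hypothetical sketch of the tracking scheme described in this PR;
# the names below are illustrative, not the actual torch/_inductor code.

replacements = {}            # op -> replacement implementation
random_replacements = set()  # the subset whose replacements use triton random

def register_replacement(op, repl, uses_triton_random=False):
    # Record which replacement ops rely on triton's RNG so that
    # fallback_random can disable only those, not every rewrite.
    replacements[op] = repl
    if uses_triton_random:
        random_replacements.add(op)

def active_replacements(fallback_random: bool):
    # With fallback_random set, skip just the triton-random rewrites;
    # all other fx graph replacements remain enabled.
    if not fallback_random:
        return replacements
    return {op: fn for op, fn in replacements.items()
            if op not in random_replacements}
```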
pytorch-bot bot commented Nov 22, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/89515

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 18cee96:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.



def replace_fx(gm: torch.fx.GraphModule):
# Sometimes patch_functions() misses things already in the graph
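For context on the snippet above, here is a rough sketch of what a graph-level rewrite pass like this could look like. The traversal is illustrative only and assumes the replacement functions share the original ops' signatures:

```python
import torch.fx

replacements = {}  # op -> replacement implementation (illustrative)

def replace_fx(gm: torch.fx.GraphModule):
    # patch_functions() intercepts calls at trace time; ops that are
    # already nodes in the graph must be rewritten node by node here.
    for node in gm.graph.nodes:
        if node.op == "call_function" and node.target in replacements:
            node.target = replacements[node.target]
    gm.recompile()
    return gm
```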
wconstab (Contributor Author) commented:
What is AutogradMonkeypatch used for? Do I also need to update it? It's not obvious what it means that 'replacements' "is" the function being called on __torch_function__.

(see above, L27, not in this diff)

A reviewer (Contributor) replied:

That looks like a typo to me. I think it should be: func in replacements.

wconstab (Contributor Author) replied:

OK, makes sense. I'll PR that too, but separately in case it breaks something :)
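The suspected typo can be illustrated with a minimal `__torch_function__` mode. This sketch assumes AutogradMonkeypatch is a TorchFunctionMode subclass; the structure is illustrative, not the actual source:

```python
from torch.overrides import TorchFunctionMode

replacements = {}  # op -> replacement implementation (illustrative)

class AutogradMonkeypatch(TorchFunctionMode):
    def __torch_function__(self, func, types, args=(), kwargs=None):
        kwargs = kwargs or {}
        # Buggy form: `if replacements:` is true for any non-empty dict,
        # so it would redirect every intercepted call.
        # Intended membership test, per the review comment above:
        if func in replacements:
            return replacements[func](*args, **kwargs)
        return func(*args, **kwargs)
```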

@wconstab added the topic: not user facing label Nov 22, 2022
@wconstab requested a review from ngimel November 22, 2022 20:15
@wconstab (Contributor Author) commented:

@pytorchbot merge

@pytorch-bot added the ciflow/trunk label (trigger trunk jobs on your pull request) Nov 22, 2022
@pytorchmergebot (Collaborator) commented:
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

kulinseth pushed a commit to kulinseth/pytorch that referenced this pull request Dec 10, 2022

Pull Request resolved: pytorch#89515
Approved by: https://github.com/ngimel
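For readers landing here, a minimal usage sketch of the flag this PR makes more precise. `torch._inductor.config.fallback_random` is the real setting; the rest is an illustrative example assuming torch >= 2.0 for `torch.compile`:

```python
import torch
import torch._inductor.config as inductor_config

# Keep eager-mode RNG for ops that would otherwise be lowered to
# triton random, so seeded runs are comparable against eager.
inductor_config.fallback_random = True

def f(x):
    return torch.nn.functional.dropout(x, p=0.5)

torch.manual_seed(0)
eager_out = f(torch.ones(8))

torch.manual_seed(0)
compiled_out = torch.compile(f)(torch.ones(8))

print(torch.equal(eager_out, compiled_out))  # expected True with fallback_random
```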
@facebook-github-bot deleted the gh/wconstab/47/head branch June 8, 2023 19:17