
[dynamo] Add guards for deterministic algos #96695

Closed
wants to merge 2 commits

Conversation

colesbury (Member) commented Mar 13, 2023

Inductor now falls back to eager mode for deterministic algos. Add guards in dynamo to check if the deterministic algos mode changes.

See #93537

cc @soumith @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire
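
A rough sketch of the situation the guard covers (the function, op, and shapes below are illustrative, not taken from this PR's test): once a function has been compiled under one deterministic-algorithms setting, flipping the setting should invalidate the cached graph rather than be silently ignored.

```python
import torch

@torch.compile
def accumulate(idx, values):
    # Illustrative op only; the real test exercises an op whose lowering
    # can differ between deterministic and non-deterministic modes.
    out = torch.zeros(16, device=values.device)
    out.index_put_((idx,), values, accumulate=True)
    return out

idx = torch.randint(0, 16, (1024,))
values = torch.randn(1024)

torch.use_deterministic_algorithms(False)
accumulate(idx, values)  # graph compiled while the mode is off

torch.use_deterministic_algorithms(True)
# The guard added by this PR checks the deterministic-algorithms mode, so
# this call triggers recompilation (and inductor can fall back to eager)
# instead of reusing the graph compiled above.
accumulate(idx, values)
```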

pytorch-bot (bot) commented Mar 13, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96695

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 4d784f5:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

colesbury changed the title from "[inductor] Fall back to eager mode for deterministic algos" to "[dynamo] Add guards for deterministic algos" on Mar 30, 2023
colesbury force-pushed the issue93537 branch 2 times, most recently from ef07a06 to 71c2849, on March 30, 2023 at 17:22
Commit message:

Inductor now falls back to eager mode for deterministic algos. Add
guards in dynamo to check if the deterministic algos mode changes.

See pytorch#93537
colesbury marked this pull request as ready for review on March 30, 2023 at 18:34
colesbury requested review from jansel and ngimel on March 30, 2023 at 18:34
colesbury (Member, Author)

@ngimel can you review this when you get a chance?

Code under review:

```python
r1 = fn(idx, values)
for _ in range(10):
    rn = fn(idx, values)
    assert (r1 == rn).all()
```
Collaborator (reviewer):

should it be self.assertEqual with 0 tolerance?

colesbury (Member, Author):

I'll update it.

Is there a functional difference? (I originally used assert because that's what was used by the nearby test cases.)

Collaborator:

No functional difference; TestCase.assertEqual provides a more detailed error message when it doesn't pass (maximum difference, index of the maximum difference, etc.).
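
For reference, a minimal sketch of the suggested change (assuming the snippet lives in a method of a torch.testing._internal.common_utils.TestCase subclass, where assertEqual accepts atol/rtol keyword arguments):

```python
r1 = fn(idx, values)
for _ in range(10):
    rn = fn(idx, values)
    # Exact comparison with zero tolerance; on failure, assertEqual reports
    # the maximum difference and where it occurs, unlike a bare assert.
    self.assertEqual(rn, r1, atol=0, rtol=0)
```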

colesbury (Member, Author)

@pytorchbot merge

pytorch-bot added the ciflow/trunk (Trigger trunk jobs on your pull request) label on Mar 31, 2023
pytorchmergebot (Collaborator)
Merge failed

Reason: This PR needs a label
If your changes are user facing and intended to be a part of release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

Raised by workflow job.

colesbury (Member, Author)

@pytorchbot merge

pytorchmergebot (Collaborator)
Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.
