
Conversation

Contributor

@laithsakka laithsakka commented Mar 8, 2025

Stack from ghstack (oldest at bottom):

This PR removes the usage of guard_size_oblivious in vector_norm by inlining it into the runtime check.
This prevents any data-dependent error from ever appearing at the locations where guard_size_oblivious
used to exist; before this PR, those sites could potentially break. This is NOT BC-breaking and does not change semantics from eager.
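The inlining idea can be sketched roughly as follows. This is a minimal illustration, not the actual diff: the function name and the two conditions are hypothetical, and `sym_or` (`operator.or_`) stands in for the symbolic-friendly combinator, so the condition is evaluated as one runtime expression instead of being resolved at trace time by `guard_size_oblivious`.

```python
import operator

# `operator.or_` applies `|` without short-circuiting, so both operands stay
# inside a single expression; with symbolic bools this lets the whole
# condition be evaluated (or statically simplified) at once instead of
# forcing each sub-condition into a trace-time decision.
sym_or = operator.or_

# Hypothetical shape of the rewritten check (names are illustrative):
def takes_degenerate_path(numel_is_zero, ord_is_zero):
    # Before (sketch): if guard_size_oblivious(numel == 0) or guard_size_oblivious(ord == 0): ...
    # After: one combined runtime expression, no trace-time branch needed.
    return sym_or(numel_is_zero, ord_is_zero)
```

With plain Python bools (eager mode) this is just a logical OR, so behavior is unchanged; under tracing, the combined expression is what avoids the data-dependent error.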

cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @chauhang @amjames


pytorch-bot bot commented Mar 8, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/148809

Note: Links to docs will display an error until the docs builds have been completed.

❌ 6 New Failures, 2 Unrelated Failures

As of commit e13a5f0 with merge base 6841451:

NEW FAILURES - The following jobs have failed:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@laithsakka laithsakka changed the title remove gaurd_size_oblivious from vector_norm [DRAFT] guard_size_oblivious from vector_norm Mar 8, 2025
@laithsakka laithsakka changed the title [DRAFT] guard_size_oblivious from vector_norm [DRAFT] remove guard_size_oblivious from vector_norm Mar 8, 2025
jurgen-paul pushed a commit to jurgen-paul/pytorch.git.file that referenced this pull request Mar 19, 2025
@laithsakka laithsakka changed the title [DRAFT] remove guard_size_oblivious from vector_norm remove guard_size_oblivious from vector_norm Mar 25, 2025
@laithsakka laithsakka changed the title remove guard_size_oblivious from vector_norm Remove guard_size_oblivious from vector_norm Mar 25, 2025
@laithsakka laithsakka requested review from aorenste, avikchaudhuri, bdhirsh, bobrenjc93, ezyang and pianpwk and removed request for ezyang March 25, 2025 07:59
# When the whole expression is passed to evaluation in that case, we do not throw a
# data-dependent error or install a guard, because we can statically know the result
# is True before unpacking the symbols.
sym_or = operator.or_
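To see why `operator.or_` matters here compared with Python's `or` keyword: `or` short-circuits by calling `bool()` on its left operand, which on an unbacked symbolic value is exactly what forces a guard (and can raise a data-dependent error), whereas `operator.or_` dispatches to `__or__` and keeps both operands inside one expression. A toy stand-in class (purely illustrative, not a PyTorch API) makes the difference observable:

```python
import operator

class Tracked:
    """Toy stand-in for a symbolic bool that records how it is evaluated."""
    def __init__(self, value, log):
        self.value = value
        self.log = log
    def __or__(self, other):
        self.log.append("or_")          # stays inside one expression
        return Tracked(self.value or other.value, self.log)
    def __bool__(self):
        self.log.append("bool")         # analogous to forcing a guard
        return self.value

log = []
a, b = Tracked(True, log), Tracked(False, log)

combined = operator.or_(a, b)   # dispatches to __or__; bool() never called
assert log == ["or_"]

_ = a or b                      # Python `or` calls bool(a) immediately
assert "bool" in log
```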
Contributor

@bobrenjc93 bobrenjc93 Apr 4, 2025


given that @pianpwk is introducing sym_or in #150456 (comment), does it make sense to rebase on top of his PR?

Contributor Author

@laithsakka laithsakka Apr 4, 2025


He can rebase on top of mine and remove this, maybe. I put this PR up long ago and want to land it. @PiaN do you mind?

Contributor Author


also maybe add this comment on your PR on the new function once you rebase @pianpwk

@laithsakka
Contributor Author

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 4, 2025
laithsakka added a commit that referenced this pull request Apr 4, 2025
ghstack-source-id: b8c2472
Pull Request resolved: #148809
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@laithsakka
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
For more information see pytorch-bot wiki.

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.


@laithsakka
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@pytorchmergebot
Collaborator

Merge failed

Reason: 1 mandatory check(s) failed. The first few are:

Dig deeper by viewing the failures on hud

Details for Dev Infra team Raised by workflow job

Failing merge rule: Core Maintainers

@laithsakka
Contributor Author

@pytorchbot merge -i

timocafe pushed a commit to timocafe/pytorch that referenced this pull request Apr 16, 2025

Pull Request resolved: pytorch#148809
Approved by: https://github.com/bobrenjc93
amathewc pushed a commit to amathewc/pytorch that referenced this pull request Apr 17, 2025
Divigroup-RAP pushed a commit to Divigroup-RAP/PYTORCH that referenced this pull request Apr 22, 2025
@github-actions github-actions bot deleted the gh/laithsakka/115/head branch May 15, 2025 02:16