Add Half support for cummax, cummin, cumprod, logcumsumexp, and prod on CPU #112132
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/112132
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 Unrelated Failures)
As of commit bccbab0 with merge base 64f3260:
FLAKY - The following job failed but was likely due to flakiness present on trunk.
BROKEN TRUNK - The following job failed but was present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Force-pushed from 1cab21a to d40bf38 (Compare)
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Merge failed. Reason: Command
Details for Dev Infra team: Raised by workflow job
@CaoE - The topic for these changes should be "improvements". They very much are user-facing, since they extend operator coverage to a new dtype on a device. Please update the other PRs as well.
@cpuhrsch Do the other PRs also require `release notes: nn`?
@CaoE - I'm not sure. We could also use `release notes: intel` - seems appropriate?
@cpuhrsch I added a new label `release notes: half` for Half support.
@CaoE - Please use an existing label and delete the one you created; it won't work with our release notes setup. I think
@cpuhrsch Removed & deleted.
@pytorchbot rebase |
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here |
Successfully rebased
Force-pushed from 8634f63 to 1849115 (Compare)
@pytrchbot merge |
@pytorchbot rebase |
@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here |
Successfully rebased
Force-pushed from 1849115 to e03f02d (Compare)
@pytorchbot merge |
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Add Half support for cummax, cummin, cumprod, logcumsumexp, and prod on CPU (pytorch#112132)
Pull Request resolved: pytorch#112132
Approved by: https://github.com/cpuhrsch
Add Half support for cummax, cummin, cumprod, logcumsumexp, and prod on CPU.
cc @jgong5 @mingfeima @XiaobingSuper @sanchitintel @ashokei @jingxu10