
Conversation

@DrJessop (Contributor) commented Oct 8, 2025

Summary: Revert D84020397: [Cadence ops] Group-quantized embedding op

Differential Revision: D84186522

@pytorch-bot bot added the ci-no-td label Oct 8, 2025

pytorch-bot bot commented Oct 8, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/14915

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla bot added the CLA Signed label Oct 8, 2025

meta-codesync bot commented Oct 8, 2025

@DrJessop has exported this pull request. If you are a Meta employee, you can view the originating Diff in D84186522.


github-actions bot commented Oct 8, 2025

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example:
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@DrJessop merged commit f64c864 into pytorch:main Oct 8, 2025
132 of 157 checks passed

Labels

ci-no-td, CLA Signed (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed), fb-exported, meta-exported


2 participants