
support for cadence platform QAT (#18746)

Merged
meta-codesync[bot] merged 1 commit into pytorch:main from rezaasjd:export-D99712539
Apr 16, 2026
Conversation


@rezaasjd rezaasjd commented Apr 7, 2026

Summary:

Adds support for the QAT variant of the Cadence quantizer, using fake quantizers as wrappers.

Reviewed By: mcremon-meta

Differential Revision: D99712539
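The "fake quantizer as wrapper" idea behind QAT can be illustrated with a minimal quantize-dequantize round trip. This is a hypothetical sketch of the general technique, not the actual Cadence quantizer code from this PR; the function name and parameters are illustrative.

```python
# Hypothetical sketch of fake quantization: the quantize-dequantize round
# trip that QAT wrappers insert around float ops so training sees the
# quantization error. NOT the Cadence quantizer implementation.

def fake_quantize(x: float, scale: float, zero_point: int,
                  qmin: int = -128, qmax: int = 127) -> float:
    """Simulate int8 quantization error while staying in float."""
    q = round(x / scale) + zero_point      # quantize to the integer grid
    q = max(qmin, min(qmax, q))            # clamp to the int8 range
    return (q - zero_point) * scale        # dequantize back to float

# In-range values snap to the nearest representable level;
# out-of-range values saturate at the clamp boundary.
print(fake_quantize(5.03, scale=0.1, zero_point=0))   # ~5.0
print(fake_quantize(20.0, scale=0.1, zero_point=0))   # saturates near 12.7
```

During QAT, a wrapper module applies this transform to weights and/or activations in the forward pass, so the network learns parameters that are robust to the precision loss of the eventual integer kernels.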


pytorch-bot bot commented Apr 7, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18746

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit 6bca914 with merge base 28c56fe:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Apr 7, 2026

meta-codesync bot commented Apr 7, 2026

@rezaasjd has exported this pull request. If you are a Meta employee, you can view the originating Diff in D99712539.


github-actions bot commented Apr 7, 2026

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@meta-codesync meta-codesync bot changed the title from "support for cadence platform QAT" to "support for cadence platform QAT (#18746)" Apr 8, 2026
rezaasjd pushed a commit to rezaasjd/executorch that referenced this pull request Apr 8, 2026
Summary:

Adds support for QAT variant of the cadence quantizer with fake quantizer as wrappers

Differential Revision: D99712539
rezaasjd pushed a commit to rezaasjd/executorch that referenced this pull request Apr 15, 2026
rezaasjd pushed a commit to rezaasjd/executorch that referenced this pull request Apr 15, 2026
Summary:
Pull Request resolved: pytorch#18746

Adds support for QAT variant of the cadence quantizer with fake quantizer as wrappers

Reviewed By: mcremon-meta

Differential Revision: D99712539
@rezaasjd rezaasjd force-pushed the export-D99712539 branch 2 times, most recently from 6a6701a to 35a4f4c, April 15, 2026 21:46
rezaasjd pushed a commit to rezaasjd/executorch that referenced this pull request Apr 15, 2026
@meta-codesync meta-codesync bot merged commit a43675c into pytorch:main Apr 16, 2026
161 of 165 checks passed

Labels

CLA Signed (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed), fb-exported, meta-exported


2 participants