[PyTorch Edge] Remove Original Weight Tensor from QNNPack Sparse Quantized Linear Packed Params #80473
Conversation
…tized Linear Packed Params

We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.

Differential Revision: [D34617321](https://our.internmc.facebook.com/intern/diff/D34617321/)

[ghstack-poisoned]
Dr. CI: ✅ No failures (0 pending) as of commit dd5d077.
…Sparse Quantized Linear Packed Params"

We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.

Differential Revision: [D34617321](https://our.internmc.facebook.com/intern/diff/D34617321/)

[ghstack-poisoned]
@pytorchbot merge (Initiating merge automatically since Phabricator Diff has merged)
@pytorchbot successfully started a merge job.
…tized Linear Packed Params (#80473)

Summary: Pull Request resolved: #80473

We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.

ghstack-source-id: 160768056
Test Plan: Phabricator tests
Reviewed By: kimishpatel, junesg
Differential Revision: D34617321
fbshipit-source-id: c3d14fba887bc705f2e10dd903ca64f239cdac03
Stack from ghstack (oldest at bottom):
We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.
Differential Revision: D34617321
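The idea behind the change can be sketched as follows. This is a hypothetical, simplified Python illustration, not the actual QNNPack C++ packed-params implementation: the class name `PackedSparseLinearParams` and the plain CSR-style packing are assumptions made for the example. The point it demonstrates is that once serialization reads directly from the packed representation, there is no reason for the packed-params object to keep a second copy of the original dense weight tensor.

```python
# Hypothetical sketch (not the real QNNPack code): a packed linear-weight
# container that keeps only the packed (CSR-like) sparse representation.
# Previously, the original dense weight was also retained solely so it could
# be serialized; serializing from the packed form removes that need.

class PackedSparseLinearParams:
    def __init__(self, dense_weight):
        # Pack the dense weight into a simple CSR-like form.
        # Note: the dense tensor itself is NOT stored on the object.
        self.shape = (len(dense_weight), len(dense_weight[0]))
        self.values, self.col_indices, self.row_ptr = [], [], [0]
        for row in dense_weight:
            for j, v in enumerate(row):
                if v != 0:
                    self.values.append(v)
                    self.col_indices.append(j)
            self.row_ptr.append(len(self.values))

    def serialize(self):
        # Serialize directly from the packed data; no original weight needed.
        return (self.shape, self.values, self.col_indices, self.row_ptr)

    @classmethod
    def deserialize(cls, state):
        obj = cls.__new__(cls)
        obj.shape, obj.values, obj.col_indices, obj.row_ptr = state
        return obj

    def unpack(self):
        # Reconstruct a dense weight on demand from the packed data,
        # which is why keeping the original around is redundant.
        rows, cols = self.shape
        dense = [[0] * cols for _ in range(rows)]
        for i in range(rows):
            for k in range(self.row_ptr[i], self.row_ptr[i + 1]):
                dense[i][self.col_indices[k]] = self.values[k]
        return dense
```

Round-tripping through `serialize`/`deserialize` and then calling `unpack` recovers the dense weight, showing that the packed form alone is sufficient for both storage and reconstruction.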