
Conversation

@salilsdesai (Contributor) commented Jun 28, 2022

…tized Linear Packed Params

We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.
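
For illustration only, here is a minimal Python sketch of the idea (the class name and layout are hypothetical; the real packed params live in C++ behind the sparse quantized linear op): once the weight has been quantized and packed, the packed form alone is enough to serialize and restore the op, so the original dense float weight never needs to be retained.

```python
import torch


class PackedLinearParamsSketch:
    """Hypothetical stand-in for sparse quantized linear packed params.

    It serializes only the packed (quantized) representation, so no
    reference to the original dense float weight is kept."""

    def __init__(self, q_weight: torch.Tensor, bias: torch.Tensor):
        # In the real op the quantized weight is packed into a
        # backend-specific sparse format; here we just keep the quantized
        # tensor itself to keep the sketch self-contained.
        self.packed = (q_weight, bias)

    def __getstate__(self):
        # Serialize only what is needed to rebuild the packed form.
        return self.packed

    def __setstate__(self, state):
        self.packed = state


# Usage sketch: quantize a float weight, pack it, then drop the float copy.
w_float = torch.randn(8, 16)
w_q = torch.quantize_per_tensor(w_float, scale=0.1, zero_point=0, dtype=torch.qint8)
params = PackedLinearParamsSketch(w_q, torch.zeros(8))
del w_float  # not needed once (de)serialization works from the packed form
```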

Differential Revision: [D34617321](https://our.internmc.facebook.com/intern/diff/D34617321/)

[ghstack-poisoned]
@facebook-github-bot (Contributor) commented Jun 28, 2022

✅ No Failures (0 Pending)

As of commit dd5d077 (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚

This comment was automatically generated by Dr. CI.

@facebook-github-bot (Contributor) commented
@pytorchbot merge

(Initiating merge automatically since Phabricator Diff has merged)

@pytorchmergebot (Collaborator) commented
@pytorchbot successfully started a merge job. Check the current status here

github-actions bot commented Jul 7, 2022

Hey @salilsdesai.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

facebook-github-bot pushed a commit that referenced this pull request Jul 7, 2022
…tized Linear Packed Params (#80473)

Summary:
Pull Request resolved: #80473

We plan to add serialization/deserialization without the original weight tensor, so we no longer need to store it.
ghstack-source-id: 160768056

Test Plan: Phabricator tests

Reviewed By: kimishpatel, junesg

Differential Revision: D34617321

fbshipit-source-id: c3d14fba887bc705f2e10dd903ca64f239cdac03
@facebook-github-bot deleted the gh/salilsdesai/10/head branch July 11, 2022 14:17