CUDA implementation of Sparse Adagrad Fusion for GPUs #35762
Closed
We implement the following operators for Regular and RowWise SparseAdagrad fusion with SLS and SLWS gradients:

- SparseAdagradFusedWithSparseLengthsSumGradient
- RowWiseSparseAdagradFusedWithSparseLengthsSumGradient
- SparseAdagradFusedWithSparseLengthsWeightedSumGradient
- RowWiseSparseAdagradFusedWithSparseLengthsWeightedSumGradient

This will benefit the TSTAR (Zion) and TS (Training Supercomputer) projects.

Differential Revision: [D20453096](https://our.internmc.facebook.com/intern/diff/D20453096/)

**NOTE FOR REVIEWERS**: This PR has internal Facebook-specific changes or comments; please review them on [Phabricator](https://our.internmc.facebook.com/intern/diff/D20453096/)!

[ghstack-poisoned]
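Below is a minimal, hypothetical CUDA sketch of what a fused RowWiseSparseAdagrad + SparseLengthsSum backward step computes; it is not the PR's actual kernel, and all names (`rowwise_sparse_adagrad_fused_kernel`, `seg_of`, etc.) are illustrative. Because the SLS forward just sums gathered rows per segment, the gradient reaching each gathered row equals the output gradient of its segment, so the fused kernel can apply the Adagrad update directly to the embedding rows without materializing a sparse gradient tensor:

```cuda
// Hypothetical sketch only -- not the PR's kernel. One thread block per
// gathered index; blockDim.x threads cooperate across the embedding dim D.
// Assumes unique indices; a production kernel must handle duplicates
// (e.g. by sorting indices and segmenting the updates).
#include <cuda_runtime.h>
#include <math.h>

__global__ void rowwise_sparse_adagrad_fused_kernel(
    float* param,           // [num_rows, D] embedding table, updated in place
    float* moment,          // [num_rows] one Adagrad moment per row (row-wise)
    const float* grad_out,  // [num_segments, D] gradient of the SLS output
    const int* indices,     // [num_indices] rows gathered by the SLS forward
    const int* seg_of,      // [num_indices] segment of each index (from lengths)
    int num_indices, int D, float lr, float eps) {
  int i = blockIdx.x;
  if (i >= num_indices) return;
  // dL/d(gathered row) == dL/d(segment sum) for an unweighted sum.
  const float* g = grad_out + (size_t)seg_of[i] * D;
  float* w = param + (size_t)indices[i] * D;

  __shared__ float sum_sq, step;
  if (threadIdx.x == 0) sum_sq = 0.f;
  __syncthreads();
  float local = 0.f;
  for (int d = threadIdx.x; d < D; d += blockDim.x) local += g[d] * g[d];
  atomicAdd(&sum_sq, local);  // block-wide sum of squared gradients
  __syncthreads();

  if (threadIdx.x == 0) {
    // Row-wise Adagrad keeps a single moment per row: the mean of g^2 over D.
    moment[indices[i]] += sum_sq / D;
    step = lr / (sqrtf(moment[indices[i]]) + eps);
  }
  __syncthreads();
  for (int d = threadIdx.x; d < D; d += blockDim.x)
    w[d] -= step * g[d];  // fused parameter update, no sparse grad tensor
}
```

Launched as, e.g., `rowwise_sparse_adagrad_fused_kernel<<<num_indices, 128>>>(...)`. The WeightedSum (SLWS) variants would additionally scale `g` by the per-index weight and emit a gradient for the weights; the Regular (non-row-wise) variants differ only in the shape of the moment, as sketched at the end of this page.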
jianyuh added a commit that referenced this pull request on Mar 31, 2020:
Commit message: as in the PR description above. ghstack-source-id: 101206036. Pull Request resolved: #35762
💊 Build failures summary, as of commit 2018864 (more details on the Dr. CI page): ci.pytorch.org: 1 failed. (This automated Dr. CI comment was revised 33 times.)
jianyuh added a commit that referenced this pull request on Apr 4, 2020:
Pull Request resolved: #35762. Commit message: as in the PR description above. ghstack-source-id: 101536379
jianyuh added a commit that referenced this pull request on Apr 4, 2020:
Pull Request resolved: #35762. Commit message: as in the PR description above. ghstack-source-id: 101544700
jianyuh added a commit that referenced this pull request on Apr 6, 2020:
Pull Request resolved: #35762. Commit message: as in the PR description above. ghstack-source-id: 101569272
jianyuh added a commit that referenced this pull request on Apr 7, 2020:
Pull Request resolved: #35762. Commit message: as in the PR description above. ghstack-source-id: 101708597
jianyuh added a commit that referenced this pull request on Apr 21, 2020:
Pull Request resolved: #35762. Commit message: as in the PR description above. ghstack-source-id: 102583129
This pull request has been merged in 171476e.
Stack from ghstack:
(The PR description, Differential Revision D20453096, and the note for reviewers are as given at the top of this page.)
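For contrast with the row-wise sketch near the top of this page: the Regular (non-row-wise) SparseAdagrad variants keep a second-moment tensor with the same shape as the parameter, so the update is elementwise. A minimal, hypothetical sketch of that per-row inner update, under the same assumptions as before; this device helper would replace the shared-memory moment logic in the kernel above:

```cuda
// Hypothetical sketch only. Regular (elementwise) SparseAdagrad update for
// one gathered row; here moment m is a row of a [num_rows, D] tensor that
// matches param, unlike the single-value-per-row moment of the row-wise
// variant. Call from a kernel with g pointing at the segment's grad_out row.
__device__ void sparse_adagrad_row_update(
    float* w, float* m, const float* g, int D, float lr, float eps) {
  for (int d = threadIdx.x; d < D; d += blockDim.x) {
    m[d] += g[d] * g[d];                      // per-element second moment
    w[d] -= lr * g[d] / (sqrtf(m[d]) + eps);  // elementwise Adagrad step
  }
}
```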