Hardshrink for Sparse Tensors #16956

Open
randolf-scholz opened this issue Feb 11, 2019 · 2 comments
Labels: feature (A request for a proper, new feature.) · module: sparse (Related to torch.sparse) · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments

randolf-scholz (Contributor) commented Feb 11, 2019

🚀 Feature

RuntimeError: hardshrink is not implemented for type torch.sparse.FloatTensor

Motivation

We are experimenting with sparse neural networks and want to model the connections between neurons with sparse matrices. At times, when weights become too small, we want to prune connections, i.e. set the corresponding elements of the sparse matrix to zero via hardshrink.
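
Something along these lines is what we would like to write (the shapes, values, and `lambd` threshold below are made up purely for illustration):

```python
import torch
import torch.nn.functional as F

# Hypothetical toy sparse weight matrix between two layers.
indices = torch.tensor([[0, 1, 2], [2, 0, 1]])
values = torch.tensor([0.3, -0.01, 0.8])
weight = torch.sparse_coo_tensor(indices, values, size=(3, 3))

# Works for dense tensors, but for sparse ones currently raises
# "RuntimeError: hardshrink is not implemented for type torch.sparse.FloatTensor".
pruned = F.hardshrink(weight, lambd=0.1)
```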

cc @vincentqb

randolf-scholz (Contributor, Author) commented:

OK, so apparently we can manually apply functions by modifying `_values` in place. However, there is a problem: entries that are set to 0 remain stored as explicit zeros even after calling `coalesce()`. The only ways around this at the moment seem to be manually reconstructing the tensor, or, even uglier, calling `.to_dense().to_sparse()` to restore sparsity.
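
A sketch of that workaround, assuming it is acceptable to go through the private `_values()`/`_indices()` accessors (the example tensor and threshold are again made up):

```python
import torch
import torch.nn.functional as F

# Toy sparse weight matrix, coalesced so _values()/_indices() are well-defined.
indices = torch.tensor([[0, 1, 2], [2, 0, 1]])
values = torch.tensor([0.3, -0.01, 0.8])
w = torch.sparse_coo_tensor(indices, values, size=(3, 3)).coalesce()

# Apply hardshrink directly to the stored values, in place.
w._values().copy_(F.hardshrink(w._values(), lambd=0.1))

# The shrunk entries are now explicit zeros; coalesce() does not drop them,
# so the tensor has to be rebuilt to actually regain sparsity.
mask = w._values() != 0
pruned = torch.sparse_coo_tensor(w._indices()[:, mask], w._values()[mask], w.shape)

# The heavier-handed alternative mentioned above:
# pruned = w.to_dense().to_sparse()
```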

@fmassa added the feature and module: sparse labels on Feb 19, 2019
@aocsa self-assigned this on Jul 1, 2020
@pbelevich added the triaged label on Jul 2, 2020
aocsa (Contributor) commented Aug 27, 2020

Heads up: PR #41410 implements Hardshrink for sparse tensors. Autograd support is included. Feedback is welcome.

aocsa added a commit to Quansight/pytorch that referenced this issue Mar 17, 2021
@pearu added this to "To do" in Sparse tensors on Aug 10, 2021

4 participants