
Commit 93e30f5
Add activation functions (ReLU and SiLU for now) for structured sparse linear operator

ghstack-source-id: dbccf81feeae952f008d29b90ff12b3e2fa617ca
Pull Request resolved: #101339
alexsamardzic committed May 13, 2023
1 parent 317a791 commit 93e30f5
Showing 3 changed files with 370 additions and 118 deletions.
aten/src/ATen/native/native_functions.yaml (2 changes: 1 addition & 1 deletion)
@@ -3191,7 +3191,7 @@
MkldnnCPU: mkldnn_linear_backward
autogen: mkldnn_linear_backward.out

-- func: _structured_sparse_linear(Tensor input, Tensor weight, Tensor mask_or_meta, *, Tensor? bias=None) -> (Tensor, Tensor)
+- func: _structured_sparse_linear(Tensor input, Tensor weight, Tensor mask_or_meta, *, Tensor? bias=None, str? activation=None) -> (Tensor, Tensor)
dispatch:
CUDA: _structured_sparse_linear

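For context, a minimal usage sketch of the schema change above. It assumes the operator is reachable from Python as `torch.ops.aten._structured_sparse_linear`, that `mask_or_meta` accepts a boolean 2:4 sparsity mask, that `"relu"` is one of the accepted `activation` strings, and that the kernel expects fp16 inputs on a CUDA device with sparse tensor core support; none of these specifics are confirmed by this diff.

```python
import torch

# Hedged sketch only. Assumptions (not confirmed by the diff): the op is
# exposed as torch.ops.aten._structured_sparse_linear, mask_or_meta may be a
# boolean 2:4 sparsity mask, "relu" is an accepted activation string, and the
# kernel requires fp16 tensors on a CUDA device with sparse tensor cores.
if torch.cuda.is_available():
    dtype, device = torch.float16, "cuda"
    x = torch.randn(8, 128, dtype=dtype, device=device)   # input
    w = torch.randn(64, 128, dtype=dtype, device=device)  # weight

    # Assumed 2:4 pattern: keep 2 of every 4 elements in each weight row.
    mask = torch.zeros_like(w, dtype=torch.bool)
    mask[:, 0::4] = True
    mask[:, 1::4] = True

    # Before this commit: no fused activation.
    out, meta = torch.ops.aten._structured_sparse_linear(x, w * mask, mask)

    # After this commit: activation fused into the operator via the new kwarg.
    out_relu, _ = torch.ops.aten._structured_sparse_linear(
        x, w * mask, mask, activation="relu"
    )
```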
