Add activation functions (ReLU and SiLU for now) for structured sparse linear operator

ghstack-source-id: 1f4a2a1b7d7de4c4f25cbecb92ad6ac85309feeb
Pull Request resolved: #101339
alexsamardzic committed Jun 4, 2023
1 parent 66d2638 commit 16fff8f
Showing 3 changed files with 370 additions and 118 deletions.
aten/src/ATen/native/native_functions.yaml (1 addition, 1 deletion)
@@ -3216,7 +3216,7 @@
MkldnnCPU: mkldnn_linear_backward
autogen: mkldnn_linear_backward.out

-- func: _structured_sparse_linear(Tensor input, Tensor weight, Tensor mask_or_meta, *, Tensor? bias=None) -> (Tensor, Tensor)
+- func: _structured_sparse_linear(Tensor input, Tensor weight, Tensor mask_or_meta, *, Tensor? bias=None, str? activation=None) -> (Tensor, Tensor)
dispatch:
CUDA: _structured_sparse_linear

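The schema change adds an optional `activation` keyword to the structured sparse linear operator, so an elementwise activation (ReLU or SiLU, per the commit title) can be applied as part of the op instead of in a separate kernel afterwards. The sketch below is a hypothetical usage example based only on the schema shown above; `_structured_sparse_linear` is a private, CUDA-only operator, and the exact accepted activation strings, the expected form of `mask_or_meta`, and the dtypes used here are assumptions, so this may not run as-is on a given PyTorch build.

```python
# Hypothetical usage sketch of the updated schema:
#   _structured_sparse_linear(input, weight, mask_or_meta, *,
#                             bias=None, activation=None) -> (Tensor, Tensor)
# Assumptions (not confirmed by this diff): "relu"/"silu" are the accepted
# activation strings, mask_or_meta may be passed as a boolean 2:4 sparsity
# mask, and float16 on CUDA is a supported configuration.
import torch

device = "cuda"
x = torch.randn(8, 128, dtype=torch.float16, device=device)
w = torch.randn(64, 128, dtype=torch.float16, device=device)
b = torch.randn(64, dtype=torch.float16, device=device)

# 2:4 structured sparsity: keep two nonzero weights in every group of four.
mask = (torch.arange(w.numel(), device=device).reshape(w.shape) % 4) < 2

# Fused path: the activation is applied inside the sparse linear op.
out, meta = torch._structured_sparse_linear(
    x, w * mask, mask, bias=b, activation="relu"
)

# Unfused reference for comparison.
ref = torch.relu(torch.nn.functional.linear(x, w * mask, b))
print(torch.allclose(out, ref, atol=1e-2, rtol=1e-2))
```

Presumably the benefit of the keyword is that the activation can run in the epilogue of the 2:4 sparse GEMM rather than as an extra elementwise pass over the output.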
