[Decomposition] clamp_min (#108717)
Summary:
bypass-github-pytorch-ci-checks


The decomposition already exists, so this change just adds it to core_aten_decompositions:

https://www.internalfb.com/code/fbsource/[abda43a5a268e83fef6d62b49531a390ce915ad2]/fbcode/caffe2/torch/_refs/__init__.py?lines=1846
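For reference, a minimal sketch of how clamp_min can be expressed in terms of clamp, in the spirit of the linked torch/_refs decomposition; `clamp_min_decomp` is a hypothetical name for illustration, not the actual registered decomp.

```python
# A minimal sketch, assuming clamp_min lowers to clamp; not a verbatim
# copy of the linked torch/_refs source.
import torch

def clamp_min_decomp(a: torch.Tensor, min_val) -> torch.Tensor:
    # Clamp every element of `a` from below by `min_val`.
    return torch.clamp(a, min=min_val)

x = torch.tensor([-2.0, 0.5, 3.0])
assert torch.equal(clamp_min_decomp(x, 0.0), x.clamp_min(0.0))
```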

Test Plan: Phabricator + OSS Tests

Reviewed By: SS-JIA

Differential Revision: D48880080
salilsdesai authored and facebook-github-bot committed Sep 17, 2023
1 parent 75b954b commit ed05c11
Showing 3 changed files with 2 additions and 4 deletions.
4 changes: 0 additions & 4 deletions test/expect/HasDecompTest.test_aten_core_operators.expect
@@ -139,10 +139,6 @@ aten::clamp_max.Tensor_out
 aten::clamp_max.out
 aten::clamp_max_
 aten::clamp_max_.Tensor
-aten::clamp_min
-aten::clamp_min.Tensor
-aten::clamp_min.Tensor_out
-aten::clamp_min.out
 aten::clamp_min_
 aten::clamp_min_.Tensor
 aten::clone
1 change: 1 addition & 0 deletions torch/_decomp/__init__.py
@@ -227,6 +227,7 @@ def core_aten_decompositions() -> Dict[torch._ops.OperatorBase, Callable]:
         aten.binary_cross_entropy_with_logits,
         aten.celu,
         aten.celu_,
+        aten.clamp_min,
         aten.col2im,
         aten.count_nonzero,
         aten.cudnn_batch_norm,
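A hedged sketch of how the updated table can be exercised (illustrative, not part of this commit): tracing a function through make_fx with core_aten_decompositions() should now decompose aten.clamp_min rather than leave it as an opaque core op.

```python
# Assumes the public make_fx decomposition_table interface; the sample
# function `f` is only for demonstration.
import torch
from torch._decomp import core_aten_decompositions
from torch.fx.experimental.proxy_tensor import make_fx

def f(x):
    return x.clamp_min(0.0)

gm = make_fx(f, decomposition_table=core_aten_decompositions())(torch.randn(4))
print(gm.graph)  # expect clamp_min to appear in decomposed form
```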
1 change: 1 addition & 0 deletions torch/_inductor/decomposition.py
@@ -71,6 +71,7 @@
 decomps_to_exclude = [
     aten._unsafe_index,
     aten._scaled_dot_product_flash_attention.default,  # See comments in torch/_decomp/decompositions.py
+    aten.clamp_min,
 ]
 
 remove_decompositions(decompositions, decomps_to_exclude)
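A minimal sketch of the exclusion pattern above, assuming remove_decompositions is importable from torch._decomp (as used in torch/_inductor/decomposition.py): Inductor starts from the shared table, then strips ops, now including clamp_min, that it prefers to handle with its own lowerings.

```python
# Illustrative only; mirrors the remove_decompositions call shown in the diff.
import torch
from torch._decomp import core_aten_decompositions, remove_decompositions

aten = torch.ops.aten
decompositions = core_aten_decompositions()
remove_decompositions(decompositions, [aten.clamp_min])
assert aten.clamp_min.default not in decompositions
```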
