Conversation

@titaiwangms
Contributor

No description provided.

@justinchuby
Collaborator

justinchuby commented Feb 15, 2023

We should also remove the cast in masked_fill and add a note there.

@titaiwangms titaiwangms marked this pull request as ready for review February 15, 2023 17:36
@codecov

codecov bot commented Feb 15, 2023

Codecov Report

Merging #439 (6e8f0c8) into main (a7dde7b) will increase coverage by 0.00%.
The diff coverage is 87.50%.

@@           Coverage Diff           @@
##             main     #439   +/-   ##
=======================================
  Coverage   71.27%   71.27%           
=======================================
  Files         108      108           
  Lines       10475    10480    +5     
  Branches     1085     1085           
=======================================
+ Hits         7466     7470    +4     
- Misses       2699     2700    +1     
  Partials      310      310           
Impacted Files Coverage Δ
onnxscript/function_libs/torch_aten/ops/core.py 67.47% <87.50%> (+0.03%) ⬆️


@torch_op("aten::bitwise_not", overload=True)
def aten_bitwise_not_bool(self: BOOL) -> BOOL:
# bitwise_not(Tensor self) -> Tensor
# FIXME(titaiwang): This is a hack to get around the fact that we don't have op.BitwiseNot supporting bool now.
Collaborator
We will have to go with this, I think. It's not a hack, as we need a dispatcher to handle the different dtypes.

Contributor Author
Do we need an `if` for aten_bitwise_not?

@justinchuby (Collaborator) Feb 15, 2023
Oh, we cannot use an `if` because the dtypes are different, and we don't know the dtype within a function.

Collaborator
So even trace-only won't work for us here.

Contributor Author
So how should we know the dtype beforehand?

Contributor Author
Maybe only certain kinds of ops can be inputs to bitwise operations, so we could do the processing in those ops?

Contributor Author
Guess we can forget it...

@_onnx_symbolic("aten::bitwise_not")
@_beartype.beartype
def bitwise_not(g: jit_utils.GraphContext, input):
    if not symbolic_helper._is_bool(input):
        raise errors.SymbolicValueError(
            "ONNX export does NOT support exporting bitwise Not "
            "for non-boolean input values",
            input,
        )
    return g.op("Not", input)

Collaborator
> So how should we know the dtype beforehand?

The exporter knows about dtypes, so it can choose which function to use.
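For illustration, a minimal sketch of what such exporter-side dispatch could look like, assuming a hypothetical helper name and that the exporter can read the PyTorch input dtype (this is not the actual exporter API):

import torch

# Hypothetical dispatcher sketch: both overloads live in core.py per this PR.
from onnxscript.function_libs.torch_aten.ops.core import (
    aten_bitwise_not,
    aten_bitwise_not_bool,
)

def select_bitwise_not_overload(input_dtype: torch.dtype):
    # Route bool inputs to the bool overload (which uses logical Not),
    # and everything else to the regular aten_bitwise_not.
    if input_dtype == torch.bool:
        return aten_bitwise_not_bool
    return aten_bitwise_not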



@torch_op("aten::masked_fill")
def aten_masked_fill(self: TTensor, mask: BOOL, value: TTensor) -> TTensor:
Collaborator
Could you create another PR for the non-hacks we identified, including the changes to masked_fill and bitwise_not (both versions), so that we can merge them into main? The rest can stay here for reference, but we are probably not going to merge it?
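For reference, a cast-free masked_fill might look roughly like the sketch below; it assumes value already matches self's dtype, and it is not necessarily the exact change that landed in the follow-up PR:

@torch_op("aten::masked_fill")
def aten_masked_fill(self: TTensor, mask: BOOL, value: TTensor) -> TTensor:
    # masked_fill(Tensor self, Tensor mask, Tensor value) -> Tensor
    # The mask is already BOOL, so it can feed op.Where directly without a Cast:
    # fill self with value wherever mask is True.
    return op.Where(mask, value, self)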

@titaiwangms
Contributor Author

titaiwangms commented Feb 15, 2023

Replaced by #442.
This branch was specifically fixing type promotion in bloom.
