8 changes: 8 additions & 0 deletions python/tvm/relay/frontend/onnx.py
@@ -2099,6 +2099,14 @@ def expand_shape(in_shape, shape):
new_shape = _op.maximum(in_shape, shape)

Contributor:
I wonder if it would do the same thing to add fold_constant here? I don't love the use of params, but I'm not strictly opposed to it.

Contributor:

FoldConstant doesn't seem to help here, since it is already applied after this function (L2110: shape = fold_constant(expand_shape(in_shape, shape))).

Meanwhile, I also agree that using parameters to avoid dynamic ops is not good practice. From the compiler's perspective, they are "parameters" because their values can change at any time, so we should not make any assumptions about them. A better practice is to go through from_onnx -> bind_param_by_name -> fold_constant -> dynamic_to_static. Since parameters become constants after binding, it is then safe to treat these ops as static.
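
The pipeline described above can be sketched with TVM's Relay APIs. This is an illustrative sketch, not a runnable test: "model.onnx" is a hypothetical path, and the helper is spelled `bind_params_by_name` in recent TVM releases; exact module paths may vary by TVM version.

```python
import onnx
import tvm
from tvm import relay

# Hypothetical frozen model; any ONNX model with weight initializers works.
onnx_model = onnx.load("model.onnx")
mod, params = relay.frontend.from_onnx(onnx_model)

# Bind the parameters as constants, so shape-producing ops see constants...
mod["main"] = relay.build_module.bind_params_by_name(mod["main"], params)
# ...then fold them and rewrite dynamic ops into their static equivalents.
mod = relay.transform.FoldConstant()(mod)
mod = relay.transform.DynamicToStatic()(mod)
```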

Contributor Author:

I agree that (1) not using the params and (2) using dynamic_to_static is the right solution here. I was simply thinking about getting all-static operators out of this frozen model, but that concept does not apply here.

I will decline this PR. Thank you for the feedback.

    return new_shape

try:
    # Try to use the values in the shape tensors to
    # determine the shape
    shape_tensor_name = inputs[1].name_hint
    shape = _expr.const(params[shape_tensor_name])
except (AttributeError, KeyError):
    pass

shape = fold_constant(expand_shape(in_shape, shape))
return _op.broadcast_to(inputs[0], shape=shape)
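
For intuition, the shape rule that expand_shape implements matches ONNX Expand (i.e. NumPy broadcasting): left-pad the shorter shape with 1s, then take the element-wise maximum. A plain-NumPy sketch of that rule (illustrative only, not the TVM code, which operates on Relay expressions):

```python
import numpy as np

def expand_shape(in_shape, shape):
    # Left-pad the shorter shape with 1s, then take the element-wise
    # maximum -- this is the ONNX Expand / broadcast result shape.
    ndim = max(len(in_shape), len(shape))
    a = [1] * (ndim - len(in_shape)) + list(in_shape)
    b = [1] * (ndim - len(shape)) + list(shape)
    return [max(x, y) for x, y in zip(a, b)]

x = np.arange(3).reshape(3, 1)
out = np.broadcast_to(x, expand_shape(x.shape, (2, 1, 4)))
print(out.shape)  # (2, 3, 4)
```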
