[ONNX] Use static broadcast_to for ONNX's Expand operator if possible#8522
JoeyChou-SiMa-ai wants to merge 1 commit into apache:main
Conversation
…erator in ONNX frontend
mbrookhart
left a comment
Can you add a unit test?
@@ -2099,6 +2099,14 @@ def expand_shape(in_shape, shape):
    new_shape = _op.maximum(in_shape, shape)
I wonder if adding fold_constant here would achieve the same thing? I don't love the use of params, but I'm not strictly opposed to it.
FoldConstant does not seem to help here, as it is already applied after this function (L2110: shape = fold_constant(expand_shape(in_shape, shape))).
Meanwhile, I also agree that using parameters to avoid dynamic ops is not good practice. From the compiler's perspective, they are "parameters" because their values can change at any time, so we should not make any assumptions about them here. Good practice is to go through from_onnx -> bind_params_by_name -> fold_constant -> dynamic_to_static. Since parameters become constants after binding, it is then safe to treat these ops as static.
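To make the bind-then-fold ordering concrete, here is a minimal toy in plain Python (not TVM code; the tuple-based expression encoding, `bind_params`, and `dynamic_to_static` here are illustrative stand-ins for the real Relay passes). Once the shape parameter is substituted with its constant value and folded, the dynamic broadcast can be rewritten as a static one:

```python
# Toy pipeline mimicking bind_params_by_name -> fold_constant -> dynamic_to_static.
# Expressions are nested tuples: ("param", name), ("const", value), or (op, *args).

def bind_params(expr, params):
    """Replace named parameters with constant nodes (bind step)."""
    if expr[0] == "param":
        return ("const", params[expr[1]]) if expr[1] in params else expr
    return (expr[0],) + tuple(
        bind_params(a, params) if isinstance(a, tuple) else a for a in expr[1:]
    )

def dynamic_to_static(expr):
    """Rewrite dyn.broadcast_to into static broadcast_to once its
    shape argument has become a compile-time constant."""
    if expr[0] == "dyn.broadcast_to" and expr[2][0] == "const":
        return ("broadcast_to", expr[1], tuple(expr[2][1]))
    return expr  # shape still dynamic; keep the dynamic op

prog = ("dyn.broadcast_to", ("const", [[1], [2]]), ("param", "shape"))
prog = bind_params(prog, {"shape": [2, 3]})   # "shape" is now a constant
prog = dynamic_to_static(prog)                # so the broadcast can be static
assert prog == ("broadcast_to", ("const", [[1], [2]]), (2, 3))
```

The point of the ordering is that `dynamic_to_static` can only fire on ops whose shape inputs have already been reduced to constants, which is exactly what binding followed by constant folding guarantees for frozen parameters.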
I agree that (1) not using the param and (2) using dynamic_to_static is the right solution here. I was simply trying to get all-static operators out of this frozen model, but the approach is not correct here.
I will decline this PR. Thank you for the feedback.
While translating ONNX's `Expand` operator, if the values in the shape input tensor are known at compile time, use them to compute the output shape of the `Expand` operator so that a static `broadcast_to` can be created instead of a dynamic one.
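For reference, the static output shape the PR description talks about follows ONNX Expand's multidirectional (NumPy-style) broadcasting rules. A minimal sketch of that computation, where `expand_output_shape` is a hypothetical helper (not a TVM or ONNX function):

```python
import numpy as np

def expand_output_shape(in_shape, target_shape):
    """Static output shape of ONNX Expand: right-align the two shapes,
    pad the shorter one with 1s, and take the per-dimension maximum
    (dimensions must be equal or one of them must be 1)."""
    ndim = max(len(in_shape), len(target_shape))
    a = [1] * (ndim - len(in_shape)) + list(in_shape)
    b = [1] * (ndim - len(target_shape)) + list(target_shape)
    out = []
    for x, y in zip(a, b):
        if x != y and 1 not in (x, y):
            raise ValueError(f"incompatible dimensions {x} and {y}")
        out.append(max(x, y))
    return out

# Sanity check against NumPy's own broadcasting result:
assert tuple(expand_output_shape([3, 1], [2, 1, 6])) == np.broadcast_shapes((3, 1), (2, 1, 6))
```

This mirrors the `_op.maximum(in_shape, shape)` line in the diff: when both shapes are compile-time constants, the maximum can be taken eagerly and the result fed to a static `broadcast_to`.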