When trying to fuse `tensor.pad` with producers, reshapes can unnecessarily block fusion. The following IR is an example of this from VAE:

The `tensor.collapse_shape` does not touch the collapsed dimensions, so if the reshape were propagated through the `tensor.pad` op, then the two ops could fuse into a dispatch. One way to do this would be to add reshape propagation patterns for `tensor.pad` (like https://github.com/llvm/llvm-project/blob/af31883341a122a7285e9b4f0a034470024021eb/mlir/lib/Dialect/Linalg/Transforms/ElementwiseOpFusion.cpp#L922), but it may be tricky to manage the propagations.
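As an illustrative sketch (not the actual VAE IR; all shapes, SSA names, and reassociation indices here are hypothetical), the blocking pattern is a `tensor.collapse_shape` sitting between a producer and a `tensor.pad`, where the pad only touches a dimension that the collapse leaves alone:

```mlir
// %producer is some fusable op's result; the collapse merges dims 0 and 1,
// while the pad only touches the untouched trailing dimension.
%collapsed = tensor.collapse_shape %producer [[0, 1], [2]]
    : tensor<2x64x128xf32> into tensor<128x128xf32>
%padded = tensor.pad %collapsed low[0, 1] high[0, 1] {
^bb0(%i: index, %j: index):
  tensor.yield %cst : f32
} : tensor<128x128xf32> to tensor<128x130xf32>
```

With a propagation pattern of the kind linked above, the reshape could be pushed below the pad, leaving the pad adjacent to its producer so the two can land in one dispatch:

```mlir
// After (hypothetical) propagation: pad on the expanded shape first,
// then collapse. %producer -> tensor.pad is now directly fusable.
%padded = tensor.pad %producer low[0, 0, 1] high[0, 0, 1] {
^bb0(%i: index, %j: index, %k: index):
  tensor.yield %cst : f32
} : tensor<2x64x128xf32> to tensor<2x64x130xf32>
%collapsed = tensor.collapse_shape %padded [[0, 1], [2]]
    : tensor<2x64x130xf32> into tensor<128x130xf32>
```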