
[microNPU][ETHOSU] Channel pad offloaded to NPU #14765

Merged · 4 commits merged into apache:main on May 19, 2023

Conversation

sergio-grovety (Contributor):

A separate channel-dimension nn.pad Relay operator is rewritten as a Relay concatenate operation.
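
To make the transformation concrete, here is a minimal Relay sketch of the equivalence the rewrite relies on. This is illustrative only: the NHWC layout matches the PR, but the shapes, dtype, and pad amounts below are made up, and this is not the PR's actual rewriter code.

```python
import numpy as np
from tvm import relay

x = relay.var("x", shape=(1, 8, 8, 3), dtype="int8")

# An nn.pad that pads only the channel dimension (axis 3 in NHWC)...
padded = relay.nn.pad(x, pad_width=((0, 0), (0, 0), (0, 0), (1, 2)))

# ...computes the same result as concatenating constant tensors of the
# pad value along axis 3, before and after the input.
before = relay.const(np.zeros((1, 8, 8, 1), dtype="int8"))
after = relay.const(np.zeros((1, 8, 8, 2), dtype="int8"))
concatenated = relay.op.concatenate(relay.Tuple([before, x, after]), axis=3)
```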

tvm-bot (Collaborator) commented May 4, 2023:

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

sergio-grovety (Contributor, Author) commented May 4, 2023:

Review thread on python/tvm/relay/op/contrib/ethosu.py (outdated, resolved):

```python
    return None
if (
    list(pad_width[0]) != [0, 0]
    or list(pad_width[1]) != [0, 0]
```
Contributor:

Are there networks that have padding in height, width and channels? If there are, it would be possible to remove the width and height restrictions and add width and height padding processing to the legalization using a depthwise convolution, as is done for pad2d.

arina-grovety (Contributor) replied May 5, 2023:

Yes, you are right, combined spatial and channel padding can of course occur in neural networks. This is a separate task; we discussed that it will be useful in the future. We plan to address it when we have time, or when a network with such a pad appears.
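
For context, a self-contained sketch of the kind of predicate the quoted check belongs to. Assumptions: NHWC layout, and the helper name `is_channel_pad` is illustrative, not the PR's actual function.

```python
def is_channel_pad(pad_width) -> bool:
    """Return True if padding is applied only on the channel axis (NHWC)."""
    if len(pad_width) != 4:
        return False
    # Batch, height and width (axes 0-2) must carry no padding.
    if any(list(pad_width[i]) != [0, 0] for i in range(3)):
        return False
    # The channel axis (axis 3) must carry some padding for a match.
    return list(pad_width[3]) != [0, 0]
```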

sergio-grovety (Contributor, Author):

@tvm-bot rerun

ekalda (Contributor) left a comment:

Thanks @sergio-grovety! Left some comments...

Review thread on the diff (@@ -1447,6 +1447,84 @@):

```python
class ChannelPadRewriter(DFPatternCallback):
    """Convert ethos-u.pad2d composite function to the Relay concatenate operation"""
```
Contributor:
Suggested change:

```diff
- """Convert ethos-u.pad2d composite function to the Relay concatenate operation"""
+ """Convert ethos-u.channel-pad composite function to the Relay concatenate operation"""
```

```python
params.ifm.tensor = post.args[0]

concat_args = list()
# Activations requiring LUT is currently not supported, so setting it to an empty list
```
Contributor:

I know every operator here has this copy-pasted legacy comment, but let's remove it... Firstly, LUT-based activations are supported, and secondly, it could leave the impression that implementing something like fused pad + sigmoid is a TODO.

Contributor:

Got it, thank you.

Comment on lines 1524 to 1525:

```python
axis = 3
return relay.op.concatenate(relay.Tuple(concat_args), axis=axis)
```
Contributor:

Since it is not used elsewhere, maybe just

Suggested change:

```diff
- axis = 3
- return relay.op.concatenate(relay.Tuple(concat_args), axis=axis)
+ return relay.op.concatenate(relay.Tuple(concat_args), axis=3)
```


```python
def are_pad_on_graph(self, subgraph) -> bool:
    """
    This function recursively visits the graph and checks if 'nn.pad' op is ongraph
```
Contributor:

Nit:

Suggested change:

```diff
- This function recursively visits the graph and checks if 'nn.pad' op is ongraph
+ This function recursively visits the graph and checks if 'nn.pad' op is on graph
```

Comment on lines +569 to +575:

```python
mod = partition_ethosu_by_table(mod, conv2d_pattern_table)

mod["tvmgen_default_ethos_u_main_0"] = dataflow_pattern.rewrite(
    legalize.Conv2DRewriter(), mod["tvmgen_default_ethos_u_main_0"]
)

verify(mod["tvmgen_default_ethos_u_main_0"])
```
Contributor:

I'm a bit confused about what that test does... It creates a TFLite graph with channel pad and conv2d, then partitions them for microNPU, then legalizes the conv2d into ethosu.conv2d and then checks that the Relay pad is still there?

Contributor:

Hi @ekalda, thank you for the review!
Yes, that is right. Here we check that the pad along the channel axis does not get merged with the conv2d, as happens with a spatial pad.

```python
@pytest.mark.parametrize("ifm_shape", [(1, 55, 55, 3), (1, 23, 32, 7)])
@pytest.mark.parametrize("channel_padding", [(0, 1), (1, 1), (5, 2)])
@pytest.mark.parametrize("const_value", [0, 5, 125, -5])
def test_tflite_separate_channel_padding_legalize(ifm_shape, channel_padding, const_value):
```
Contributor:

Shouldn't this test be using ChannelPadRewriter and then check in verify that the concatenates got created?

Contributor:

Thanks for the comment; the test really did not correspond to the tested functionality. I have corrected the test.
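
For reference, a sketch of how the corrected test could drive the new rewriter, modeled on the Conv2DRewriter snippet quoted above. `legalize.ChannelPadRewriter` is the class added in this PR; `pad_pattern_table` and `verify()` follow the quoted snippet's conventions and are assumptions here.

```python
# Partition the channel pad for the microNPU, then legalize it.
mod = partition_ethosu_by_table(mod, pad_pattern_table)

mod["tvmgen_default_ethos_u_main_0"] = dataflow_pattern.rewrite(
    legalize.ChannelPadRewriter(), mod["tvmgen_default_ethos_u_main_0"]
)

# verify() would assert the channel pad now appears as a concatenate on axis 3.
verify(mod["tvmgen_default_ethos_u_main_0"])
```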

sergio-grovety (Contributor, Author):

@tvm-bot rerun

ekalda (Contributor) left a comment:
LGTM 🚀

Aleksei-grovety (Contributor) left a comment:

LGTM!

ekalda merged commit bbfe481 into apache:main on May 19, 2023.
ekalda (Contributor) commented May 19, 2023:

Thanks @arina-grovety, @sergio-grovety and @Aleksei-grovety, this is now merged!

sergio-grovety deleted the ethosu-channel-pad branch on May 20, 2023 at 07:10.
mei-ye pushed a commit to mei-ye/tvm that referenced this pull request on Jun 1, 2023:
A separate channel-dimension nn.pad relay operator is rewritten as Relay concatenate operation.

---------

Co-authored-by: Sergey Smirnov <89378719+sergey-grovety@users.noreply.github.com>
Co-authored-by: arina.naumova <naumova@grovety.com>