Conversation

@mcr229 (Contributor) commented Aug 25, 2023

Summary:
This PR adds support for MV3 (MobileNetV3), which contains decomposed hardswish and hardsigmoid operators.

Decomp rules for both:

Hardswish

https://www.internalfb.com/code/fbsource/[9368f8417bd843ee8c91e24ac616ed7f4b194ed8]/xplat/caffe2/torch/_decomp/decompositions.py?lines=182-185

Hardsigmoid

https://www.internalfb.com/code/fbsource/[9368f8417bd843ee8c91e24ac616ed7f4b194ed8]/xplat/caffe2/torch/_decomp/decompositions.py?lines=159-162
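The linked decompositions reduce both ops to add/clamp/div primitives. A minimal plain-Python sketch of the math (scalar inputs for illustration only; this is not the actual `torch._decomp` code):

```python
def hardsigmoid(x: float) -> float:
    # Decomposes to clamp(x + 3, 0, 6) / 6, which introduces the
    # scalar constants 3 and 6 as static data in the graph.
    return min(max(x + 3.0, 0.0), 6.0) / 6.0

def hardswish(x: float) -> float:
    # hardswish(x) = x * hardsigmoid(x)
    return x * hardsigmoid(x)
```

Those two scalar constants are exactly what the fixes below have to handle.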

Fixing Zero-Dim tensors

Both of these decompositions produce zero-dim (scalar) tensors in the graph (the `+ 3` and `/ 6` constants). This breaks XNNPACK, which has no notion of zero-dim tensors. Instead, if static data is zero-dim, we interpret its shape as `[1]`.
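The shape fix can be sketched as below, assuming a hypothetical helper in the delegate's serialization path (the name is illustrative, not the actual ExecuTorch code):

```python
from typing import List, Tuple

def normalize_static_shape(shape: Tuple[int, ...]) -> List[int]:
    # XNNPACK has no rank-0 tensors: promote a zero-dim (scalar)
    # constant to a rank-1 tensor of shape [1]. Non-scalar shapes
    # pass through unchanged.
    return [1] if len(shape) == 0 else list(shape)
```

This is safe because a scalar and a `[1]` tensor hold the same single value, so broadcasting behavior is unaffected.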

Fixing torch.int64 static data

In the decomposition, the constant `3` is converted via `to_copy(torch.float32)`, but the constant `6` remains `torch.int64`. XNNPACK does not handle non-quantized integers, so we also cast all static data that is not quantized to `torch.float32`.
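The dtype fix can be sketched similarly (a hypothetical helper operating on dtype names; the real pass operates on graph constants):

```python
def cast_static_dtype(dtype: str, is_quantized: bool) -> str:
    # Non-quantized integer static data (e.g. the int64 `6` from the
    # decomposition) is cast to float32, since XNNPACK only accepts
    # integer data when it is quantized. Quantized integer data and
    # float data are left untouched.
    if not is_quantized and dtype in ("int8", "int32", "int64"):
        return "float32"
    return dtype
```
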

Reviewed By: digantdesai

Differential Revision: D48667679

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Aug 25, 2023
@facebook-github-bot
This pull request was exported from Phabricator. Differential Revision: D48667679

Pull Request resolved: pytorch/executorch#140

fbshipit-source-id: 5edc3d881a599b0f1ee9fc6fddbc582db2f729ee

@facebook-github-bot

This pull request has been merged in 80ed456.

Gasoonjia pushed a commit that referenced this pull request Jul 30, 2024