Conversation

agrima1304
Collaborator

@agrima1304 agrima1304 commented Aug 26, 2025

Decomposes ELU into other operators for the floating-point case and into a lookup table for the integer case.

cc @digantdesai @freddan80 @per @zingo @oscarandersson8218
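
For reference, ELU is defined as ELU(x) = x for x > 0 and α·(exp(x) − 1) otherwise. A minimal Python sketch of the two lowering strategies the description mentions — primitive-op decomposition for the floating-point case and a precomputed 256-entry table for the int8 case. The quantization parameters below are illustrative placeholders, not the backend's actual ones:

```python
import math

def elu(x, alpha=1.0):
    """Reference ELU: x if x > 0, else alpha * (exp(x) - 1)."""
    return x if x > 0.0 else alpha * (math.exp(x) - 1.0)

def elu_decomposed(x, alpha=1.0):
    """FP case: ELU expressed via primitive ops (exp, sub, mul, gt, where)."""
    neg_branch = alpha * (math.exp(x) - 1.0)  # mul(sub(exp(x), 1), alpha)
    return x if x > 0.0 else neg_branch       # where(gt(x, 0), x, neg_branch)

def build_int8_elu_lut(scale_in, zp_in, scale_out, zp_out, alpha=1.0):
    """INT case: 256-entry table mapping each int8 input to an int8 output.

    The scale/zero-point values are hypothetical affine-quantization
    parameters; in the backend they come from the quantized graph.
    """
    lut = []
    for q in range(-128, 128):
        x = (q - zp_in) * scale_in                          # dequantize
        q_out = round(elu(x, alpha) / scale_out) + zp_out   # requantize
        lut.append(max(-128, min(127, q_out)))              # clamp to int8
    return lut
```

At inference time the integer path reduces to a single table gather per element, which is why quantized backends prefer it over evaluating `exp` on-device.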


pytorch-bot bot commented Aug 26, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/13683

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 1 New Failure, 1 Cancelled Job, 9 Unrelated Failures

As of commit c3e6386 with merge base deaf37f (image):

NEW FAILURE - The following job has failed:

CANCELLED JOB - The following job was cancelled. Please retry:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

BROKEN TRUNK - The following jobs failed but were already failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Aug 26, 2025
@agrima1304
Collaborator Author

@pytorchbot label "partner: arm"

@pytorch-bot pytorch-bot bot added the partner: arm For backend delegation, kernels, demo, etc. from the 3rd-party partner, Arm label Aug 26, 2025
@agrima1304
Collaborator Author

@pytorchbot label "release notes: arm"

@pytorch-bot pytorch-bot bot added the release notes: arm Changes to the ARM backend delegate label Aug 26, 2025
@agrima1304
Collaborator Author

@pytorchbot label "ciflow/trunk"


pytorch-bot bot commented Aug 26, 2025

To add these label(s) (ciflow/trunk) to the PR, please first approve the workflows that are awaiting approval (scroll to the bottom of this page).

This helps ensure we don't trigger CI on this PR until it is actually authorized to do so. Please ping one of the reviewers if you do not have access to approve and run workflows.

Signed-off-by: Agrima Khare <agrima.khare@arm.com>

Change-Id: I032414e7454d5e2cada05b788e9eed0f7b2dc97c
@pytorch-bot pytorch-bot bot removed the ciflow/trunk label Sep 1, 2025
@zingo zingo changed the title Arm Backend: Add support for ELU.default operator Arm backend: Add support for ELU.default operator Sep 1, 2025
@zingo zingo merged commit a42423c into pytorch:main Sep 2, 2025
246 of 258 checks passed
@shoumikhin
Contributor

@agrima1304 please check whether the following Pyre type checks are resolved:

    [executorch/backends/nxp/aten_passes/neutron_aten_pass_manager.py:28:23] Incompatible variable type [9]: passes is declared to have type `List[Type[typing.Callable[[GraphModule], PassResult]]]` but is used as type `None`.
    [executorch/backends/nxp/aten_passes/neutron_aten_pass_manager.py:29:8] Incompatible variable type [9]: passes is declared to have type `List[Type[typing.Callable[[GraphModule], PassResult]]]` but is used as type `Union[List[Type[typing.Callable[[GraphModule], PassResult]]], List[Union[FuseBatchNormWithConvPass, FuseBatchNormWithLinearPass, SplitGroupConvolution]]]`.
    [executorch/backends/nxp/aten_passes/neutron_aten_pass_manager.py:35:25] Incompatible parameter type [6]: In call `PassManager.__init__`, for 1st positional argument, expected `Union[None, List[typing.Callable[[GraphModule], Optional[PassResult]]], List[List[typing.Callable[[GraphModule], Optional[PassResult]]]]]` but got `List[Type[typing.Callable[[GraphModule], PassResult]]]`.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:45:4] Uninitialized attribute [13]: Attribute `module` is declared in class `SplitGroupConvolution` to have type `torch.fx.graph_module.GraphModule` but is never initialized.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:174:47] Incompatible parameter type [6]: In call `torch.fx.graph.Graph.inserting_after`, for 1st positional argument, expected `Optional[node.Node]` but got `_C.Node`.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:182:8] Incompatible return type [7]: Expected `_C.Node` but got `node.Node`.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:184:4] Inconsistent override [14]: `executorch.backends.nxp.aten_passes.split_group_convolution.SplitGroupConvolution.call` overrides method defined in `PassBase` inconsistently. Could not find parameter `graph_module` in overriding signature.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:240:52] Incompatible parameter type [6]: In call `SplitGroupConvolution._create_parameter_node_for_data`, for 3rd positional argument, expected `_C.Node` but got `node.Node`.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:246:50] Incompatible parameter type [6]: In call `SplitGroupConvolution._create_parameter_node_for_data`, for 3rd positional argument, expected `_C.Node` but got `node.Node`.
    [executorch/backends/nxp/aten_passes/split_group_convolution.py:254:20] Incompatible parameter type [6]: In call `SplitGroupConvolution._get_topologically_last_node`, for 1st positional argument, expected `List[node.Node]` but got `List[Union[_C.Node, node.Node]]`.
    [executorch/backends/nxp/tests/models.py:87:12] Incompatible parameter type [6]: In call `torch.nn.modules.conv.Conv3d.__init__`, for argument `kernel_size`, expected `Union[Tuple[int, int, int], int]` but got `Union[Tuple[int, int], int]`.
    [executorch/backends/nxp/tests/models.py:88:12] Incompatible parameter type [6]: In call `torch.nn.modules.conv.Conv3d.__init__`, for argument `stride`, expected `Union[Tuple[int, int, int], int]` but got `Union[Tuple[int, int], int]`.
    [executorch/backends/nxp/tests/models.py:89:12] Incompatible parameter type [6]: In call `torch.nn.modules.conv.Conv3d.__init__`, for argument `padding`, expected `Union[Tuple[int, int, int], int, str]` but got `Union[Collection[int], int, str]`.
    [executorch/backends/nxp/tests/models.py:90:12] Incompatible parameter type [6]: In call `torch.nn.modules.conv.Conv3d.__init__`, for argument `dilation`, expected `Union[Tuple[int, int, int], int]` but got `Union[Tuple[int, int], int]`.
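
The first two diagnostics above describe a common Pyre pattern: a variable declared with a non-Optional list type but assigned `None` as a default. A minimal, hypothetical sketch of the usual fix (the names here are illustrative, not the actual `neutron_aten_pass_manager.py` code):

```python
from typing import Callable, List, Optional

# Hypothetical reconstruction of the flagged pattern: the parameter is
# declared Optional so that None is a legal default, then normalized to a
# concrete list before use, which satisfies Pyre's
# "Incompatible variable type [9]" check.
def make_pass_list(
    passes: Optional[List[Callable[[object], object]]] = None,
) -> List[Callable[[object], object]]:
    if passes is None:
        passes = [lambda gm: gm]  # placeholder identity pass
    return passes
```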

@zingo
Collaborator

zingo commented Sep 2, 2025

Hi @shoumikhin, we'll look into it when we're back at work, but first, just double-checking: did you post this in the right PR?
The files you listed are in the NXP backend, while this PR only changed Arm backend code, which switched type checking to mypy rather than Pyre. But maybe the NXP code is using/calling passes/code from the Arm backend?

@shoumikhin
Contributor

@zingo indeed, they are related to a different PR, sorry!

@zingo
Collaborator

zingo commented Sep 2, 2025

Thanks for checking 🙏
