[Arm] Support INT16 MUL ops with TOSA reference model run #13947

@Ninja91

Description

With #13795, we added INT16 mul op support in quantization and added tests for TOSA, U55 & U85, which, as expected, fail. This task is to build support through the entire toolstack.

Remove the xfail here: https://github.com/pytorch/executorch/blob/main/backends/arm/test/ops/test_mul.py#L343-L348 (and the corresponding U85 test), then run the tests to see them fail at Vela compilation.
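For context, the xfail pattern to remove looks roughly like the sketch below. The test name matches the failure output; the reason string and test body are placeholders for illustration, not the actual contents of test_mul.py.

```python
import pytest

# Hypothetical sketch of the marker to delete; the reason string and the
# test body are placeholders, not the real code in test_mul.py.
@pytest.mark.xfail(reason="INT16 mul not yet supported through Vela")
def test_mul_tensor_16a8w_u55_INT16():
    raise NotImplementedError  # stand-in for the real pipeline test

# Removing the @pytest.mark.xfail line turns the expected failure into a
# real failure, surfacing the ValueError shown in the output below.
```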

```
=================================== FAILURES ===================================
___ test_mul_tensor_16a8w_u55_INT16[op_mul_rank4_randn_mutltiple_broadcasts] ___

E ValueError: Expected tensor tosa_transpose_default_1 in aten.repeat.default to have one of the following dtypes: ['INT8', 'INT32', 'FP32'], got: INT16
```
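The ValueError comes from a dtype check in the TOSA serialization path. Below is a minimal sketch of that kind of check, reconstructed from the error message; the function name and constant are assumptions for illustration, not the real serializer code.

```python
# Allowed dtypes per the error message; INT16 is missing, which is the gap
# this issue asks to close through the toolstack.
ALLOWED_DTYPES = ["INT8", "INT32", "FP32"]

def check_dtype(tensor_name: str, op_name: str, dtype: str) -> None:
    """Hypothetical validator mirroring the error above (names are assumed)."""
    if dtype not in ALLOWED_DTYPES:
        raise ValueError(
            f"Expected tensor {tensor_name} in {op_name} to have one of the "
            f"following dtypes: {ALLOWED_DTYPES}, got: {dtype}"
        )
```

Extending INT16 support end-to-end would mean the serializer (and Vela) accepting INT16 where this kind of check currently rejects it.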

cc: @per @digantdesai @gggekov @3l1
