
Shape inference on modules returns ? #415

@denolf

Description

Hi,

Even for a module with just a single conv2d layer, shape inference returns only question marks:
func @forward(%arg0: !torch.vtensor<[1,1,64,64],f32>) -> !torch.vtensor<[?,?,?,?],f32>

even though @annotate_args was used to specify the shape of the input tensor.

Is there a pass available that can resolve these shapes? Currently the torchscript-module-to-torch-backend-pipeline is used.

The python code for the small test is attached.
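For readers without the attachment, here is a minimal sketch of such a test module in plain PyTorch. The class name, layer sizes, and the commented-out torch-mlir decorators are assumptions for illustration, not the attached code:

```python
import torch

# Hypothetical reconstruction of the kind of test described above
# (cnv1TorchMLIR.py.txt is not inlined here); names and sizes are assumptions.
class Conv2dModule(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Single conv2d layer: 1 input channel, 1 output channel, 3x3 kernel.
        self.conv = torch.nn.Conv2d(1, 1, 3)

    # With torch-mlir installed, the forward method would carry the
    # annotations mentioned above, roughly:
    #   @export
    #   @annotate_args([None, ([1, 1, 64, 64], torch.float32, True)])
    def forward(self, x):
        return self.conv(x)

module = Conv2dModule()
out = module(torch.ones(1, 1, 64, 64))
# A 3x3 kernel with no padding shrinks each spatial dim by 2: 64 -> 62.
print(tuple(out.shape))
```

Eagerly executing the module shows the concrete output shape that one would hope shape inference could recover statically, given the annotated [1, 1, 64, 64] input.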

Additionally, what is the decomposition of torchscript-module-to-torch-backend-pipeline into individual passes? Running symbol-dce,torch-prepare-for-globalize-object-graph,torch-globalize-object-graph,symbol-dce,inline,torch-adjust-calling-conventions,torch-inline-global-slots does not produce the same output.

Thanks

Kristof
cnv1TorchMLIR.py.txt
