TF2 SavedModel to ONNX conversion fails for ConvLSTM2D layer #1222

@rttus

Description

Describe the bug
I am trying to convert a TF2 SavedModel to ONNX using

python3 -m tf2onnx.convert --saved-model <model_dir_path> --opset 13 --output <output_file_name>

The conversion fails with the stack trace below:

WARNING - ONNX Failed to infer shapes and dtypes for [functional_5/functional_3/conv_lst_m2d/while/TensorArrayV2Read/TensorListGetItem__17, type: Unsqueeze]
Traceback (most recent call last):
File "/.local/lib/python3.6/site-packages/tf2onnx/schemas.py", line 157, in infer_onnx_shape_dtype
inferred_model = shape_inference.infer_shapes(model_proto)
File "/.local/lib/python3.6/site-packages/onnx/shape_inference.py", line 35, in infer_shapes
inferred_model_str = C.infer_shapes(model_str, check_type)
RuntimeError: input 1 is out of bounds

ERROR - Failed to convert node 'functional_5/functional_3/conv_lst_m2d_2/while/BiasAdd_3' (fct=<bound method ConvOp.version_11 of <class 'tf2onnx.onnx_opset.nn.ConvOp'>>)
'OP=Conv\nName=functional_5/functional_3/conv_lst_m2d_2/while/BiasAdd_3\nInputs:\n\tfunctional_5/functional_3/conv_lst_m2d_2/while/TensorArrayV2Read/TensorListGetItem__72:0=Squeeze, None, 1\n\tfunctional_5/functional_3/conv_lst_m2d_2/while/split:3=Split, [1, 1, 128, 10], 1\n\tfunctional_5/functional_3/conv_lst_m2d_2/while/split_2:3=Split, [10], 1\nOutpus:\n\tfunctional_5/functional_3/conv_lst_m2d_2/while/BiasAdd_3:0=[1, 11, 20, 10], 1'
Traceback (most recent call last):
File "/.local/lib/python3.6/site-packages/tf2onnx/tfonnx.py", line 287, in tensorflow_onnx_mapping
func(g, node, **kwargs)
File "/.local/lib/python3.6/site-packages/tf2onnx/onnx_opset/nn.py", line 367, in version_11
cls.version_1(ctx, node, **kwargs)
File "~/.local/lib/python3.6/site-packages/tf2onnx/onnx_opset/nn.py", line 353, in version_1
if len(input_shape) == spatial + 1:
TypeError: object of type 'NoneType' has no len()
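The final TypeError is a downstream symptom: ONNX shape inference fails for the tensors inside the ConvLSTM2D while loop (the "Failed to infer shapes and dtypes" warning above), so the Conv handler receives input_shape=None and crashes when it calls len(). A standalone sketch of that failure mode (illustrative only, not the actual tf2onnx source):

```python
# Standalone sketch: input_shape ends up None when upstream shape
# inference fails, and len(None) raises the TypeError seen above.
input_shape = None   # shape tf2onnx could not infer for the loop tensor
spatial = 2          # two spatial dims for a 2D convolution

try:
    needs_batch_dim = len(input_shape) == spatial + 1
except TypeError as exc:
    print(exc)  # object of type 'NoneType' has no len()

# A None-guard avoids the crash (a fix sketch, not the project's actual patch):
needs_batch_dim = input_shape is not None and len(input_shape) == spatial + 1
print(needs_batch_dim)  # False
```

This only silences the symptom, of course; the underlying issue is that shape inference does not propagate into the loop body generated for ConvLSTM2D.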

Urgency
None

System information

  • OS Platform and Distribution: Ubuntu 18.04.5 LTS
  • TensorFlow Version: 2.3.0
  • ONNX Version: 1.8.0
  • tf2onnx Version: 1.7.2/995bd6
  • Python version: 3.6

To Reproduce
I will not be able to share the model or weights.
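Since the actual model cannot be shared, a hypothetical stand-in containing a ConvLSTM2D layer may reproduce a similar failure. The shapes below are guesses inferred from the error log (bias of size 10 and output [1, 11, 20, 10]), not the reporter's real architecture:

```python
# Hypothetical minimal model with a ConvLSTM2D layer; shapes are
# guessed from the log output, not taken from the actual model.
import tensorflow as tf

inputs = tf.keras.Input(shape=(None, 11, 20, 3))  # (time, height, width, channels)
x = tf.keras.layers.ConvLSTM2D(filters=10, kernel_size=(1, 1))(inputs)
model = tf.keras.Model(inputs, x)

y = model(tf.zeros((1, 5, 11, 20, 3)))  # dummy forward pass
print(y.shape)  # (1, 11, 20, 10), matching the BiasAdd_3 output in the log

# model.save("convlstm_savedmodel")  # then run the tf2onnx command above
```

Saving this model as a SavedModel and running the same tf2onnx.convert command should exercise the same while-loop lowering path.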

Expected behavior
Generate an ONNX file at the location given by the --output parameter.

