
[Bug] ORT inference outputs None value #25373

Open
@gcunhase

Description


Describe the issue

Adding an additional consumer to a TopK output causes the model output to be returned as None.

Note that the only difference in the failing model is an additional consumer (/Div_17_output_0) of the output tensor /TopK_5_output_1:

[Image: graph comparison showing the additional consumer of /TopK_5_output_1]

To reproduce

ONNX files: onnx_models.zip

Python script:

import numpy as np
import onnx
import onnxruntime as ort

onnx_path = "far3d_opset17_ISSUE_smaller_ort.onnx"
# onnx_path = "far3d_opset17_OKAY_smaller_ort.onnx"
providers = ["TensorrtExecutionProvider", "CPUExecutionProvider"]


def load_dummy_data(input_info):
    """Generates random input data matching each input's shape and dtype."""
    data = {}
    for inp_name, inp_val in input_info.items():
        data[inp_name] = np.random.rand(*inp_val["shape"]).astype(inp_val["dtype"])
    return data


def onnx_type_str_to_enum(dtype: str) -> int:
    """Converts ONNX type in string format to onnx.TensorProto format."""
    dtype = dtype.split("tensor(")[-1].split(")")[0]
    dtype = "FLOAT" if dtype == "float32" else dtype.upper()
    return getattr(onnx.TensorProto, dtype)


session_opts = ort.SessionOptions()
session_opts.log_severity_level = 3  # just errors
sess = ort.InferenceSession(onnx_path, sess_options=session_opts, providers=providers)

input_info = {}
for inp in sess.get_inputs():
    inp_shape = [1 if isinstance(s, str) else s for s in inp.shape]
    input_info[inp.name] = {
        "shape": inp_shape,
        "dtype": onnx.helper.tensor_dtype_to_np_dtype(onnx_type_str_to_enum(inp.type))
    }

feed_dict = load_dummy_data(input_info=input_info)
inference_outputs = sess.run(None, feed_dict)

output_info = sess.get_outputs()
has_none_out = False
for out_info, inf_out in zip(output_info, inference_outputs):
    if inf_out is None:
        has_none_out = True
        print(f"Output '{out_info.name}' = None!")

if not has_none_out:
    print("All outputs are valid!")

Running the script with far3d_opset17_ISSUE_smaller_ort.onnx:

Output 'scores' = None!

Running the script with far3d_opset17_OKAY_smaller_ort.onnx:

All outputs are valid!

The expectation is that both models print the "All outputs are valid!" message.

Urgency

Urgent, blocking updates in https://github.com/NVIDIA/TensorRT-Model-Optimizer

Platform

Linux

OS Version

Ubuntu 22.04

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.22.0

ONNX Runtime API

Python

Architecture

X64

Execution Provider

TensorRT

Execution Provider Library Version

No response
