
TensorRT 9's Python bindings don't work correctly #3741

@bashirmindee

Description


The Python bindings of TensorRT 9.2 and 9.3 load the wrong version of the TensorRT shared libraries (version 8 instead of version 9).

Environment

TensorRT Version: 9.3 installed from pre-compiled binary

NVIDIA GPU: 3070

NVIDIA Driver Version: 545.23.08

CUDA Version: 12.2

CUDNN Version: 8.9.7.29

Operating System:

Python Version (if applicable): 3.8

Model link:

https://github.com/onnx/models/tree/main/validated/vision/classification/resnet
https://github.com/onnx/models/blob/main/validated/vision/classification/resnet/model/resnet18-v1-7.tar.gz

Steps To Reproduce

After downloading the precompiled TensorRT 9.3 package found under the release/9.3 branch, I extracted the files and installed the Python bindings:

pip install TensorRT-9.3.0.1/python/tensorrt-9.3.0.post12.dev1-cp38-none-linux_x86_64.whl
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/myhome/.../TensorRT-9.3.0.1/lib

Using ResNet as an example:

wget https://github.com/onnx/models/raw/main/validated/vision/classification/resnet/model/resnet18-v1-7.tar.gz
tar -xzf resnet18-v1-7.tar.gz

Afterwards I run the model with the TensorRT execution provider through ONNX Runtime:

import onnxruntime as rt

sess_options = rt.SessionOptions()

providers = [
    ("TensorrtExecutionProvider", {}),  # using the TensorRT 9 provider produces the error below
]

session = rt.InferenceSession(
    "resnet18-v1-7/resnet18-v1-7.onnx",
    sess_options,
    providers=providers,
)

The error I get:
2024-03-26 14:24:54.342734006 [E:onnxruntime:Default, provider_bridge_ort.cc:1534 TryGetProviderInfo_TensorRT] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1209 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_tensorrt.so with error: libnvinfer.so.8: cannot open shared object file: No such file or directory
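To confirm which sonames the dynamic loader can actually resolve on a given machine, a small ctypes sketch can help (the library names below are taken from the error message; this is a diagnostic aid, not part of the original report):

```python
import ctypes

def loadable(names):
    """Return the subset of shared-library names the dynamic loader can open."""
    found = []
    for name in names:
        try:
            ctypes.CDLL(name)
            found.append(name)
        except OSError:
            pass
    return found

# The provider requests the .so.8 soname, while the 9.3 tarball ships .so.9.
print(loadable(["libnvinfer.so.8", "libnvinfer.so.9"]))
```

On the setup described above, only `libnvinfer.so.9` should be resolvable once LD_LIBRARY_PATH is set, which matches the error.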

Almost all the files under TensorRT-9.3.0.1/lib end with .9, which indicates that they are correctly built and correspond to version 9. However, the Python binding is not configured to use them: at runtime it still looks for the .so.8 sonames.
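As a quick way to verify the shipped soname versions, here is a sketch that scans a lib directory and reports the major version suffix of each library (the directory path in the commented example is the one from this report):

```python
from pathlib import Path

def soname_versions(lib_dir):
    """Map each library base name (e.g. 'libnvinfer') to the set of
    major-version suffixes found in lib_dir, e.g. {'9'}."""
    versions = {}
    for path in Path(lib_dir).glob("lib*.so.*"):
        base, _, version = path.name.partition(".so.")
        major = version.split(".")[0]
        versions.setdefault(base, set()).add(major)
    return versions

# Example (path taken from this report):
# print(soname_versions("TensorRT-9.3.0.1/lib"))
```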

I tried renaming the following files:
libnvinfer.so.9 -> libnvinfer.so.8
libnvinfer_plugin.so.9 -> libnvinfer_plugin.so.8
libnvonnxparser.so.9 -> libnvonnxparser.so.8

And this solved the problem!
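Instead of renaming the files (which removes the .so.9 names that other consumers may need), the same workaround can be done with compatibility symlinks; a minimal sketch, assuming the lib directory layout above:

```python
import os
from pathlib import Path

def add_v8_symlinks(lib_dir):
    """For each lib*.so.9 in lib_dir, create a lib*.so.8 symlink pointing
    at it, so loaders that request the old soname still resolve."""
    created = []
    for target in Path(lib_dir).glob("lib*.so.9"):
        link = target.with_name(target.name[: -len(".9")] + ".8")
        if not link.exists():
            os.symlink(target.name, link)  # relative link inside lib_dir
            created.append(link.name)
    return created

# Example (path taken from this report):
# add_v8_symlinks("TensorRT-9.3.0.1/lib")
```

Symlinking keeps both sonames available side by side, which is closer to what distributions normally ship.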

Can you please fix the bindings for TensorRT 9 and publish the wheels to PyPI?

Commands or scripts:

Have you tried the latest release?:

Can this model run on other frameworks?
Yes, this model loads without any problem with the CUDA backend ("CUDAExecutionProvider", {}).

Metadata

Labels: triaged (Issue has been triaged by maintainers)