tensorrt8.2.3 inference time is 5ms, but 8.4.3 inference time is 80ms #2377

@ywfwyht

Description


Hi, guys.
After converting the ONNX model linked below to a TensorRT engine, inference takes 5 ms with TensorRT 8.2.3 but 80 ms with TensorRT 8.4.3.

Environment

TensorRT Version: 8.4.3.1
NVIDIA GPU: 3080Ti
NVIDIA Driver Version: 515
CUDA Version: 11.6
CUDNN Version: 8.4
Operating System: ubuntu18.04
Python Version (if applicable): 3.8.8
Tensorflow Version (if applicable):
PyTorch Version (if applicable): 1.11
Baremetal or Container (if so, version):

Relevant Files

https://github.com/ywfwyht/onnx_model/blob/main/0903_p28_t3_seg_simp.onnx

Steps To Reproduce
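The report does not include the exact commands used, but a minimal reproduction sketch (assuming `trtexec` from the TensorRT distribution is on the PATH and the linked ONNX file has been downloaded locally as `0903_p28_t3_seg_simp.onnx`) might look like:

```shell
# Build a TensorRT engine from the ONNX model (file name taken from the link above).
trtexec --onnx=0903_p28_t3_seg_simp.onnx --saveEngine=model.engine

# Time inference with the built engine; trtexec reports mean/median GPU latency.
trtexec --loadEngine=model.engine --iterations=100
```

Running both steps once under TensorRT 8.2.3 and once under 8.4.3 and comparing the reported latencies should reproduce the regression described above.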

Labels

triaged (issue has been triaged by maintainers)
