Issues: NVIDIA/TensorRT

Issues list

How to use custom ops (load xx.so for ONNX Runtime) with polygraphy run --onnxrt? [Module:Polygraphy, triaged]
#4487 opened Jun 12, 2025 by lzcchl
RFE: Support Ceil or Round operator on DLA [Module:Embedded, triaged]
#4486 opened Jun 11, 2025 by JoeCleary
Inference takes longer than in the previous version: failure of TensorRT 10.3 when running a TRT model on GPU Orin [internal-bug-tracked, Investigating, Module:Performance, triaged]
#4483 opened Jun 10, 2025 by JamesWang007
Export failure of TensorRT 10.11 when running scaled dot product on GPU A6000 [Module:ONNX, triaged, waiting for feedback]
#4482 opened Jun 5, 2025 by evolvingai
HuggingFace DETR model export fails [Module:Accuracy, triaged]
#4477 opened Jun 3, 2025 by geiche735
Failed to build the serialized network due to wrong shape inference for the LayerNormalization operator [Module:ONNX, triaged, waiting for feedback]
#4475 opened Jun 3, 2025 by coffezhou
IUnaryLayer cannot be used to compute a shape tensor [Module:ONNX]
#4474 opened May 30, 2025 by zhangzk0416
TensorRT produces wrong results when running a valid ONNX model on GPU 3080 [Module:Accuracy, triaged]
#4473 opened May 29, 2025 by coffezhou
TensorRT fails to infer the output shape for a valid ONNX model [Module:ONNX, triaged]
#4471 opened May 29, 2025 by coffezhou
"Internal Error: MyelinCheckException: gvn.cpp:318: CHECK(graph().ssa_validation()) failed." when building engine Module:Engine Build Issues with building TensorRT engines triaged Issue has been triaged by maintainers
#4468 opened May 28, 2025 by xjy1995
How can I adjust the position of quantization nodes to reduce data conversion? [Module:Engine Build, triaged]
#4467 opened May 27, 2025 by lzcchl
Detectron2 Faster R-CNN to TensorRT [Module:Engine Build, waiting for feedback]
#4466 opened May 26, 2025 by Kolkhoznyk
Why does the img2img diffusion task not have quantization support? How can it be made to work with quantization? [Module:Demo, triaged]
#4463 opened May 23, 2025 by varshith15
ConvNet FP8 support [Module:Performance, triaged]
#4461 opened May 23, 2025 by AnnaTrainingG
An inference error occurred after converting ONNX to TensorRT [Module:Accuracy, triaged]
#4460 opened May 22, 2025 by wangbiao0
Using multiple processes to execute two engine models in one program [Module:Accuracy, triaged]
#4459 opened May 21, 2025 by A-cvprogrammer
trtexec fails at the end if --saveEngine path is invalid or unwritable [Feature Request, Module:Samples, triaged]
#4448 opened May 18, 2025 by PierreMarieCurie
TensorRT plugin gets incorrect input data when integrated into the full model, but works fine in isolation [Module:Plugins, triaged]
#4440 opened May 13, 2025 by niubiplus2