
❓ [Question] How can I load a TensorRT model generated with trtexec? #1547

@javiabellan


❓ Question

How can I load a TensorRT model engine (.trt or .plan) generated with trtexec into PyTorch?

I have the following TensorRT engine, generated from an ONNX file using the trtexec tool provided by NVIDIA:

trtexec --onnx=../2.\ ONNX/CLIP-B32-image.onnx \
        --saveEngine=../4.\ TensorRT/CLIP-B32-image.trt \
        --minShapes=input:1x3x224x224 \
        --optShapes=input:1x3x224x224 \
        --maxShapes=input:32x3x224x224 \
        --fp16

I want to load it into PyTorch so that I can use PyTorch's DataLoader for fast batched inference.
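
For reference, here is roughly what I have in mind: a rough, untested sketch that deserializes the engine with the TensorRT Python runtime and runs it on CUDA tensors coming from a DataLoader. It assumes TensorRT 8.x bindings, a single input binding named input (as in the trtexec command above), a single output at binding index 1, and FP32 inputs/outputs (as far as I know trtexec keeps FP32 I/O by default even with --fp16). I am not sure whether this is the intended way to do it:

import tensorrt as trt
import torch

# Deserialize the engine that trtexec saved (path shortened here)
logger = trt.Logger(trt.Logger.WARNING)
with open("CLIP-B32-image.trt", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

def infer(images: torch.Tensor) -> torch.Tensor:
    # images: (N, 3, 224, 224) with N <= 32, matching the optimization profile above
    images = images.to("cuda", dtype=torch.float32).contiguous()

    # Set the actual input shape for this call (dynamic batch dimension)
    input_idx = engine.get_binding_index("input")
    context.set_binding_shape(input_idx, tuple(images.shape))

    # Allocate the output on the GPU; binding 1 is assumed to be the only output
    out_shape = tuple(context.get_binding_shape(1))
    output = torch.empty(out_shape, dtype=torch.float32, device="cuda")

    # Bindings are raw device pointers, ordered by binding index
    bindings = [int(images.data_ptr()), int(output.data_ptr())]
    stream = torch.cuda.current_stream()
    context.execute_async_v2(bindings, stream_handle=stream.cuda_stream)
    stream.synchronize()
    return output

Each batch produced by a torch.utils.data.DataLoader would then be passed to infer(). Is something like this the recommended approach, or does Torch-TensorRT provide a wrapper for engines built outside of it?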
