Closed
Labels
Module:Embedded (issues when using TensorRT on embedded platforms), triaged (Issue has been triaged by maintainers)
Description
I am trying to use TensorRT for an instance segmentation application. I saved the built engine, but when I subsequently try to load the saved .engine file it fails with "Deserialization Failed. Internal error. Magic Tag assertion failed."
Interestingly, if I do not save the engine first but instead build it and run inference in the same session, no error is thrown.
For the record, my environment is the same when building and saving the engine as when running inference.
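For reference, a minimal sketch of the save/load flow being described, assuming the standard TensorRT 8.x Python API; the network and builder config setup is omitted, and the helper names and the model.engine path are hypothetical:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def save_engine(serialized_engine, path="model.engine"):
    # Write the serialized engine bytes exactly as produced by the builder.
    # Binary mode ("wb") matters: any change to the bytes on disk will trip
    # TensorRT's magic-tag check when the file is deserialized later.
    with open(path, "wb") as f:
        f.write(serialized_engine)

def load_engine(path="model.engine"):
    # Read the raw bytes back in binary mode and hand them to the runtime.
    with open(path, "rb") as f:
        engine_data = f.read()
    runtime = trt.Runtime(TRT_LOGGER)
    return runtime.deserialize_cuda_engine(engine_data)

# Typical build step (network and config construction elided):
# builder = trt.Builder(TRT_LOGGER)
# serialized = builder.build_serialized_network(network, config)
# save_engine(serialized, "model.engine")
# engine = load_engine("model.engine")
```

Note that the engine file must be produced and consumed by the same TensorRT version on the same platform; deserializing an engine built by a different TensorRT build is one known way to hit the magic-tag assertion.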
Environment
JetPack 4.6 on Xavier NX with CUDA 10.2, Ubuntu 18.04, TensorFlow 1.15, Python 3.6, and TensorRT 8.2