TensorRT 6.0 ONNX_Parser doesn't support the ONNX model exported by PyTorch 1.3.1 #376
The TensorRT ONNX parser does not seem fully compatible with the newer PyTorch versions (1.3 or 1.4).
I have a Jetson TX2 (JetPack 4.3, TensorRT 6) on which I deploy my model.
However, building the engine fails with the following error:
There are already some issues related to this error (e.g. #319, #286), but they did not take the PyTorch version into account, so I am pointing it out here. I am leaving this issue open as a reminder for anyone who hits the same problem but doesn't know how to solve it.
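For anyone trying to see the parser's full error output rather than just the engine-build failure, a minimal sketch of parsing an ONNX file with TensorRT 6's Python bindings follows; `model.onnx` is a placeholder name, not my actual model, and this requires a machine with TensorRT installed:

```python
# Sketch: parse an ONNX file and print all parser errors.
# Assumes TensorRT 6+ Python bindings are installed; "model.onnx" is a placeholder.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
# TensorRT 6 introduced the explicit-batch network flag, which the ONNX parser needs.
EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)

with trt.Builder(TRT_LOGGER) as builder, \
     builder.create_network(EXPLICIT_BATCH) as network, \
     trt.OnnxParser(network, TRT_LOGGER) as parser:
    with open("model.onnx", "rb") as f:
        if not parser.parse(f.read()):
            # Print every parser error instead of failing silently later
            # during engine construction.
            for i in range(parser.num_errors):
                print(parser.get_error(i))
```

This surfaces the unsupported-operator or unsupported-opset message directly at parse time.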
At first I thought switching to opset 7 might help, as mentioned in the TensorRT documentation.
Then, I tried different PyTorch versions.
When I used PyTorch 1.3.1, the problem was still there, and the size of the exported model was 13,599 KB.
Therefore, using PyTorch 1.2 may help you solve the problem (1.1 as well).
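Assuming a pip-managed environment, the downgrade looks like the following; the torchvision pin is my assumption of the release that matched torch 1.2.0 at the time:

```shell
# Downgrade to a PyTorch release whose ONNX export TensorRT 6 parses.
# torchvision==0.4.0 is the companion release for torch==1.2.0 (assumption).
pip install torch==1.2.0 torchvision==0.4.0
```

After downgrading, re-export the ONNX model before rebuilding the TensorRT engine.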
TensorRT Version: 6.0.1