Python converted trt used in C++ #16
Comments
Hi Wallace00V,

Thanks for reaching out! Yes, you can, by serializing the TensorRT engine and executing it with the TensorRT C++ runtime. Assume you optimized your model by calling

```python
model_trt = torch2trt(model, [data])
```

Then serialize the engine with

```python
with open('model_trt.engine', 'wb') as f:
    f.write(model_trt.engine.serialize())
```

This serialized engine may be used with the TensorRT C++ runtime as described in the TensorRT Developer Guide. If you're unfamiliar with using TensorRT directly, please let me know and I'd be happy to help.

Best,
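A minimal sketch of the C++ side, using the `nvinfer1` runtime API to deserialize the engine file written by the Python snippet above. The file name `model_trt.engine` matches that snippet; the exact `deserializeCudaEngine` signature varies between TensorRT versions (older releases take an extra plugin-factory argument), so treat this as a sketch, not a drop-in implementation.

```cpp
// Load a serialized TensorRT engine with the C++ runtime.
#include <NvInfer.h>
#include <fstream>
#include <iostream>
#include <vector>

// TensorRT requires a logger implementation.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    // Read the serialized engine produced in Python.
    std::ifstream file("model_trt.engine", std::ios::binary | std::ios::ate);
    std::streamsize size = file.tellg();
    file.seekg(0, std::ios::beg);
    std::vector<char> blob(size);
    file.read(blob.data(), size);

    Logger logger;
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    if (!engine) {
        std::cerr << "engine deserialization failed" << std::endl;
        return 1;
    }
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();
    // ... allocate device buffers for the bindings and run inference ...
    (void)context;
    return 0;
}
```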
@jaybdub
@jaybdub All right, I figured it out, just like the following:
@jaybdub
How can I fix that?
I have the same error, which occurs when deserializing the engine in C++: ERROR: Cannot deserialize plugin interpolate.
I have the same error when deserializing the engine in C++. Did you solve the problem? @TheLittleBee @donghoonsagong
Same problem; maybe it needs a pluginFactory. Have you solved the problem?
Same problem, I believe: [TensorRT] ERROR: getPluginCreator could not find plugin interpolate version 1 namespace torch2trt
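For what it's worth, a `getPluginCreator could not find plugin ...` error usually means the plugin creator was never registered in the C++ process before deserialization. One common workaround is to `dlopen` the shared library whose static initializers register the plugin with TensorRT's plugin registry. The library name `libtorch2trt.so` below is an assumption; substitute whichever library actually contains the interpolate plugin in your build.

```cpp
// Sketch: load a plugin shared library so its creators register with
// TensorRT's plugin registry before the engine is deserialized.
#include <dlfcn.h>
#include <cstdio>

bool load_plugins(const char* path) {
    // RTLD_GLOBAL exposes the library's symbols process-wide; the library's
    // static initializers register its plugin creators as a side effect.
    void* handle = dlopen(path, RTLD_LAZY | RTLD_GLOBAL);
    if (!handle) {
        std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return false;
    }
    return true;
}

// Call load_plugins("libtorch2trt.so") before
// runtime->deserializeCudaEngine(...) so the interpolate plugin resolves.
```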
@skyler1253 @xieydd @Raneee @donghoonsagong @TheLittleBee @Wallace00V @jaybdub @crazybob23 @zbz-cool
I converted the model from PyTorch to TensorRT (saved with torch.save as a .pth file). Can I use it in C++? How?