Failed to import TRTEngineOp #26525
Comments
@lu814478913 Could you provide more details on the bug and the steps leading to it? It would be helpful if you could share code to reproduce the bug. Thanks!
Thanks for your reply. I just use tf.saved_model.loader.load to load the TF-TRT converted model, but it does not work. I have to explicitly import tftrt to load the model successfully.
Hi @lu814478913, that is the intended behavior. TRT support is optional in the TF GPU pip package, meaning you don't need to install TRT before you can use TF GPU. So, to avoid loading a possibly non-existent TRT library, we don't import tftrt when `import tensorflow` runs. In the future, when we switch to dynamic loading of the TRT library, this problem will be solved; for now, to load/run a tftrt converted model you need to import the tftrt module in addition to importing tensorflow. Thanks.
I got the same issue, how to solve it?
@aaroey
@Eloring That should be the only extra step in your python script.
I'll close this issue for now, please let me know if it still doesn't work.
add import: `import tensorflow.contrib.tensorrt as trt`
Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub.
System information
You can collect some of this information using our environment capture script
You can also obtain the TensorFlow version with
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
Describe the current behavior
Failed to load a TF-TRT converted SavedModel.
But when I add an extra import statement, the original code works. I must import tensorflow.contrib.tensorrt as trt, even though this is an unused import in my code.
Describe the expected behavior
Code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate the problem.
tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], tensorrt_saved_mode_path)
Other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.