
Failed to import TRTEngineOp #26525

Closed

luneew opened this issue Mar 10, 2019 · 8 comments
Labels
comp:ops OPs related issues type:bug Bug

Comments

luneew commented Mar 10, 2019


System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): custom code
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 1.13.1
  • Python version: 3.6.3
  • CUDA/cuDNN version: 9.0
  • GPU model and memory: Tesla P4

You can collect some of this information using our environment capture script
You can also obtain the TensorFlow version with
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"

Describe the current behavior
Failed to load a TF-TRT (TensorRT) converted SavedModel.

    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/saved_model/loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/saved_model/loader_impl.py", line 420, in load
    **saver_kwargs)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/saved_model/loader_impl.py", line 350, in load_graph
    meta_graph_def, import_scope=import_scope, **saver_kwargs)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/training/saver.py", line 1457, in _import_meta_graph_with_return_elements
    **kwargs))
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/meta_graph.py", line 806, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 399, in import_graph_def
    _RemoveDefaultAttrs(op_dict, producer_op_list, graph_def)
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 159, in _RemoveDefaultAttrs
    op_def = op_dict[node.op]
KeyError: 'TRTEngineOp'

But when I add an extra import statement, the original code works. I must import tensorflow.contrib.tensorrt as trt, even though it is an unused import in my code.
Describe the expected behavior

Code to reproduce the issue
Provide a reproducible test case that is the bare minimum necessary to generate the problem.
tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], tensorrt_saved_mode_path)
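A minimal sketch of the workaround described in this issue (the helper name and the path `saved_model_dir` are placeholders; the TF 1.x session API follows the repro line above). The contrib import is unused by name — its side effect registers TRTEngineOp so the loader can resolve the op:

```python
def load_trt_saved_model(saved_model_dir):
    # Sketch, assuming TF 1.x with TF-TRT support installed.
    import tensorflow as tf
    # Unused by name, but importing it registers TRTEngineOp in the op
    # registry; without it, loading raises KeyError: 'TRTEngineOp'.
    import tensorflow.contrib.tensorrt as trt  # noqa: F401
    with tf.Session() as sess:
        return tf.saved_model.loader.load(
            sess, [tf.saved_model.tag_constants.SERVING], saved_model_dir)
```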
Other info / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.

@jvishnuvardhan jvishnuvardhan self-assigned this Mar 11, 2019
@jvishnuvardhan jvishnuvardhan added comp:ops OPs related issues type:bug Bug stat:awaiting response Status - Awaiting response from author labels Mar 11, 2019
jvishnuvardhan (Contributor) commented
@lu814478913 Could you provide more details on the bug and the steps leading to it? It would be helpful if you could share code to reproduce the bug. Thanks!

luneew (Author) commented Mar 12, 2019

Thanks for your reply. I just use tf.saved_model.loader.load to load the TF-TRT converted model, but it does not work. I have to explicitly import tftrt to load the model successfully.

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Status - Awaiting response from author label Mar 13, 2019
@jvishnuvardhan jvishnuvardhan added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Mar 20, 2019
@aaroey aaroey assigned aaroey and unassigned k-w-w Apr 25, 2019
aaroey (Member) commented Apr 25, 2019

Hi @lu814478913,

That is the intended behavior. TRT support is optional in the TF GPU pip package, meaning you don't need TRT installed to use TF on GPU. So, to avoid loading a possibly nonexistent TRT library, we don't import tftrt when import tensorflow is called.

In the future, when we switch to dynamic loading of the TRT library, this problem will go away; for now, to load or run a tftrt-converted model, you need to import the tftrt module in addition to import tensorflow.

Thanks.
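The failure in the traceback above is a registry lookup: _RemoveDefaultAttrs looks up node.op in an op dictionary, and TRTEngineOp only enters that dictionary as a side effect of the tftrt import. A pure-Python analogue of this register-on-import pattern (all names here are illustrative, not TensorFlow's actual internals):

```python
# Illustrative analogue of why the extra import is needed: ops live in a
# registry that modules populate as an import side effect, and loading a
# graph fails if the lookup happens before the registering import.
op_registry = {}

def register_op(name):
    op_registry[name] = object()  # stand-in for the op's definition

def fake_import_tftrt():
    # Simulates `import tensorflow.contrib.tensorrt`: the import's side
    # effect is registering the TRT engine op.
    register_op("TRTEngineOp")

def load_node(op_name):
    return op_registry[op_name]  # raises KeyError for unregistered ops

try:
    load_node("TRTEngineOp")
except KeyError as e:
    print("KeyError:", e)  # prints: KeyError: 'TRTEngineOp'

fake_import_tftrt()
load_node("TRTEngineOp")  # succeeds once the registering import has run
print("loaded after import")
```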

@tensorflowbutler tensorflowbutler removed the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Apr 26, 2019
EsmeYi commented Jun 28, 2019

I got the same issue. How do I solve it?

but for now if we want to load/run a tftrt converted model we'll need to import the tftrt module in addition to import tensorflow.

@aaroey
What should be done after importing tftrt module?

aaroey (Member) commented Jul 10, 2019

@Eloring That should be the only extra step in your Python script.

aaroey (Member) commented Jul 10, 2019

I'll close this issue for now, please let me know if it still doesn't work.

@aaroey aaroey closed this as completed Jul 10, 2019

Tuanlase02874 commented

Add the import:
from tensorflow.python.compiler.tensorrt import trt_convert as trt
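For TF versions where contrib was removed, the import above is the registering one. A sketch of the same fix in that setting (the helper name and path are placeholders; the loader call assumes the TF 2.x tf.saved_model.load API):

```python
def load_trt_saved_model_v2(saved_model_dir):
    # Sketch, assuming TF >= 1.13 with TensorRT support available.
    import tensorflow as tf
    # Unused by name; importing it registers TRTEngineOp before the
    # SavedModel loader resolves the graph's ops.
    from tensorflow.python.compiler.tensorrt import trt_convert as trt  # noqa: F401
    return tf.saved_model.load(saved_model_dir)
```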
