Operations used for inference are dropped by optimize_for_inference #8242
Comments
I temporarily patched this issue in my case by replacing the output identity with `l_output = tf.multiply(l_dense, 1., name='output')`.
Just ran into this issue myself, but I feel like the bigger issue is that using […]. Also I'd imagine that using […].
I'm not sure of the underlying issue here, but I'm hoping that the new graph transform approach to removing unused nodes might be more robust?
Since there's been no activity on this one for several weeks, I'm closing this for now. Please reopen with more information if this is incorrect.
I ran into the same issue with TensorFlow 1.2.1. I optimized the default tiny-yolo-voc model from darkflow and the output node disappeared.
Definitely a bug in both the strip_unused and optimize_for_inference tools. Using transform_graph fixed this. See my answer at https://stackoverflow.com/questions/48212068/error-using-model-after-using-optimize-for-inference-py-on-frozen-graph/48638586#48638586 for details.
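For reference, the `transform_graph` workaround mentioned above can be invoked roughly like this. The graph paths and the `input`/`output` node names are placeholders for your own model, and the transform list is the commonly suggested default, so adjust it as needed:

```shell
# Sketch of a transform_graph invocation (paths and node names are
# placeholders; run from the TensorFlow source tree).
bazel build tensorflow/tools/graph_transforms:transform_graph
bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
  --in_graph=graph_frozen.pb \
  --out_graph=graph_optimized.pb \
  --inputs='input' \
  --outputs='output' \
  --transforms='strip_unused_nodes fold_constants(ignore_errors=true) fold_batch_norms fold_old_batch_norms'
```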
Inspired by the TensorFlow for Poets post, I have been exporting models optimized for inference with the `freeze_graph` and `optimize_for_inference` tools. I have run into an issue where some of the nodes required for inference get dropped by `optimize_for_inference`, the most critical one being the output node, even though it was explicitly given to `freeze_graph` and `optimize_for_inference` (through the `output_node_names`/`output_names` flags). I think that might be related to the output node being a `tf.identity` (used, for example, to give an explicit name to the result of a `tf.layers` call).

Minimal working example

Here is a piece of code to create a very simple model, running on TensorFlow v1.0.1.
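The failure mode itself can be illustrated without TensorFlow. The toy pass below is not the real `optimize_for_inference` code, just a sketch of what folding `Identity` nodes does: consumers get rewired to the identity's input and the identity is deleted, so a graph whose named output node is an `Identity` loses exactly that node:

```python
# Toy illustration (NOT the real optimize_for_inference implementation).
# A graph is a dict of {node_name: (op, [input_names])}. Folding every
# Identity node rewires its consumers to its input and deletes it -- so a
# named output node that happens to be an Identity vanishes.
def fold_identities(graph):
    folded = dict(graph)
    for name, (op, inputs) in graph.items():
        if op == 'Identity':
            src = inputs[0]
            del folded[name]
            # Rewire every remaining consumer of `name` to `src`.
            for n, (o, ins) in folded.items():
                folded[n] = (o, [src if i == name else i for i in ins])
    return folded

# Node names mirror the ones mentioned in this report.
graph = {
    'input': ('Placeholder', []),
    'dense/BiasAdd': ('BiasAdd', ['input']),
    'output': ('Identity', ['dense/BiasAdd']),  # the explicitly named output
}
optimized = fold_identities(graph)
print(sorted(optimized))  # ['dense/BiasAdd', 'input'] -- 'output' is gone
```

Nothing consumes the output identity, so after folding there is no node left that carries the name the caller passed as the output name.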
I am exporting the model using the `freeze_graph` and `optimize_for_inference` tools, inspired by the TensorFlow for Poets post.

I am using Python to load both of these models (`graph_frozen.pb` and `graph_optimized.pb`). The model defined by `graph_frozen.pb` works as expected, but the model defined by `graph_optimized.pb` is missing some operations (`import/dense/BiasAdd` and `import/output`).

Environment info
- TensorFlow 1.0.1 (`tensorflow-1.0.1-cp27-cp27m-macosx_10_11_x86_64.whl`)
- Built `freeze_graph` and `optimize_for_inference` with bazel (version 0.4.3-homebrew) at 100552f
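For completeness, the export pipeline described in this report can be sketched as follows. The graph, checkpoint, and node names are placeholders taken from the description above, not the exact commands from the original post:

```shell
# Sketch of the freeze + optimize pipeline (paths and the input/output
# node names are placeholders; run from the TensorFlow source tree).
bazel build tensorflow/python/tools:freeze_graph \
            tensorflow/python/tools:optimize_for_inference

bazel-bin/tensorflow/python/tools/freeze_graph \
  --input_graph=graph.pbtxt \
  --input_checkpoint=model.ckpt \
  --output_graph=graph_frozen.pb \
  --output_node_names=output

bazel-bin/tensorflow/python/tools/optimize_for_inference \
  --input=graph_frozen.pb \
  --output=graph_optimized.pb \
  --input_names=input \
  --output_names=output
```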