Error while exporting inference graph. #2861
Some error on export_inference_graph.py (TF 1.4, CUDA 9). The traceback ends with: File "export_inference_graph.py", line 119, in
Getting the same error. Any solutions?
I resolved it by removing "optimize_tensor_layout=True" from exporter.py line 71. It looks like a TensorFlow bug, but for now that worked.
Thanks @mahyarr, removing "optimize_tensor_layout=True" from exporter.py line 71 solved it!
Change the parameter on line 71 to
For me it still throws the error even though I'm using the current standard value for the parameter. Any tips?
On Ubuntu 16.04, with the latest version of everything involved installed, I got the same error: ValueError: Protocol message RewriterConfig has no "layout_optimizer" field. I changed "layout_optimizer" to "optimize_tensor_layout" on line 72 in exporter.py, and then it worked. (I'm guessing that RewriterConfig needs to be updated if the desired field name is "layout_optimizer" rather than "optimize_tensor_layout".)
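To make the version mismatch concrete, here is a self-contained sketch using stand-in classes that mimic the two generations of the proto. These are hypothetical stand-ins, not the real class, which lives in tensorflow.core.protobuf.rewriter_config_pb2; what they reproduce is that protobuf constructors raise ValueError for unknown field names, which is exactly the error reported in this issue:

```python
class RewriterConfigTF14:
    """Stand-in for the TF 1.4 proto: only defines `optimize_tensor_layout`."""
    _fields = {"optimize_tensor_layout"}

    def __init__(self, **kwargs):
        unknown = set(kwargs) - self._fields
        if unknown:
            # Protobuf rejects unknown constructor fields with a ValueError.
            raise ValueError(
                'Protocol message RewriterConfig has no "%s" field.' % unknown.pop())
        self.__dict__.update(kwargs)


class RewriterConfigTF15(RewriterConfigTF14):
    """Stand-in for the TF 1.5 proto: the field was renamed to `layout_optimizer`."""
    _fields = {"layout_optimizer"}


# exporter.py written against one version fails on the other:
RewriterConfigTF14(optimize_tensor_layout=True)   # fine with the 1.4-style field
RewriterConfigTF15(layout_optimizer=True)         # fine with the 1.5-style field
try:
    RewriterConfigTF14(layout_optimizer=True)     # the mismatch everyone here hit
except ValueError as err:
    print(err)  # Protocol message RewriterConfig has no "layout_optimizer" field.
```

So neither field name is wrong in itself; each one only exists in one range of TF versions, which is why the "fix" flips direction depending on which TF you run.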
Hmm, I tried changing line 72 to that but I still get the exact same error. There must be something weird going on on my end...
@frostell that worked, thank you!!!
On Ubuntu 16.04, changing layout_optimizer in exporter.py didn't fix the problem; I still get the error: ValueError: Protocol message RewriterConfig has no "layout_optimizer" field. @frostell How can I update RewriterConfig? Thanks in advance.
thanks @frostell - faced the same problem and it worked well for me! |
Thank you @frostell 🤗 The fix you suggested did it for me too. During my initial setup, I had to run the following commands, referenced from the link below:

cd models/research/
python setup.py build
python setup.py install

So, once I applied the fix you suggested, I re-ran the commands above and the export worked.
Using the stock tensorflow models clone, I'm getting the same error pretty much everybody else here is describing while attempting to follow this tutorial series: https://www.youtube.com/playlist?list=PLQVvvaa0QuDcNK5GeCQnxYnSSaar2tpku In the 6th video, when I run export_inference_graph.py with the stock tensorflow models clone, I get this error:

If I change models/research/object_detection/exporter.py line 71/72 from

to

and I don't change anything else whatsoever and run it again, then it works. I'm running Windows 10 and TensorFlow 1.4, and I cloned tensorflow models and performed the protobuf compile this past weekend, so I'm confident both are current. Somebody at Google please fix this ASAP.
I also changed "layout_optimizer" to "optimize_tensor_layout" on line 72 in exporter.py, and then it worked. However, why has this problem not been fixed? Thanks.
OK, it seems the RewriterConfig has been changed in TensorFlow 1.5, and the small modification detailed above will make it break in the same manner but with the opposite result (I had to change back from the small mod when I updated to 1.5). What will work is:

Tensorflow version 1.4:

Tensorflow version 1.5:
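The version split above could also be handled defensively instead of by hand-editing line 71/72 on every TF upgrade. The following is a hypothetical compatibility shim, not part of exporter.py; `make_rewrite_options` and `FakeRewriterConfig14` are names invented for this sketch, and the `True` argument stands in for whatever value the real field takes (the thread suggests the newer field is set via an enum-style constant, the older one via a bool):

```python
def make_rewrite_options(rewriter_config_cls):
    """Hypothetical helper: build rewrite options across TF versions.

    TF >= 1.5 RewriterConfig defines `layout_optimizer`, while TF 1.4
    defines `optimize_tensor_layout`. Protobuf constructors raise
    ValueError for unknown fields, so try the newer name first and
    fall back to the older one.
    """
    try:
        return rewriter_config_cls(layout_optimizer=True)
    except ValueError:
        return rewriter_config_cls(optimize_tensor_layout=True)


# Minimal stand-in for the TF 1.4 proto so this sketch is self-contained;
# with real TensorFlow you would pass rewriter_config_pb2.RewriterConfig.
class FakeRewriterConfig14:
    def __init__(self, **kwargs):
        if set(kwargs) != {"optimize_tensor_layout"}:
            raise ValueError("Protocol message RewriterConfig has no such field.")
        self.optimize_tensor_layout = kwargs["optimize_tensor_layout"]


options = make_rewrite_options(FakeRewriterConfig14)
print(options.optimize_tensor_layout)  # True
```

The same try/except pattern would let one exporter.py run unmodified on both TF 1.4 and TF 1.5, at the cost of masking genuinely misspelled field names.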
@frostell thanks that fixed it! |
@frostell, thank you that fixed it on Ubuntu 16.04.01 LTS |
Closing as this is resolved |
After pulling the repo (0cc9862) I got an error on the object_detection/export_inference_graph.py call.
Error message:
TF version: latest nightly build