[deeplab + cityscape] Frozen inference graph provided is slower than a self-exported graph. #4525
Hello,
The CityScapes frozen graph model converted to …
Hi There,
@cheneeheng How did you export the model using the existing checkpoints?
Please go to Stack Overflow for help and support:
http://stackoverflow.com/questions/tagged/tensorflow
Also, please understand that many of the models included in this repository are experimental and research-style code. If you open a GitHub issue, here is our policy: it must be a bug or feature request, and the form below must be filled out.
Here's why we have that policy: TensorFlow developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.
System information
You can collect some of this information using our environment capture script:
https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh
You can obtain the TensorFlow version with
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
Describe the problem
The frozen inference graph provided for Cityscapes runs slower in my tests than the graph I exported myself from the provided checkpoint (although both run in less than the 5 s runtime given in model_zoo.md).
I used the official export code (deeplab's export_model.py); the arguments passed to it are shown below:
--checkpoint_path="/path/to/model.ckpt"
--export_path="/path/to/frozen_inference_graph.pb"
--model_variant="xception_65"
--atrous_rates=6
--atrous_rates=12
--atrous_rates=18
--output_stride=16
--decoder_output_stride=4
--num_classes=19
--crop_size=1025
--crop_size=2049
--inference_scales=1.0
The only changes I made were pulling the inference code out of the ipynb, adding time.time() calls for timing, and adding a small utility function to loop through a directory of images; a sketch of that loop is shown below.
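For reference, the measurement loop is essentially the following (a minimal sketch, assuming TF 1.x, the 'ImageTensor:0' / 'SemanticPredictions:0' tensor names used by the official demo notebook, and placeholder paths):

```python
# Minimal timing sketch. Assumes TF 1.x, that the directory contains only
# images, and the tensor names used by the official demo notebook; the
# paths are placeholders.
import os
import time

import numpy as np
import tensorflow as tf
from PIL import Image

INPUT_TENSOR = 'ImageTensor:0'           # input name from the demo notebook
OUTPUT_TENSOR = 'SemanticPredictions:0'  # output name from the demo notebook

# Load the frozen graph.
graph_def = tf.GraphDef()
with tf.gfile.GFile('/path/to/frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    image_dir = '/path/to/images'
    for name in sorted(os.listdir(image_dir)):
        image = np.asarray(Image.open(os.path.join(image_dir, name)))
        start = time.time()
        sess.run(OUTPUT_TENSOR, feed_dict={INPUT_TENSOR: [image]})
        print('%s: %.3f s' % (name, time.time() - start))
```

Note that the very first sess.run call also pays one-off initialization costs, so the first measurement should be discarded when comparing the two graphs.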
A quick check with TensorBoard shows that my exported graph has just 1216 nodes, compared to 1311 nodes in the graph provided.
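The node counts (and the op types involved) can also be checked directly from the .pb files without TensorBoard; a small sketch, again with a placeholder path:

```python
# Count nodes and op types in a frozen graph (path is a placeholder).
from collections import Counter

import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile('/path/to/frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

print('total nodes:', len(graph_def.node))
print(Counter(node.op for node in graph_def.node).most_common(10))
```

Comparing the op-type histograms of the two .pb files should show where the roughly 95 extra nodes in the provided graph come from.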
Question: why is the provided frozen inference graph slower than the one I exported myself, and could the extra nodes in the provided graph explain the difference?
Thank you.
Source code / logs
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem.