Failed to reproduce frozen inference graph as in models zoo #5640
Comments
@wt-huang we are able to reproduce this same issue on multiple TensorFlow versions. Any update on this?
@wt-huang any update on this?
Did someone get around this?
Not that I know of.
`INPUT_TYPE=image_tensor`
@Tantael commenting out `override_base_feature_extractor_hyperparams: true` in the pipeline config throws the following error, and with it left in, the graph is not reproduced.
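For reference, this flag lives in the `feature_extractor` block of a TF Object Detection API pipeline config. A minimal sketch of that block is below; the field values shown are illustrative, not necessarily the ones used in this issue:

```
feature_extractor {
  type: "ssd_inception_v2"
  depth_multiplier: 1.0
  min_depth: 16
  # The flag under discussion in this thread.
  override_base_feature_extractor_hyperparams: true
}
```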
Have you checked out the commit I described?
Yes, @Tantael.
@siddas27 Did you solve the issue? I want to run inference using the C++ sampleUffSSD.cpp sample in TensorRT.
System information
Describe the problem
These are the exact steps I am following: I export the ssd_inception_v2 model with the following command. The command generates the frozen-graph format of the exported model. The issue is that the resulting custom_ssd_inception.tar.gz is very different from the ssd_inception_v2_coco_2018_01_28.tar.gz file from the object detection model zoo when both are visualized using TensorBoard.
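For context, the export step in the TF Object Detection API is typically invoked along these lines (matching the `INPUT_TYPE=image_tensor` mentioned above; the config path, checkpoint prefix, and output directory are placeholders, not the exact values used in this issue):

```shell
# Export a trained SSD checkpoint to a frozen inference graph.
# PIPELINE_CONFIG, CKPT_PREFIX, and OUT_DIR are placeholder paths.
python object_detection/export_inference_graph.py \
    --input_type=image_tensor \
    --pipeline_config_path="${PIPELINE_CONFIG}" \
    --trained_checkpoint_prefix="${CKPT_PREFIX}" \
    --output_directory="${OUT_DIR}"
```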
Source code / logs
Link to the config file: ssd_inception_v2_coco.config
When I then convert the frozen graph to UFF using convert_to_uff.py, the output logs are as follows:
For the original model, given in model zoo: original-output.log
For custom model: custom-output.log
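For reference, the UFF conversion step is usually run roughly as follows. This is a sketch based on the converter shipped with TensorRT's sampleUffSSD; the output-node name and the preprocessing script `config.py` (which maps the unsupported postprocessing ops) are assumptions, not values confirmed in this thread:

```shell
# Convert the frozen graph to UFF.
# -O names the output node; -p points at a graph-surgery script
# that replaces ops UFF cannot represent (placeholder path).
python convert_to_uff.py frozen_inference_graph.pb \
    -o sample_ssd.uff \
    -O NMS \
    -p config.py
```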
Can someone help me identify the exact issue? Where am I going wrong?