Object Detection API TOCO model conversion problem #5298
Comments
@gargn -- can you take a look at the TOCO conversion error detailed above?
Adding @achowdhery, who works on the Object Detection model.
FPN model support is still pending on our end. We have noted this feature request and will keep you updated on adding support over the next 4 weeks.
Hi @achowdhery, has the problem been solved?
Hi @achowdhery, does FPN support TOCO conversion now?
Hi @achowdhery! Does TOCO conversion support FPN now?
Any update on FPN model support, @achowdhery? When I train ssd_mobilenet_v1_fpn and invoke the tflite model on mobile or in Python, I receive a Fatal signal 6 (SIGABRT). The same happens when using the frozen_inference_graph supplied with the download from the model zoo. Everything works when I use ssd_mobilenet_v1_coco instead of ssd_mobilenet_v1_fpn.
@maxcrous If you are able to visualize the TF Lite model after exporting, that would be extremely helpful in understanding and debugging the problem in the open-source version. Please share the visualization (the TF Lite file can be visualized in the Netron app), or you can use this tool: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/tools/visualize.py
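For reference, a minimal sketch of running the visualize.py tool mentioned above; the model and output filenames are placeholders, and the script requires the `flatbuffers` Python package from the TensorFlow source tree:

```shell
# visualize.py reads a .tflite flatbuffer and writes an HTML page
# listing every op and tensor in the model (filenames are placeholders).
python tensorflow/lite/tools/visualize.py \
    detect.tflite \
    detect_visualized.html
```

Opening the resulting HTML in a browser shows the full op graph, which is what helps localize which op triggers the SIGABRT.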
@achowdhery, thank you for the quick reply. Link to Netron image for ssd_mobilenet_v1_coco: Link to Netron image for ssd_mobilenet_v1_fpn:
This is my system information: OS Platform and Distribution: macOS Mojave 10.14.1
In the export script, can you please try turning off the addition of the postprocessing op in the FPN model, to see whether the SIGABRT is in the main graph or in the postprocessing op?
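A sketch of that export step, assuming standard Object Detection API paths (all paths are placeholders); `--add_postprocessing_op=false` drops the custom TFLite_Detection_PostProcess op so the crash can be localized to the main graph:

```shell
# Export a TFLite-compatible frozen graph without the custom
# postprocessing op (paths are placeholders).
python object_detection/export_tflite_ssd_graph.py \
    --pipeline_config_path=path/to/pipeline.config \
    --trained_checkpoint_prefix=path/to/model.ckpt \
    --output_directory=path/to/exported \
    --add_postprocessing_op=false
```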
When setting that option, the model is then successfully converted to a tflite model. However, the resulting tflite model still produces a Fatal signal 6 (SIGABRT). The Netron image for the tflite model can be found here:
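The exact conversion command was not preserved in this thread; a typical `tflite_convert` invocation for an Object Detection API export, following the running_on_mobile_tensorflowlite.md guide, is sketched below. The 640x640 input shape is an assumption for ssd_mobilenet_v1_fpn (adjust to the resolution in your pipeline.config), and the four postprocess output arrays apply only when the postprocessing op was left in the export; with `--add_postprocessing_op=false` the outputs would instead be the raw output tensors.

```shell
# Hedged sketch of a float conversion (paths and input shape are
# assumptions). --allow_custom_ops is needed because
# TFLite_Detection_PostProcess is a custom op.
tflite_convert \
  --graph_def_file=path/to/exported/tflite_graph.pb \
  --output_file=path/to/detect.tflite \
  --input_shapes=1,640,640,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=FLOAT \
  --allow_custom_ops
```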
When using TensorFlow 1.11.0 (same result as TensorFlow 1.12.0):
As stated in issue #4826, when using TensorFlow 1.10.0:
Yes, it was probably not possible to convert this until v1.12. I am still trying to understand why there is a SIGABRT if it converts. Please attach or email the frozen graph and the tflite file. The converted tflite file should run.
Here is the model straight out of the model zoo. Thanks again for the help.
Please also add the tflite_convert command you used. I will try to repro the bug. And please add the stack trace for SIGABRT. |
Do note that the postprocessing operations have been disregarded. The stack trace for the SIGABRT can be found here:
The bug seems to be with the Mul op. We will look into this in the next few days. We sincerely appreciate your reporting it.
Did you solve the problem, @maxcrous?
Hey @hxtkyne, I don't have any knowledge of the tflite conversion process, so we will have to wait for the good people at TensorFlow to fix this one.
I used the TF Object Detection API to train ssd_resnet_50_fpn_coco with a 50-class dataset.
But the tflite model detects the wrong class and bbox. All the output classes are the same (1 class).
Ran into this yesterday. Anybody know the progress on this? |
@oopsodd I also used the weight_shared_convolutional_box_predictor; the tflite model detects the wrong class and bbox, and all the output classes are the same (1 class) too. I wonder whether the conversion tool supports weight_shared_convolutional_box_predictor well?
@achowdhery any update on the FPN model support? |
Still waiting for good news on FPN model support; has anyone gotten an update?
Please make it a priority to add FPN support! Everybody needs this. |
@oopsodd I also used the TF Object Detection API to train an ssd_mobilenet_v2 model, and used export_tflite_ssd_graph.py to convert the checkpoint to a .pb file. The .pb file works well, but when I use bazel run --config=opt tensorflow/lite to convert the .pb to .tflite, there are errors. Do I need to compile the TensorFlow source with bazel before I can use this command to convert the .pb file to tflite? And how do you compile TensorFlow with bazel? Thank you.
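A sketch of running the converter from a TensorFlow source checkout, assuming the bazel target path for TF ~1.12 and placeholder file paths (the 300x300 input shape is an assumption for ssd_mobilenet_v2); `bazel run` compiles the tool on first use, so no separate build step is needed:

```shell
# Build and run TOCO directly from the source tree (paths and input
# shape are placeholders; adjust output_arrays if the postprocessing
# op was not added during export).
bazel run --config=opt //tensorflow/lite/toco:toco -- \
  --input_file=$PWD/tflite_graph.pb \
  --output_file=$PWD/detect.tflite \
  --input_shapes=1,300,300,3 \
  --input_arrays=normalized_input_image_tensor \
  --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
  --inference_type=FLOAT \
  --allow_custom_ops
```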
I have exactly the same problem with SSD FPN on mobile. Same setup, same command line. Using Android Studio, it keeps crashing with the same SIGABRT error, while ssd_mobilenet_v2 and ssd_inception_v2 both work fine in both FLOAT and QUANTIZED_UINT8 modes.
Any update here? |
How do we solve this problem? I am still waiting...
Any update? My FPN model is still not compilable by the Edge TPU compiler.
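For context, the Coral Edge TPU compiler only accepts fully integer-quantized .tflite models; a minimal sketch of the compile step (the filename is a placeholder):

```shell
# Compile a fully quantized model for the Edge TPU; ops the compiler
# cannot map stay on the CPU (filename is a placeholder).
edgetpu_compiler detect_quant.tflite
```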
TOCO is deprecated |
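Since TOCO is deprecated, conversion now goes through the newer converter, which `tflite_convert` wraps in TF 2.x. A minimal sketch assuming a SavedModel export (paths are placeholders):

```shell
# TF 2.x path: convert a SavedModel with the current converter instead
# of TOCO (paths are placeholders).
tflite_convert \
  --saved_model_dir=path/to/saved_model \
  --output_file=model.tflite
```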
System information
Describe the problem
I am trying to create a fully quantized tflite model for inference.
While training this model from scratch on a custom dataset there was a problem related to #5139, but I used a workaround (increasing the eval delay and restarting the process a few times), so it only slowed down training.
Finally the model was trained, and it works fine with the .pb file created by export_inference_graph.py.
To create the tflite file I followed these instructions: https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/running_on_mobile_tensorflowlite.md
I exported tflite_graph.pb without a problem, but converting it to tflite with TOCO crashes. It results in:
tensorflow/contrib/lite/toco/graph_transformations/propagate_fixed_sizes.cc:116] Check failed: dim_x == dim_y (256 vs. 24)Dimensions must match
Source code / logs
toco log:
export_tflite_ssd_graph.py log:
config: