
[Object Detection TFlite] Converting ssdlite_mobilenet_v2_coco #5122

Closed
normandra opened this issue Aug 17, 2018 · 13 comments
Labels: stat:awaiting model gardener (Waiting on input from TensorFlow model gardener)

Comments

normandra commented Aug 17, 2018

System information

  • Top-level directory of the model: ssdlite_mobilenet_v2_coco_2018_05_09
  • Have I written custom code: no
  • OS Platform and Distribution: Ubuntu 18.04 host, using dockerfiles from the repo
  • TensorFlow installed from: docker
  • TensorFlow version: 1.10
  • Bazel version (if compiling from source):
  • Exact commands to reproduce:
python object_detection/export_tflite_ssd_graph.py \
--pipeline_config_path=$CONFIG_FILE \
--trained_checkpoint_prefix=$CHECKPOINT_PATH \
--output_directory=$OUTPUT_DIR \
--add_postprocessing_op=true

bazel run --config=opt tensorflow/contrib/lite/toco:toco -- \
--input_file=$OUTPUT_DIR/tflite_graph.pb \
--output_file=$OUTPUT_DIR/detect.tflite \
--input_shapes=1,300,300,3 \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3'  \
--inference_type=FLOAT \
--allow_custom_ops

Describe the problem

After retraining the model (http://download.tensorflow.org/models/object_detection/ssdlite_mobilenet_v2_coco_2018_05_09.tar.gz), I went through the steps to produce a TFLite version to run on Android. As I understand it, the model is a float model, and I converted it as such. While the conversion worked, after testing on Android the results are just wrong. Not only are they random, the class nodes also give negative decimal numbers. Is this model currently not supported by TFLite, or have I possibly made a mistake somewhere?

The Android code is taken from the TFLite demo, set to float (unquantized).
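One detail worth double-checking with the float demo path: float SSD-style models expect pixel values normalized to roughly [-1, 1] before inference, and feeding raw 0-255 bytes produces essentially random detections, which matches the symptom described. A minimal sketch of the mapping (the 127.5 mean/std constants are an assumption; check the demo's IMAGE_MEAN/IMAGE_STD values):

```python
def normalize_pixel(value, mean=127.5, std=127.5):
    """Map a 0-255 byte value into [-1, 1] for a float input tensor."""
    return (value - mean) / std

print(normalize_pixel(0))    # -1.0
print(normalize_pixel(255))  # 1.0
```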

Upon request I can upload the model (the whole thing or just a specific part).

EDIT: tf version is 1.10 not 1.1

@tensorflowbutler tensorflowbutler added the stat:awaiting response Waiting on input from the contributor label Aug 17, 2018
tensorflowbutler (Member) commented:

Thank you for your post. We noticed you have not filled out the following fields in the issue template. Could you update them if they are relevant in your case, or leave them as N/A? Thanks.

  • What is the top-level directory of the model you are using
  • Have I written custom code
  • OS Platform and Distribution
  • TensorFlow installed from
  • TensorFlow version
  • CUDA/cuDNN version
  • GPU model and memory
  • Exact command to reproduce


normandra commented Aug 20, 2018

Sorry, I screwed up the formatting.

System information

  • What is the top-level directory of the model you are using:
    ssdlite_mobilenet_v2_coco_2018_05_09
  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
    no
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
    Linux Ubuntu 18.04 as host, using dockerfiles from object_detection
  • TensorFlow installed from (source or binary):
    from dockerfiles, also tested by building from source
  • TensorFlow version (use command below):
    1.10 (both in Docker and when built from source)
  • Bazel version (if compiling from source):
    0.16
  • CUDA/cuDNN version:
    none
  • GPU model and memory:
    none locally; trained on a server with 4 Titan Xp GPUs (12 GB memory each)
  • Exact command to reproduce:

python object_detection/export_tflite_ssd_graph.py \
--pipeline_config_path=$CONFIG_FILE \
--trained_checkpoint_prefix=$CHECKPOINT_PATH \
--output_directory=$OUTPUT_DIR \
--add_postprocessing_op=true

bazel run --config=opt tensorflow/contrib/lite/toco:toco -- \
--input_file=$OUTPUT_DIR/tflite_graph.pb \
--output_file=$OUTPUT_DIR/detect.tflite \
--input_shapes=1,300,300,3 \
--input_arrays=normalized_input_image_tensor \
--output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3'  \
--inference_type=FLOAT \
--allow_custom_ops

normandra commented:

As a side note, the model ran fine on PC and on TensorFlow Mobile.

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Waiting on input from the contributor label Aug 20, 2018
derekjchow (Contributor) commented:

Mobilenet V2 should be supported by TFLite.

Looking at the pipeline.config file for this model, I note that apply_sigmoid_to_scores: false is present. This suggests that class scores are represented by logits in the model (not probabilities).
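If the exported graph really emits logits, converting a score to a probability is just a sigmoid. A quick plain-Python sketch (no TF dependency) of what that mapping looks like, and why negative values are not inherently wrong:

```python
import math

def sigmoid(x):
    """Map a logit (any real number) to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Negative logits are expected and simply mean low confidence:
print(round(sigmoid(-2.0), 3))  # 0.119 -> low-confidence detection
print(round(sigmoid(2.0), 3))   # 0.881 -> high-confidence detection
```

Note this only explains negative *scores*; a negative *class index* is a separate issue.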

normandra commented:

Oh, I didn't know ssdlite_mobilenet_v2 uses logits. But does this also explain why the class nodes output negative decimal numbers? I understand that I can convert the logit score outputs into probabilities, but what does this mean for the class nodes?

derekjchow (Contributor) commented:

Try applying logits to both.

normandra commented:

I'm sorry @derekjchow, but I'm kind of lost here. As I understand it, logits are just a different representation of probabilities, ranging over all of R from negative infinity to infinity, which I can then convert to a value between 0 and 1. What do you mean by applying logits to both? How am I supposed to apply a logit to the class nodes?

normandra commented:

Can anyone confirm whether this is an issue with the API or with my implementation?

normandra commented:

I was able to get this to work today by using the config file at research/object_detection/samples/configs/ssdlite_mobilenet_v2_coco.config. After comparing the two config files, the only difference I can find is that some numbers are ever so slightly different; for example, dropout_keep_probability was 0.8 in the samples and 0.800000011921 in the file I got from the model zoo. I'm not sure whether this has been mentioned somewhere before.

@normandra normandra reopened this Aug 27, 2018
normandra commented:

Seems like the model still outputs a negative index sometimes, but for the most part a simple if statement to check for it does the job for now.
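The workaround described above might look something like this (a hypothetical post-processing filter; the function name and threshold are illustrative, not taken from the demo code):

```python
# Hypothetical filter over the TFLite_Detection_PostProcess outputs:
# skip any detection whose class index came back negative.
def valid_detections(classes, scores, threshold=0.5):
    return [(int(c), s)
            for c, s in zip(classes, scores)
            if c >= 0 and s >= threshold]

print(valid_detections([0, -1, 3], [0.9, 0.8, 0.6]))
# [(0, 0.9), (3, 0.6)] -- the detection with class index -1 is dropped
```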

@yhliang2018 yhliang2018 added the stat:awaiting model gardener Waiting on input from TensorFlow model gardener label Aug 31, 2018
JDanielWu commented:
@normandra I get negative location bboxes, and the scores are very different from the PC model. I'm switching to ssd_mobilenet_v1 for testing, and hope it will give a fine result.

normandra commented:

@WuDanFly yeah, I ended up using that as well. What derekjchow mentioned before probably refers to the model-export part, which I only understand now because of my earlier lack of understanding of ML / TensorFlow in general. I'll just assume that doing so will fix the problem.

JDanielWu commented:
@normandra Finally I found the cause of the problem. I used the C++ API with TFLite and Python with the pb model. I suspected the input data might be inconsistent between C++ and Python: the C++ code uses OpenCV to read the image, while Python uses PIL. So I dumped the input data to a text file for both and compared them, and found the difference. I had made a mistake: OpenCV reads an image as BGR, while PIL uses RGB, so once I changed the channel order I got correct results. Hope this helps others.
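The channel-order mismatch described here is easy to reproduce without OpenCV or PIL: per pixel, BGR to RGB is just a reversal of the three channels (with OpenCV itself you would normally use cv2.cvtColor(img, cv2.COLOR_BGR2RGB)). A minimal sketch using nested lists in place of an image array:

```python
def bgr_to_rgb(image):
    """image: H x W x 3 nested lists; returns a copy with channels reversed."""
    return [[pixel[::-1] for pixel in row] for row in image]

bgr_pixel = [[[255, 0, 0]]]   # pure blue in OpenCV's BGR order
print(bgr_to_rgb(bgr_pixel))  # [[[0, 0, 255]]] -- pure blue in RGB order
```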


5 participants