
TFLiteConverter unable to convert object detection model from export_tflite_ssd_graph.py #24910

Open
dkashkin opened this issue Jan 14, 2019 · 12 comments

@dkashkin commented Jan 14, 2019

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (or github SHA if from source): 1.13.0-dev20190114

Provide the text output from tflite_convert

```
Some of the operators in the model are not supported by the standard TensorFlow Lite runtime. If those are native TensorFlow operators, you might be able to use the extended runtime by passing --enable_select_tf_ops, or by setting target_ops=TFLITE_BUILTINS,SELECT_TF_OPS when calling tf.lite.TFLiteConverter(). Otherwise, if you have a custom implementation for them you can disable this error with --allow_custom_ops, or by setting allow_custom_ops=True when calling tf.lite.TFLiteConverter(). Here is a list of builtin operators you are using: ADD, CONCATENATION, CONV_2D, DEPTHWISE_CONV_2D, LOGISTIC, MUL, RESHAPE. Here is a list of operators for which you will need custom implementations: TFLite_Detection_PostProcess.
Traceback (most recent call last):
  File "/usr/local/bin/toco_from_protos", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/lite/toco/python/toco_from_protos.py", line 59, in main
    app.run(main=execute, argv=[sys.argv[0]] + unparsed)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/lite/toco/python/toco_from_protos.py", line 33, in execute
    output_str = tensorflow_wrap_toco.TocoConvert(model_str, toco_str, input_str)
```

Also, please include a link to a GraphDef or the model if possible.
ssd_mobilenet_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03

Any other info / logs
I trained the above-mentioned model on my images, and then exported the graph with the --add_postprocessing_op option (which I assume adds the TFLite_Detection_PostProcess op):

```
python /tf/models/research/object_detection/export_tflite_ssd_graph.py \
  --pipeline_config_path /tf/notebooks/models/ssd_mobilenet_v1_fpn_shared_box_predictor_640x640_coco14_sync_2018_07_03/pipeline.config \
  --trained_checkpoint_prefix /tf/notebooks/model.ckpt \
  --output_directory /tf/notebooks/tflite \
  --add_postprocessing_op=true
```

Then I used the following Python code in order to convert the model to TFLite format:

```python
import tensorflow as tf

graph_def_file = "/tf/notebooks/scripts/both_training/tflite/tflite_graph.pb"
input_arrays = ["normalized_input_image_tensor"]
output_arrays = ['TFLite_Detection_PostProcess', 'TFLite_Detection_PostProcess:1',
                 'TFLite_Detection_PostProcess:2', 'TFLite_Detection_PostProcess:3']
input_shapes = {"normalized_input_image_tensor": [1, 640, 640, 3]}

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays, input_shapes)
converter.target_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()
open("/tf/notebooks/tflite/model.tflite", "wb").write(tflite_model)
```

I tried this with and without target_ops=SELECT_TF_OPS, and the convert() call always crashes because of the TFLite_Detection_PostProcess operation. The TFLite file is never created. Please help!
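
For reference, the error message itself also points at allow_custom_ops. A minimal sketch of that variant, reusing the variables from the snippet above (untested on my side; as I understand it, the flag only lets conversion proceed and relies on the runtime providing the op):

```python
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays, input_shapes)
# TFLite_Detection_PostProcess is a TFLite custom op; this flag tells
# the converter to emit it instead of failing the conversion.
converter.allow_custom_ops = True
tflite_model = converter.convert()
```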

@ymodak ymodak added the comp:lite label Jan 15, 2019

@ymodak ymodak assigned liyunlu0618 and achowdhery and unassigned liyunlu0618 Jan 15, 2019

@achowdhery commented Jan 15, 2019

The FPN model is not supported yet.

@dkashkin (Author) commented Jan 15, 2019

Ouch... I was using the official TensorFlow Detection Model Zoo and assumed that all the "official" models could be converted to TFLite. If this is not the case, I would highly recommend two things:

  1. Clearly indicate which models are compatible with TFLite on the list of models https://github.com/tensorflow/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md (P.S. I really hope that all models that don't use FPN are supported. If there are other exceptions, please help me identify them!)

  2. TFLiteConverter should print a specific message when it sees an unsupported model. The current message points to TFLite_Detection_PostProcess, which is potentially confusing. If it said something like "FPN feature extractors are not supported by TFLite yet", that would be much clearer.

@dkashkin (Author) commented Jan 17, 2019

Can you please confirm whether TFLiteConverter is compatible with any object detection models besides the basic ssd_mobilenet? I am not satisfied with SSD models because their prediction accuracy is not nearly as good as FPN or RCNN models... Also, I noticed that the list of "hosted models" for TFLite does not include ANY object detection models (https://www.tensorflow.org/lite/models), even though there is an official tutorial that successfully converts an old SSD MobileNet model to TFLite: https://medium.com/tensorflow/training-and-serving-a-realtime-mobile-object-detector-in-30-minutes-with-cloud-tpus-b78971cf1193

@mrtehseen commented Apr 27, 2019

There is no support for complex custom models yet. I am also stuck on this: some operators, such as AsString and a few more, are not supported by TensorFlow Lite yet.

Take a look at this if you can make sense out of it somehow.
https://www.tensorflow.org/lite/guide/ops_select
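
Per that guide, the Select TF ops path looks roughly like the following (a sketch reusing the variables from the snippet above; note it only covers ops that exist in stock TensorFlow, which TFLite_Detection_PostProcess does not):

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays, input_shapes)
# Fall back to TensorFlow kernels for ops without TFLite builtins.
# (TF 1.14+ spelling; the 1.13 nightlies used converter.target_ops.)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
```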

@jdduke (Member) commented Jul 15, 2019

We're actively working on support for FPN/RCNN models, and will have more to report in the near future. Apologies for any confusion, but for now only the SSD models are likely to convert cleanly. https://www.tensorflow.org/lite/models/object_detection/overview is the current sample for this.

@amitmate commented Jul 23, 2019

Thanks Jared. I have a trained SSD Keras model (h5). I am trying to convert it to tflite using the TFLite converter. However, it complains:

```python
converter = tf.lite.TFLiteConverter.from_keras_model_file(keras_model_file)
```

```
ValueError: Unknown layer: Normalize
```

Is there a way to add/register custom layers in Python with the TFLite converter?

@gargn (Member) commented Jul 23, 2019

Starting in 1.14, TFLiteConverter.from_keras_model_file takes the argument custom_objects, which is passed directly into the Keras loading function. The logic should look something like the following:

```python
converter = tf.lite.TFLiteConverter.from_keras_model_file(
    keras_model_file,
    custom_objects={'Normalize': Normalize})
```

I'm not sure where in the Keras library the Normalize layer is defined. There are some normalization layers under keras.layers.normalization; however, there is no Normalize there in 1.14.
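
If Normalize is a layer defined in your own code, any class that is importable at conversion time should work; an illustrative stand-in (hypothetical, not the actual layer from this thread):

```python
import tensorflow as tf

class Normalize(tf.keras.layers.Layer):
    """Hypothetical example layer; substitute your real implementation."""
    def call(self, inputs):
        # L2-normalize along the channel axis.
        return tf.math.l2_normalize(inputs, axis=-1)
```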

If you need to use 1.13, then you can load the Keras model yourself and use TFLiteConverter.from_session with logic similar to the following:

```python
tf.keras.backend.set_learning_phase(False)
keras_model = tf.keras.models.load_model(model_file, custom_objects)
sess = tf.keras.backend.get_session()
converter = tf.lite.TFLiteConverter.from_session(
    sess, keras_model.inputs, keras_model.outputs)
tflite_model = converter.convert()
```

@amitmate commented Jul 24, 2019

Thanks Nupur. I am now able to convert my custom Keras SSD (with a custom Normalize layer) to tflite format. How can I incorporate the post-processing layer (TFLite_Detection_PostProcess) in the tflite model? Can I add that custom post-processing layer in Python/Keras without using the bazel tool?

@gargn (Member) commented Jul 24, 2019

In order to use the TFLite_Detection_PostProcess op, you need to:

  1. Freeze the graph.
  2. Use export_tflite_ssd_graph.py to add TFLite_Detection_PostProcess to the frozen graph. I don't know the specifics of this script. However, there are a handful of other bugs you might be able to reference if you run into issues with this step.

In order to freeze the graph there are two approaches you can try:

  • Save the model as a SavedModel. In 1.14 there is tf.keras.models.save_model. After getting the SavedModel, freeze the graph using freeze_graph.py, which takes in a SavedModel. I am unfortunately not familiar enough with the internals of the SavedModel to know if this will have issues due to the Normalize layer.
  • Alternatively, try using convert_variables_to_constants, which takes in a Session; try passing in the Keras session directly (see the sketch after this list).
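
A rough sketch of the second approach, assuming a TF 1.x session and reusing model_file/custom_objects from the earlier snippet (the output path is illustrative):

```python
import tensorflow as tf

tf.keras.backend.set_learning_phase(False)
keras_model = tf.keras.models.load_model(model_file, custom_objects)
sess = tf.keras.backend.get_session()

# Replace variables with constants so the graph can be written out frozen.
output_node_names = [out.op.name for out in keras_model.outputs]
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph_def, output_node_names)

# export_tflite_ssd_graph.py-style tooling can then append
# TFLite_Detection_PostProcess to this frozen GraphDef.
tf.io.write_graph(frozen_graph_def, "/tmp", "frozen.pb", as_text=False)
```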

@val9299 commented Jul 25, 2019

Has anyone successfully converted an SSD model from the model zoo and obtained a tflite model with 4 outputs (detection_boxes, detection_classes, detection_scores, num_boxes)? I somehow only get a model with two outputs (raw_outputs/box_encodings, raw_outputs/class_predictions) when using export_tflite_ssd_graph.py (I also set add_postprocessing_op to true). I opened a new issue on that: #31015
Thanks for helping!
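
In case it helps others hitting the same thing, one way to check whether the post-process node actually made it into the exported graph is to inspect the GraphDef directly (a sketch; the path is illustrative):

```python
import tensorflow as tf

graph_def = tf.GraphDef()
with open("/path/to/tflite_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# If add_postprocessing_op took effect, the node should show up here.
print([n.name for n in graph_def.node
       if "TFLite_Detection_PostProcess" in n.name])
```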

@amitmate commented Jul 25, 2019

Thanks again Nupur.

Is there a Python module for TFLite_Detection_PostProcess? I can perhaps just add that as a custom Keras output layer to my model. Would that work?

@gargn (Member) commented Jul 25, 2019

Unfortunately there is no Python module for the op. To elaborate a bit, TFLite_Detection_PostProcess is a custom op that is only available in TensorFlow Lite, so once you run the export_tflite_ssd_graph.py script to add the op, the model can't be loaded into the TensorFlow runtime.

We are working on adding support for control flow which should hopefully make models like SSD easier to convert. However, that is a longer term project.
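
For completeness, a model converted with the op in place can still be run through the TFLite Python interpreter (a minimal sketch, assuming your interpreter build registers the detection post-process kernel, as the stock builds do; the model path is illustrative):

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy image matching the model's declared input signature.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# TFLite_Detection_PostProcess emits boxes, classes, scores, and a count.
boxes = interpreter.get_tensor(output_details[0]["index"])
```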
