DeepLab with TensorFlow Mobile or TensorFlow Lite #4278
We just want to run this model on Android. We have tried two approaches: TensorFlow Mobile and TensorFlow Lite.
With TensorFlow Mobile, we download the pre-trained MobileNetV2 model:
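For reference, something like the following fetches one of the MobileNetV2 DeepLab checkpoints from the model zoo (a sketch only; the exact archive name is an assumption and may have changed):

```python
import tarfile
import urllib.request

# One of the MobileNetV2 DeepLab checkpoints listed in the model zoo
# (research/deeplab/g3doc/model_zoo.md); the exact archive name may differ.
URL = ('http://download.tensorflow.org/models/'
       'deeplabv3_mnv2_pascal_train_aug_2018_01_29.tar.gz')

archive, _ = urllib.request.urlretrieve(URL, 'deeplabv3_mnv2.tar.gz')
with tarfile.open(archive) as tar:
    # The archive contains frozen_inference_graph.pb plus the checkpoint files.
    tar.extractall('pretrained')
```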
We can successfully load the model, but when we run inference, we get the following error:
I think this is caused by the output node "SemanticPredictions" calling the Slice operation with INT64 data, which is not supported by TensorFlow Mobile yet.
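One way to confirm where INT64 enters the graph is to list the ops that produce int64 tensors (a sketch, assuming TF 1.x and the default frozen_inference_graph.pb name):

```python
import tensorflow as tf

# Load the frozen DeepLab graph (TF 1.x API; adjust the path to your export).
graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# Print every op that produces an int64 tensor, to see where INT64 reaches Slice.
for op in graph.get_operations():
    if any(t.dtype == tf.int64 for t in op.outputs):
        print(op.type, op.name)
```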
With TensorFlow Lite, we use the following command to convert it to the tflite format:
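Roughly, the conversion is equivalent to the following Python API call (a sketch; the input/output node names come from the default DeepLab export, the 1x513x513x3 input shape is assumed, and depending on the TF 1.x version the converter lives under tf.lite or tf.contrib.lite):

```python
import tensorflow as tf

# Sketch of the conversion via the Python API; node names are from the default
# DeepLab export, and the 1x513x513x3 input shape is assumed.
converter = tf.lite.TFLiteConverter.from_frozen_graph(
    'frozen_inference_graph.pb',
    input_arrays=['ImageTensor'],
    output_arrays=['SemanticPredictions'],
    input_shapes={'ImageTensor': [1, 513, 513, 3]})
tflite_model = converter.convert()

with open('deeplab_mnv2.tflite', 'wb') as f:
    f.write(tflite_model)
```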
We get the following warnings:
The model could not be loaded successfully. I think it is caused by this warning:
Is it possible to update the node SemanticPredictions to use the INT32 data type for the Slice operation? Or do you have any suggestions on how to run it with TensorFlow Lite?
Thanks for bringing up this issue.
We will look into this issue more carefully in the near future.
I modified the line in export_model.py (around line 131), adding a tf.cast to cast the predictions to int32 (a sketch of the change follows the export command below). I then exported the model with the following command:
Here I only set the checkpoint path and the export path; the rest of the parameters were kept at their default values.
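For reference, the change in export_model.py is just a cast before the slice. Here is a minimal self-contained illustration of the pattern (TF 1.x; the actual line in export_model.py operates on the model's predictions tensor rather than a constant):

```python
import tensorflow as tf

# Minimal illustration of the cast added around line 131 of export_model.py:
# the predictions tensor is int64, and casting it to int32 before tf.slice keeps
# the unsupported INT64 Slice op out of the exported graph. A constant stands in
# for the real predictions tensor here.
predictions = tf.constant([[[1, 2], [3, 4]]], dtype=tf.int64)
predictions = tf.cast(predictions, tf.int32)          # the added cast
semantic_predictions = tf.slice(predictions, [0, 0, 0], [1, 2, 2])

with tf.Session() as sess:
    print(sess.run(semantic_predictions).dtype)       # int32
```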
Now running the model with TensorFlow Mobile is successful!
The new issue is that the output array (SemanticPredictions) is all zeros; every element is 0.
Do you have any suggestions? Is it caused by some unsupported op in the model? Or should I add more parameters when exporting the model?
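A desktop sanity check like the following can help rule out the Android side (a sketch; the node names follow the default DeepLab export and the input is a dummy 513x513 image):

```python
import numpy as np
import tensorflow as tf

# Desktop sanity check of the exported graph (TF 1.x), independent of the Android
# code. The file and node names follow the default DeepLab export.
graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

# ImageTensor expects a uint8 batch of shape [1, height, width, 3].
image = np.random.randint(0, 255, size=(1, 513, 513, 3), dtype=np.uint8)
with tf.Session(graph=graph) as sess:
    seg = sess.run('SemanticPredictions:0', feed_dict={'ImageTensor:0': image})

# If this already prints only zeros, the problem is in the export, not on Android.
print(seg.shape, np.unique(seg))
```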
Now it works on my mobile devices. I added the model_variant parameter when exporting the model and also fixed a dimension issue where the width and height were passed in the wrong order during inference (illustrated in the sketch below).
Thanks for your help. I will put it together as a complete demo.
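For anyone hitting the same dimension issue: SemanticPredictions is laid out as [batch, height, width], so a flat output buffer has to be reshaped height-first (a small illustration with stand-in values):

```python
import numpy as np

# SemanticPredictions is laid out as [batch, height, width], so a flat output
# buffer has to be reshaped height-first; swapping width and height (as I had
# done) scrambles the mask. Stand-in values below, just to show the indexing.
height, width = 513, 289                      # e.g. a portrait image, height != width
flat_output = np.zeros(height * width, dtype=np.int32)   # stand-in for the model output

mask = flat_output.reshape(height, width)     # correct: rows = height, cols = width
# mask = flat_output.reshape(width, height)   # wrong order -> garbled segmentation
label_at_xy = mask[100, 50]                   # index as [y, x]
print(mask.shape, label_at_xy)
```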
How did you solve the issue? I have tried to convert the pb file to tflite using the TF converter.