
Segmentation with Deeplabv3 Cityscape Model #2

Open
titanbender opened this issue Mar 21, 2020 · 1 comment
titanbender commented Mar 21, 2020

Dear Khanh,

Thanks for sharing your sample code! The Android and Colab code are very useful!

My question is about configuring other segmentation models.

More specifically, I'd like to test the DeepLab v3 model on Cityscapes data and have identified the following model in the TF Model Zoo: mobilenetv3_small_cityscapes_trainfine.

To convert the model to TF Lite, I used the following command:

tflite_convert \
  --output_file=./deeplabv3_city.tflite \
  --graph_def_file=./frozen_inference_graph.pb \
  --input_arrays=ImageTensor \
  --output_arrays=ExpandDims_1 \
  --input_shapes=1,257,257,3 \
  --inference_input_type=QUANTIZED_UINT8 \
  --inference_type=FLOAT \
  --mean_values=128 \
  --std_dev_values=127
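
For context, the testing path boils down to something like this minimal sketch (simplified from the Colab; the dummy input stands in for real preprocessing). Note that the expected shape and dtype can be read from the model itself:

# Minimal smoke test (assumes TF 2.x; model path taken from the command above).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='./deeplabv3_city.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Read the shape and dtype the model actually expects instead of guessing.
print('expected input:', input_details['shape'], input_details['dtype'])

dummy = np.zeros(input_details['shape'], dtype=input_details['dtype'])
interpreter.set_tensor(input_details['index'], dummy)
interpreter.invoke()

segmentation_map = interpreter.get_tensor(output_details['index'])
print('output shape:', segmentation_map.shape)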

My question is two-fold:

  1. When testing the converted model with your Colab, I get a ‘set tensor’ error with the message Got tensor of type 0 but expected type 3 for input 3, regardless of the data type I feed in (uint8, float32, int32, etc.). Do you have an intuition as to where the error could lie?


  2. The goal is to run the model on Android with your sample segmentation code. Do you have any advice on how the sample code should be modified to run a different model like this one?

Thank you for your time.

Sincerely,
Johan

khanhlvg (Owner) commented

I looked at the model you want to use, but I saw some non-trivial errors when running it. DeepLab v3 requires TF 1, while TF Lite has stopped development on TF 1 entirely and moved to TF 2, so a few workarounds are needed to convert the model properly.

  1. Use TF 1.15 to export the pretrained model in SavedModel format.
    Change this export script to export to SavedModel format instead of frozen graph format. You can see an example here. Make sure to use a fixed-size input instead of a dynamic (i.e. shape = None) input. (A rough sketch follows after this list.)
  2. Use the TF 2 TFLiteConverter to convert the SavedModel to TF Lite.
  3. Use the TF 2 TF Lite Interpreter Python API to test that it works.
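
As a rough sketch of these steps, under some assumptions: one shortcut for step 1, instead of modifying the DeepLab export script, is to load the already-exported frozen graph in TF 1.15 and re-save it as a SavedModel. The tensor names below are assumptions (ImageTensor comes from your convert command; SemanticPredictions is DeepLab's usual output name), so adjust them to match your graph:

# Runs under TF 1.15 only: wrap the existing frozen graph as a SavedModel.
# Tensor names are assumptions; adjust to match your export. The fixed-size
# input caveat above still applies: if ImageTensor has dynamic dimensions,
# re-export the graph with a fixed shape first.
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile('./frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Session(graph=tf.Graph()) as sess:
    tf.import_graph_def(graph_def, name='')
    g = sess.graph
    tf.saved_model.simple_save(
        sess,
        './deeplab_saved_model',
        inputs={'ImageTensor': g.get_tensor_by_name('ImageTensor:0')},
        outputs={'SemanticPredictions': g.get_tensor_by_name('SemanticPredictions:0')})

Step 2 then looks like this in TF 2:

# Runs under TF 2.x: convert the SavedModel produced above to TF Lite.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model('./deeplab_saved_model')
with open('deeplabv3_city.tflite', 'wb') as f:
    f.write(converter.convert())

For step 3, the Interpreter smoke test follows the same pattern as the snippet in the question above.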

Please try these steps, and if you see errors, share the reproduction steps as a Colab notebook and I'll see if I can help you troubleshoot.
