TF2.3 Converting SSD Mobilenet v2 to tflite (tflite file size about 0.5kbytes) #9394
Comments
Thanks for the response. I tried to get a TFLite model the way described in this thread (conversion only, without quantization) and got the same unusable ~500-byte .tflite file.
@mpa74 Are you using the latest TF nightly for TFLite conversion? These features were added recently, so I don't think they have made it into the stable release yet.
Tested on the nightly versions of TensorFlow and the TF Object Detection API. Everything is OK. Thanks!
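For anyone hitting the same empty-file conversion, the fix reported above was moving to the nightly builds. A minimal sketch of that environment step (package name as of the TF 2.x nightlies; verify against current releases before relying on it):

```shell
# Sketch: install the nightly TensorFlow build, which carried the
# SSD export-to-TFLite support before it reached a stable release.
pip install -q tf-nightly
# Confirm the installed version string before re-running the conversion.
python -c "import tensorflow as tf; print(tf.__version__)"
```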
Awesome. Thanks for confirming!
Prerequisites
Please answer the following questions for yourself before submitting an issue.
1. The entire URL of the file you are using
https://github.com/tensorflow/models/tree/master/research/
2. Describe the bug
After successfully training an SSD MobileNet v2 FPNLite 320 model on my own data (inference on the last checkpoint works fine), I used export_tflite_graph_tf2.py and the Python API to get a TFLite model. However, the resulting .tflite file is only 544 bytes and cannot run inference.
3. Steps to reproduce
use a labeled dataset for 1 class (more classes not tested)
download the SSD MobileNet v2 FPNLite 320 config from link
in the config, change num_classes to 1, set fine_tune_checkpoint to "/content/models/research/deploy/ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8/checkpoint/ckpt-0", and update label_map_path and input_path for both train_input_reader and eval_input_reader
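The config edits above map onto the pipeline file roughly as follows (a sketch using the checkpoint path from this issue; the label map and TFRecord paths are placeholders, since they are not given in the thread):

```
model {
  ssd {
    num_classes: 1
    # ... rest of the downloaded SSD MobileNet v2 FPNLite 320 config ...
  }
}
train_config {
  fine_tune_checkpoint: "/content/models/research/deploy/ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8/checkpoint/ckpt-0"
}
train_input_reader {
  label_map_path: "PATH_TO_LABEL_MAP"  # placeholder
  tf_record_input_reader { input_path: "PATH_TO_TRAIN_TFRECORD" }  # placeholder
}
eval_input_reader {
  label_map_path: "PATH_TO_LABEL_MAP"  # placeholder
  tf_record_input_reader { input_path: "PATH_TO_EVAL_TFRECORD" }  # placeholder
}
```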
train model:
```shell
!python /content/models/research/object_detection/model_main_tf2.py \
  --pipeline_config_path={pipeline_file} \
  --model_dir={model_dir} \
  --alsologtostderr \
  --num_train_steps={12000} \
  --sample_1_of_n_eval_examples=1 \
  --num_eval_steps={500}
```
convert to a SavedModel using the last checkpoint as described in link
```shell
python /content/models/research/object_detection/export_tflite_graph_tf2.py \
  --pipeline_config_path /content/models/research/deploy/pipeline_file.config \
  --trained_checkpoint_dir /content/training \
  --output_directory tflite
```
get a basic TFLite model:
```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model('tflite/saved_model/')
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open('model_optimize2.tflite', 'wb') as f:
    f.write(tflite_model)
```
The TFLite model is generated without any error, but it is only 544 bytes.
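A quick way to confirm whether the exported file is a real model rather than an empty graph is to check its size and the flatbuffer identifier: a valid .tflite file carries the ASCII identifier "TFL3" at byte offset 4. The size threshold below is just a heuristic I'm assuming for a MobileNet-sized model, not an official limit:

```python
import os

def looks_like_real_tflite(path, min_bytes=100_000):
    """Heuristic sanity check on an exported .tflite file.

    Returns (has_tfl3_magic, big_enough). A ~544-byte file with a
    valid header is typically a structurally valid but empty model.
    """
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        header = f.read(8)
    # TFLite flatbuffers store the file identifier "TFL3" at offset 4.
    return header[4:8] == b"TFL3", size >= min_bytes
```

If the magic is present but the size check fails (as with the 544-byte file here), the converter itself ran, which points at the export step producing an empty graph rather than at the converter call.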
4. Expected behavior
A much larger .tflite file that can run inference.
5. Additional context
I get many warnings while converting the checkpoint to a saved graph.
6. System information