[TF Lite] How to Convert Tensorflow Model to TF Lite Model
MyungJoo Ham edited this page Oct 8, 2018 · 5 revisions
This is a basic guide for converting a TensorFlow model to a TensorFlow Lite model through TOCO.
Pre-trained TensorFlow models are available from the official documentation.
In this guide, ssd_mobilenet_v2_coco will be used.
$ curl http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v2_coco_2018_03_29.tar.gz | tar xzv -C ${PATH_TO_TF_MODEL}
$ git clone https://github.com/tensorflow/tensorflow.git
../tensorflow$ bazel run --config=opt //tensorflow/contrib/lite/toco:toco -- \
    --input_file=${PATH_TO_TF_MODEL}/frozen_inference_graph.pb \
    --output_file=${PATH_TO_TFLITE_MODEL}/ssd_mobilenet_v2_coco.tflite \
    --inference_type=FLOAT \
    --input_shape=1,300,300,3 \
    --input_array=Preprocessor/sub \
    --output_arrays=concat,concat_1
Description of the options:
- --input_shape: The dimensions of the input image tensor
- --input_array: The input tensor of the TensorFlow graph
- --output_arrays: The output tensors of the TensorFlow graph
To set the above options manually, you need to know the graph structure of the TensorFlow model.
If we want to use the whole model, with its original input/output tensors, we have to implement custom operators.
In this guide, we will set the input/output tensors manually so that custom operators are not needed.
1. SSL exception
If you encounter the log below, it could be related to a proxy.
...
ERROR: /home/tensorflow/tensorflow/tools/pip_package/BUILD:100:1: no such package
'@eigen_archive//': Error downloading [https://bitbucket.org/eigen/eigen/get/default.tar.gz]
to /home/.cache/bazel/_bazel_david/9a6a3e878a69de23dea27378a84b120f/external/eigen_archive/default.tar.gz:
sun.security.validator.ValidatorException: PKIX path building failed:
sun.security.provider.certpath.SunCertPathBuilderException:
unable to find valid certification path to requested target and referenced by
'//tensorflow/tools/pip_package:licenses'.
ERROR: Analysis of target '//tensorflow/tools/pip_package:build_pip_package' failed; build aborted.
...
The simplest way to solve this problem is to serve the file yourself with a simple Python file server.
- Download the file manually
$ wget https://bitbucket.org/eigen/eigen/get/default.tar.gz -P ${PATH_TO_STORE}
- Run a simple Python HTTP server on localhost, from the directory holding the file
${PATH_TO_STORE}$ python -m SimpleHTTPServer 8000
(With Python 3, use python3 -m http.server 8000 instead.)
- Edit
../tensorflow/tensorflow/workspace.bzl
and append
http://localhost:8000/default.tar.gz
to the archive URL list
...
tf_http_archive(
    name = "eigen_archive",
    urls = [
        "http://localhost:8000/default.tar.gz",
        ...
    ],
    sha256 = "d956415d784fa4e42b6a2a45c32556d6aec9d0a3d8ef48baee2522ab762556a9",
    strip_prefix = "eigen-eigen-fd6845384b86",
    build_file = clean_dep("//third_party:eigen.BUILD"),
)
...
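The workaround above can also be scripted end-to-end. The sketch below is a minimal stand-in using Python 3.7+'s http.server: it serves a directory over HTTP and fetches the archive back, which is what Bazel would do once workspace.bzl points at localhost. The temporary directory and dummy archive contents are placeholders for ${PATH_TO_STORE}/default.tar.gz, and the OS-assigned port stands in for the guide's port 8000.

```python
import http.server
import pathlib
import tempfile
import threading
import urllib.request
from functools import partial

# Stand-in for ${PATH_TO_STORE}: a temp directory holding the archive.
store = pathlib.Path(tempfile.mkdtemp())
(store / "default.tar.gz").write_bytes(b"dummy archive contents")

# Serve that directory over HTTP. Port 0 lets the OS pick a free port;
# the guide binds to 8000 instead.
handler = partial(http.server.SimpleHTTPRequestHandler, directory=str(store))
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Bazel would now download the archive from this URL via workspace.bzl.
url = "http://127.0.0.1:%d/default.tar.gz" % server.server_address[1]
fetched = urllib.request.urlopen(url).read()
print(fetched == (store / "default.tar.gz").read_bytes())  # True
server.shutdown()
```

After the download succeeds, the Bazel build of TOCO should proceed past the eigen_archive fetch.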