
TFLite New Converter with 1.X frozen graph #38388

Closed
mhs4670go opened this issue Apr 9, 2020 · 3 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower TF 2.0 Issues relating to TensorFlow 2.0 TFLiteConverter For issues related to TFLite converter type:support Support issues

Comments

@mhs4670go

I tried to convert a 1.X frozen graph with the TFLite New Converter following these guidelines: https://www.tensorflow.org/guide/migrate#a_graphpb_or_graphpbtxt.

Here is my full python script.

import tensorflow as tf

graph_def = tf.compat.v1.GraphDef()
graph_def.ParseFromString(open(flags.input_path, 'rb').read())

# _parse_array splits a comma-separated flag string into a list of names
wrap_func = wrap_frozen_graph(
    graph_def,
    inputs=[_str + ":0" for _str in _parse_array(flags.input_arrays)],
    outputs=[_str + ":0" for _str in _parse_array(flags.output_arrays)])
converter = tf.lite.TFLiteConverter.from_concrete_functions([wrap_func])
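(For readers following along: `wrap_frozen_graph` is not a built-in TensorFlow API; it is the helper defined in the migration guide linked above. A sketch based on that guide:)

```python
import tensorflow as tf

def wrap_frozen_graph(graph_def, inputs, outputs):
    """Wrap a TF1 GraphDef into a callable TF2 ConcreteFunction."""
    def _imports_graph_def():
        tf.compat.v1.import_graph_def(graph_def, name="")
    # wrap_function re-traces the v1 graph inside a v2 FuncGraph
    wrapped_import = tf.compat.v1.wrap_function(_imports_graph_def, [])
    import_graph = wrapped_import.graph
    # prune() selects the feed/fetch tensors, yielding a ConcreteFunction
    return wrapped_import.prune(
        tf.nest.map_structure(import_graph.as_graph_element, inputs),
        tf.nest.map_structure(import_graph.as_graph_element, outputs))
```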

It works well. But is there any way for the user to set the input shapes? It's very hard to find this:(

Something like the tf.compat.v1.lite.TFLiteConverter.from_frozen_graph method in 1.X:

def from_frozen_graph(cls,
                      graph_def_file,
                      input_arrays,
                      output_arrays,
                      input_shapes=None)

I found something similar:

# Set the correct data type and shape; shape can also be (None, 224, 224, 3)
new_placeholder = tf.placeholder(tf.float32, shape=(1, 224, 224, 3), name='inputs_new_name')

# Here you need to state the name of the placeholder used as the original input placeholder
saver = tf.train.import_meta_graph(
    'path/to/.meta',
    input_map={"original_inputs_placeholder_name:0": new_placeholder})

But this is very inconvenient to use.

I'm looking for a similar feature in the 2.X converter, but I haven't found any:(

@mhs4670go mhs4670go added the TFLiteConverter For issues related to TFLite converter label Apr 9, 2020
@amahendrakar amahendrakar added comp:lite TF Lite related issues TF 2.0 Issues relating to TensorFlow 2.0 type:support Support issues labels Apr 13, 2020
@jvishnuvardhan
Contributor

jvishnuvardhan commented May 15, 2020

@mhs4670go Yes, there is an option to change the size during inference. Please check this comment. Please post standalone code to reproduce your issue. Thanks!

@jvishnuvardhan jvishnuvardhan added the stat:awaiting response Status - Awaiting response from author label May 15, 2020
@mhs4670go
Author

@jvishnuvardhan Thank you for your comment. Changing the size during inference might be enough, but actually I want to know whether I can convert a .pb file with a None shape into a .tflite file with a user-specified (not None) shape, like the v1 converter does, as I said above.

# This is the v1 converter's `input_shapes` option
input_shapes = None
if flags.input_shapes:
  # With this flag, a `.pb` with a None shape can be converted to a
  # `.tflite` with a known shape. Does the v2 converter have a similar option?
  input_shapes_list = [
      _parse_array(shape, type_fn=int)
      for shape in six.ensure_str(flags.input_shapes).split(":")
  ]
  input_shapes = dict(list(zip(input_arrays, input_shapes_list)))
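(For comparison, the mapping this flag performs can be sketched in plain Python, with no TensorFlow needed; `_parse_array` below is a hypothetical stand-in for the helper used in the snippet above. A colon-separated list of comma-separated shapes is zipped against the input array names:)

```python
# Sketch of how a v1-style --input_shapes flag string becomes the
# input_shapes dict passed to from_frozen_graph.
def _parse_array(values, type_fn=str):
    """Hypothetical helper: split a comma-separated string into a typed list."""
    return [type_fn(v) for v in values.split(",")]

def parse_input_shapes(input_arrays, input_shapes_flag):
    """Map '1,224,224,3:1,10'-style flag text onto named input arrays."""
    if not input_shapes_flag:
        return None
    shapes = [_parse_array(s, type_fn=int) for s in input_shapes_flag.split(":")]
    return dict(zip(input_arrays, shapes))

print(parse_input_shapes(["images", "labels"], "1,224,224,3:1,10"))
# → {'images': [1, 224, 224, 3], 'labels': [1, 10]}
```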

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Status - Awaiting response from author label May 18, 2020
@jvishnuvardhan jvishnuvardhan added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label May 19, 2020
@MeghnaNatraj
Member

MeghnaNatraj commented Jun 11, 2020

The link you are referring to (https://www.tensorflow.org/guide/migrate#a_graphpb_or_graphpbtxt) discusses how a frozen graph (in TensorFlow 1 model format) can be converted to a TensorFlow 2 model. That step is not required here.

To convert the frozen graph model to a TensorFlow Lite model:

  1. If you are using TensorFlow 1, the API is tf.lite.TFLiteConverter.from_frozen_graph
  2. If you are using TensorFlow 2, the API is tf.compat.v1.lite.TFLiteConverter.from_frozen_graph
# TensorFlow Version 2.2
import tensorflow as tf

converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='...path/to/frozen_graph.pb',
    input_arrays=...,
    output_arrays=...,
    input_shapes={'...': [_, _, ...]}
)
tflite_model = converter.convert()

# open().write() returns the number of bytes written
tflite_model_size = open('model.tflite', 'wb').write(tflite_model)
print('TFLite Model is %d bytes' % tflite_model_size)

The .from_frozen_graph API is defined as shown above; see the API documentation for the full set of attributes that can be set. To find the names of these input/output arrays, visualize the .pb file in Netron.
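(Alternatively, staying on the concrete-function path from the original script, a common workaround is to pin a static shape on the function's input tensors with `set_shape` before handing it to the converter. This is a sketch under that assumption, not a documented converter contract; `pin_input_shape` is a hypothetical helper name:)

```python
import tensorflow as tf

def pin_input_shape(concrete_func, shape):
    """Set a static shape on every non-resource input of a concrete function,
    so e.g. a (None, 224, 224, 3) placeholder becomes (1, 224, 224, 3)."""
    for tensor in concrete_func.inputs:
        if tensor.dtype != tf.resource:  # skip captured variable handles
            tensor.set_shape(shape)
    return concrete_func

# Usage sketch (names assumed from the original script):
#   wrap_func = pin_input_shape(wrap_func, [1, 224, 224, 3])
#   converter = tf.lite.TFLiteConverter.from_concrete_functions([wrap_func])
```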
