Transform ONNX to TF SavedModel #490

Closed · wjiuhe opened this issue Aug 27, 2019 · 18 comments · Fixed by #603

Comments

@wjiuhe (Author) commented Aug 27, 2019

Currently, the SavedModel format is commonly used in TensorFlow, especially in TensorFlow Serving.

https://www.tensorflow.org/beta/guide/saved_model

Would you consider supporting transformation of ONNX to TF SavedModel?

I tried out a piece of code based on the existing code; although the implementation is far from good, it worked in my scenario.

It would be convenient if the transformation to SavedModel were implemented, for example:

from onnx_tf.backend import toSavedModel
model = onnx.load('mnist.onnx')
export_path = './'
toSavedModel(model, export_path)

FYI, below is the existing code I used as a reference, followed by the piece of code I've tested.

# Reference: the run() method in onnx_tf's backend (signature only)
def run(self, inputs, **kwargs):
    ...

import os
import shutil
import tempfile

import numpy as np
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

# Set up the export path
MODEL_DIR = tempfile.gettempdir()
version = 1
export_path = os.path.join(MODEL_DIR, str(version))
print(f'export_path = {export_path}')

if os.path.isdir(export_path):
    print('Already saved a model, cleaning up')
    shutil.rmtree(export_path)

# Load the ONNX file
model = onnx.load('mnist.onnx')

# Import the ONNX model to TensorFlow
tf_rep = prepare(model)
# `img` is a preprocessed MNIST image from the surrounding code
inputs = np.asarray(img, dtype=np.float32)[np.newaxis, np.newaxis, :, :]
input_tensor = tf_rep.tensor_dict[tf_rep.inputs[0]]
output_tensor = tf_rep.tensor_dict[tf_rep.outputs[0]]

# Save the model as a SavedModel
with tf_rep.graph.as_default():
    with tf.Session() as sess:
        # Build the feed dict the same way run() does
        if isinstance(inputs, dict):
            feed_dict = inputs
        elif isinstance(inputs, (list, tuple)):
            if len(tf_rep.inputs) != len(inputs):
                raise RuntimeError(
                    'Expected {} values for uninitialized '
                    'graph inputs ({}), but got {}.'.format(
                        len(tf_rep.inputs),
                        ', '.join(tf_rep.inputs),
                        len(inputs)))
            feed_dict = dict(zip(tf_rep.inputs, inputs))
        else:
            # single input
            feed_dict = {tf_rep.inputs[0]: inputs}

        feed_dict = {
            tf_rep.tensor_dict[key]: feed_dict[key] for key in tf_rep.inputs
        }

        sess.run(tf.global_variables_initializer())
        outputs = [tf_rep.tensor_dict[output] for output in tf_rep.outputs]
        tf.saved_model.simple_save(
            sess,
            export_path,
            inputs={'input_image': input_tensor},
            outputs={output_tensor.name: output_tensor})
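
To sanity-check the export, I load it back with the TF 1.x loader. This is just a minimal sketch reusing export_path, input_tensor, output_tensor, and inputs from above; simple_save preserves the original tensor names, so they can be looked up directly:

# Sanity check: load the SavedModel back and run one inference (TF 1.x APIs)
with tf.Session(graph=tf.Graph()) as sess:
    # Load the graph and variables under the "serve" tag
    tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_path)
    # simple_save keeps the original tensor names, so look them up by name
    in_t = sess.graph.get_tensor_by_name(input_tensor.name)
    out_t = sess.graph.get_tensor_by_name(output_tensor.name)
    print(sess.run(out_t, feed_dict={in_t: inputs}))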
@chinhuang007 (Collaborator) commented

I also used tf.saved_model.simple_save to create a SavedModel in my prototype code for training, so I agree producing such models would be a nice addition to the onnx-tf converter. One caveat, however: the simple_save function is deprecated, https://www.tensorflow.org/api_docs/python/tf/saved_model/simple_save. Since TF 2.0 should be released pretty soon, I would suggest looking into the new APIs or some other ways to create the SavedModel files.
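
For reference, the non-deprecated TF 1.x route is SavedModelBuilder with an explicit serving signature. A minimal sketch, assuming the sess, export_path, input_tensor, and output_tensor from the snippet above:

# Same export via SavedModelBuilder instead of the deprecated simple_save
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
# Declare an explicit prediction signature for TF Serving
signature = tf.saved_model.signature_def_utils.predict_signature_def(
    inputs={'input_image': input_tensor},
    outputs={'output': output_tensor})
builder.add_meta_graph_and_variables(
    sess,
    [tf.saved_model.tag_constants.SERVING],
    signature_def_map={
        tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            signature
    })
builder.save()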

@wjiuhe (Author) commented Aug 28, 2019

> I also used tf.saved_model.simple_save to create a SavedModel in my prototype code for training. [...] I would suggest looking into the new APIs or some other ways to create the SavedModel files.

After a peek at the TF 2.0 code, it seems tf.saved_model.save() is implemented:

https://github.com/tensorflow/tensorflow/blob/aac18a3ffb2b9744f026eda4970369d2cf548980/tensorflow/python/saved_model/saved_model.py#L33

and the implementation can be found here:

https://github.com/tensorflow/tensorflow/blob/aac18a3ffb2b9744f026eda4970369d2cf548980/tensorflow/python/saved_model/save.py#L675

I will try out the Python API later and post the results.

@wjiuhe (Author) commented Aug 30, 2019

I've tested the API on my own trained model:

tf.saved_model.save(model, "./model_path/")

and it works as expected. However, onnx_tf doesn't support TF 2.0 yet, e.g. #454. For now, I haven't tested it against an ONNX pb.
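
Loading it back in TF 2.0 is symmetric. A minimal sketch; the 'serving_default' key is an assumption that holds when the saved object carries a default serving signature (e.g. a Keras model):

import tensorflow as tf

# Load the SavedModel back with the TF 2.0 API
loaded = tf.saved_model.load("./model_path/")
# Exported signatures are exposed as concrete functions
print(list(loaded.signatures.keys()))
infer = loaded.signatures["serving_default"]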

@chinhuang007 (Collaborator) commented

Yeah, we plan to support TF 2.0 in our 1.6 release (not in the immediate 1.5 release).

@romeolandry commented

Thank you for your work!
Are there solutions for TensorFlow 2.0? I still get the error module 'tensorflow' has no attribute 'ceil', even though I cloned the master branch of the PR.

@chinhuang007 (Collaborator) commented

This PR, #531, is supposed to address the TF 2.0 API issues. Please check it out and comment on whether it works for you. Saving as a SavedModel will be looked at next.
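
In the meantime, for the ceil error specifically: tf.ceil was removed from the top-level namespace in TF 2.0 and now lives under tf.math.ceil, so a temporary alias can unblock you (a stopgap, not a real fix):

import tensorflow as tf

# tf.ceil was removed in TF 2.0; the op now lives under tf.math.ceil.
# Aliasing it back is only a stopgap until onnx-tf's TF 2.0 support lands.
if not hasattr(tf, 'ceil'):
    tf.ceil = tf.math.ceil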

@stev3 commented Jan 20, 2020

Any update on this issue? I agree with @wjiuhe - this would be a nice feature.

@chinhuang007 (Collaborator) commented

We are looking into exporting ONNX models into SavedModel format, along with a few other general requirements and fixes. We will certainly give it high priority and provide updates soon.

@chinhuang007 (Collaborator) commented Jan 31, 2020

Quick update after the first round of investigation: TF 2.0 seems to promote tf.function, eager mode, and SavedModel, while discouraging direct use of tf.Graph (https://www.tensorflow.org/api_docs/python/tf/Graph#using_graphs_directly_deprecated), so I think we should eventually move away from operating directly on the graph. Use of tf.Module, which is supported by the saved_model APIs, is what I have in mind. Since this is going to be a major overhaul of our code base, I would like to know whether there are better or easier ways to align with TF 2.0 and future versions, where deprecated APIs might be removed completely. Comments and recommendations are welcome.

@domino14 commented Mar 6, 2020

With the TF 2.0 API, does anyone know how we can export to SavedModel? Is there a sample script someone has written for this? I know onnx-tf doesn't support it yet, but if anyone knows a way to do it, please let me know.

@chinhuang007 (Collaborator) commented

We already have a prototype using tf.Module. Hopefully we can make it production-ready soon.

@domino14 commented Mar 6, 2020

Do you have example code?

@chinhuang007 (Collaborator) commented Mar 9, 2020

A simple code snippet:

import numpy as np
import tensorflow as tf

dtype = tf.float32
dtype2 = tf.int32
to_dtype = tf.int64
path = './add_cast_saved_model'

class OnnxTfBackend(tf.Module):
    def __init__(self, name=None):
        super().__init__(name=name)

class CastHandler(OnnxTfBackend):
    def __init__(self):
        super(CastHandler, self).__init__()

    @tf.function
    def __call__(self, x, y):
        z = tf.cast(x, y)
        return z

class AddHandler(OnnxTfBackend):
    @tf.function
    def __call__(self, x, y):
        z = x + y
        return z

class GraphHandler(OnnxTfBackend):
    @tf.function
    def __call__(self, x, y):
        z = to_dtype
        m1 = AddHandler()
        m2 = CastHandler()
        a = m1(x, y)
        b = m2(a, z)
        return b

##########################################
m = GraphHandler()
# Trace a concrete function for each supported input dtype
for d in [dtype, dtype2]:
    signatures = dict()
    signatures['x'] = tf.TensorSpec(None, d)
    signatures['y'] = tf.TensorSpec(None, d)
    m.__call__.get_concrete_function(**signatures)

tf.saved_model.save(m, path)

m2 = tf.saved_model.load(path)

print(m2(tf.constant([1]), tf.constant([2])))
print(m2(tf.constant([1., 2.]), tf.constant([3., 4.])))
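
One caveat (my understanding of the TF 2.x behavior, so treat it as an assumption): since tf.saved_model.save() above is called without an explicit signatures= argument, the loaded object's signatures map may be empty; the direct calls work because __call__ itself is restored. Serving tools that need a named signature can get one like this:

# The loaded module is callable, but may expose no named signatures
print(list(m2.signatures.keys()))

# To export an explicit serving signature, pass a concrete function
tf.saved_model.save(
    m, path,
    signatures=m.__call__.get_concrete_function(
        tf.TensorSpec(None, dtype), tf.TensorSpec(None, dtype)))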

@Siri-KA commented Apr 3, 2020

Will it work with TensorFlow version 1.15.0?

@FlorentijnD commented

Any updates on the progress?

@jaggernaut007 commented

> A simple code snippet: [snippet quoted above]

Looking forward to this update! Meanwhile, does this prototype code work?

@gigadeplex commented

Any update? I really need this now.

@chinhuang007 linked a pull request May 28, 2020 that will close this issue
@AlexeyAB commented Jul 19, 2020

@chinhuang007 Hi,

Can we expect TF2 saved_model.pb support within the month? As I understand it, PR #603 will solve this issue, right?
