
Serving problem: "Op type not registered 'PyFunc'" #113

Closed
vaklyuenkov opened this issue May 31, 2017 · 6 comments


@vaklyuenkov

vaklyuenkov commented May 31, 2017

Hello!

I've successfully trained a model and want to use TensorFlow Serving components to export it and serve it with the standard tensorflow_model_server. Using this code I can export the model for serving:

net = resnetv1(batch_size=1, num_layers=101)
net.create_architecture("TEST", 2,
                        tag='default', anchor_scales=[8, 16, 32])

variables_to_restore = slim.get_variables_to_restore()
init_fn = slim.assign_from_checkpoint_fn(checkpoint_path, variables_to_restore)

saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)

with tf.Session() as sess:
    init_fn(sess)
    print('Exporting trained model to', export_path)
    model_exporter.init(
        sess.graph.as_graph_def(),
        named_graph_signatures={
            'inputs': exporter.generic_signature(
                {'image': net._image, 'size': net._im_info}),
            'outputs': exporter.generic_signature(
                {'scores': net._predictions['cls_prob'],
                 'bbox_pred': net._predictions['bbox_pred'],
                 'rois': net._predictions['rois']})})

    model_exporter.export(export_path, tf.constant(export_version), sess)
    print('model ', export_version, ' exported')

But when I try to serve the model, serving shows the error: "Op type not registered 'PyFunc'".

This is because of these tf.py_func calls:

anchors, anchor_length = tf.py_func(generate_anchors_pre,
                                    [height, width,
                                     self._feat_stride, self._anchor_scales, self._anchor_ratios],
                                    [tf.float32, tf.int32], name="generate_anchors")
rois, rpn_scores = tf.py_func(proposal_layer,
                              [rpn_cls_prob, rpn_bbox_pred, self._im_info, self._mode,
                               self._feat_stride, self._anchors, self._num_anchors],
                              [tf.float32, tf.float32])
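(For context, the wrapped generate_anchors_pre is essentially plain NumPy that tiles a set of base anchors over every cell of the feature map. A simplified sketch of that computation — shift_anchors and its arguments are illustrative, not the repository's exact signature — where every step has a direct TensorFlow counterpart:)

```python
import numpy as np

def shift_anchors(height, width, feat_stride, base_anchors):
    """Tile base anchors over every feature-map cell.

    Each step (arange, meshgrid, reshape, broadcast add) has a direct
    TensorFlow equivalent, which is why this layer can in principle be
    re-expressed without tf.py_func.
    """
    shift_x = np.arange(width) * feat_stride
    shift_y = np.arange(height) * feat_stride
    sx, sy = np.meshgrid(shift_x, shift_y)
    # One (dx, dy, dx, dy) shift per feature-map cell: shape (K, 4).
    shifts = np.stack([sx.ravel(), sy.ravel(), sx.ravel(), sy.ravel()], axis=1)
    # Broadcast: (K, 1, 4) + (1, A, 4) -> (K, A, 4) -> (K * A, 4).
    anchors = shifts[:, np.newaxis, :] + base_anchors[np.newaxis, :, :]
    return anchors.reshape(-1, 4).astype(np.float32)

# A single 16x16 base anchor tiled over a 2x2 grid with stride 16.
base = np.array([[0, 0, 15, 15]], dtype=np.float32)
print(shift_anchors(2, 2, 16, base).shape)  # (4, 4)
```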

How can I rewrite these functions using native TensorFlow ops so that the model can be served?

Grateful for any thoughts!

@endernewton
Owner

i haven't tried serving with this code. i have py_func wrapped in the code, so that's probably what's causing the problem. closing for now

@markusnagel

Hi @vaklyuenkov,

I have this model running in TF Serving. As you already pointed out, the issue is the use of tf.py_func in some of the layers. TF Serving is written in C++ and therefore does not support custom Python layers. As far as I'm aware there is only one way to solve it: replacing all Python layers with equivalent TensorFlow operations or layers.

In my own fork I made basic implementations of all layers required for inference. It also contains an example script (in tools/export_tf_serving.py) which should work out of the box if you have the demo running. I have all layer implementations (without the example) in a separate branch and plan to make a PR soon so it can be merged into the main repository. I hope that helps you.
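(To illustrate the replacement pattern — this is a hypothetical reference, not the code from my fork: the box-clipping step inside proposal_layer is elementwise NumPy, and np.minimum/np.maximum map one-to-one onto tf.minimum/tf.maximum, so that step needs no py_func wrapper at all:)

```python
import numpy as np

def clip_boxes(boxes, im_height, im_width):
    """Clip (x1, y1, x2, y2) boxes to the image frame.

    np.maximum / np.minimum correspond directly to tf.maximum /
    tf.minimum, so this step can live inside the graph as native ops.
    """
    x1 = np.maximum(np.minimum(boxes[:, 0], im_width - 1), 0)
    y1 = np.maximum(np.minimum(boxes[:, 1], im_height - 1), 0)
    x2 = np.maximum(np.minimum(boxes[:, 2], im_width - 1), 0)
    y2 = np.maximum(np.minimum(boxes[:, 3], im_height - 1), 0)
    return np.stack([x1, y1, x2, y2], axis=1)

boxes = np.array([[-5.0, 10.0, 700.0, 400.0]])
print(clip_boxes(boxes, 375, 500))  # [[  0.  10. 499. 374.]]
```

The remaining steps of proposal_layer follow the same idea: top-k scoring and non-maximum suppression also have native counterparts (tf.nn.top_k, tf.image.non_max_suppression), which is what makes a py_func-free inference graph possible.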

@endernewton
Owner

endernewton commented Jun 6, 2017 via email

@vaklyuenkov
Author

@markusnagel thank you very much. Again, sorry for the spam.
It's awesome!

@vaklyuenkov
Author

@markusnagel, would you also implement these layers for training?

@rishabhmalhotra

Is there any parallel implementation or workaround for this? I'm interested in whether I can deploy this using TF Serving, since the official TensorFlow docs say a graph containing py_func cannot be serialized.
