
Can python client be independent of tensorflow? #271

Closed
drcege opened this issue Dec 9, 2016 · 18 comments

@drcege

drcege commented Dec 9, 2016

In the original version of the client, https://github.com/tensorflow/serving/blob/e1c16e4859260fab02eab3af5414bf4b75a0b34f/tensorflow_serving/example/inception_client.py, tf was only used to handle command-line arguments. So I could write a python program that doesn't depend on tensorflow.

However, the new client example relies on tf to process image data:
https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/inception_client.py#49

request.inputs['images'].CopyFrom(
        tf.contrib.util.make_tensor_proto(data, shape=[1]))

And it imports predict_pb2, which also depends on tensorflow.

So how could I write a client that doesn't depend on tensorflow? Any example code?

@nubbel
Contributor

nubbel commented Dec 9, 2016

Hi @drcege!

I built a Ruby client for tensorflow/serving just by compiling the protos for Ruby (i.e. no custom code).
You can find it here: https://github.com/nubbel/tensorflow_serving_client-ruby#usage

You should be able to do the same for python.

@kirilg
Contributor

kirilg commented Dec 9, 2016

+1 to @nubbel's reply. The only TF Serving code that needs to be linked in is the predict API (proto files). This also requires the proto definition of TensorProto, which is in the TensorFlow repo. Other than that, you don't need any other TF code.
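
Concretely, compiling just those protos for Python with grpcio-tools might look roughly like this. This is an untested sketch: the checkout paths and the exact file list are assumptions, and protoc will tell you if any imported .proto is missing.

# Rough sketch (untested): generate the Python modules for the predict API
# and TensorProto without Bazel. Assumes the tensorflow/serving and tensorflow
# repos are checked out as ./serving and ./tensorflow.
from grpc_tools import protoc  # pip install grpcio-tools (older releases: grpc.tools)

PROTO_FILES = [
    'tensorflow_serving/apis/model.proto',
    'tensorflow_serving/apis/predict.proto',
    'tensorflow_serving/apis/prediction_service.proto',
    'tensorflow/core/framework/tensor.proto',
    'tensorflow/core/framework/tensor_shape.proto',
    'tensorflow/core/framework/types.proto',
    'tensorflow/core/framework/resource_handle.proto',
]

protoc.main(['protoc',
             '-I./serving',     # root containing tensorflow_serving/
             '-I./tensorflow',  # root containing tensorflow/
             '--python_out=.',
             '--grpc_python_out=.'] + PROTO_FILES)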

With a more involved change, you can define your own API to the model server (your own proto instead of predict_pb2) that doesn't depend on TF at all, and the client can use that.

@drcege
Author

drcege commented Dec 10, 2016

@nubbel @kirilg Thanks! I'm not familiar with google protobuf, but I will try.

@sebastian-schlecht

@drcege I started working on this in Python. My implementation is still very basic and the code still uses the tf contrib stuff, but if you come up with a solution to this, please let me know as well.

https://github.com/sebastian-schlecht/tensorflow-serving-python

@drcege
Author

drcege commented Dec 11, 2016

@sebastian-schlecht If you follow the code of tf.contrib.util.make_tensor_proto, you can see that it actually uses numpy and some proto files (tensor_pb2, tensor_shape_pb2, ...). So in fact, the client example can be independent of tensorflow; it's just not as clear as before.

It would be better if the client example could be more concise and independent.
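
For example, the TensorProto that make_tensor_proto(data, shape=[1]) builds for the inception client could be constructed directly from the generated modules, something like the untested sketch below (module paths depend on how the protos were compiled, and the helper name is made up):

# Untested sketch: the equivalent of tf.contrib.util.make_tensor_proto(data, shape=[1])
# for a single bytes value (an encoded image), built only from the generated modules.
from tensorflow.core.framework import tensor_pb2, tensor_shape_pb2, types_pb2

def make_string_tensor_proto(data):
    shape = tensor_shape_pb2.TensorShapeProto(
        dim=[tensor_shape_pb2.TensorShapeProto.Dim(size=1)])
    return tensor_pb2.TensorProto(dtype=types_pb2.DT_STRING,
                                  tensor_shape=shape,
                                  string_val=[data])

# request.inputs['images'].CopyFrom(make_string_tensor_proto(image_bytes))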

@simsicon

I would love to see an independent way to request tensorflow models that is as clear and intuitive as possible.

Bazel, protobuf and grpc are nice, but I think a more lightweight serving solution would be nice as well.

Any ideas? Thanks

@tobegit3hub
Contributor

tobegit3hub commented Jan 10, 2017

We have implemented an independent Python library for this in deep_recommend_system.

I'm not sure if contributions for this are welcome. We could contribute clients for TensorFlow Serving in different programming languages to this repository. Providing batteries-included clients is really helpful for TensorFlow users.

We may submit a PR for this and hope it helps 😃

@tobegit3hub
Contributor

Hi @drcege, I have submitted a PR about how to compile the independent Python client in #290.

You can follow the instructions to include the generated Python files without bazel. This library has been used in our production environment and may meet your requirements.

@drcege
Author

drcege commented Jan 10, 2017

@tobegit3hub
Your implementation still uses tensorflow? What I actually mean is no import tensorflow as tf and no tf.contrib.util.make_tensor_proto. In some cases, we do not want to install the entire tensorflow library.

@tobegit3hub
Contributor

The only place tensorflow (tf) is used is the call to tf.contrib.util.make_tensor_proto. You can definitely construct TensorProto without this function, but you need to compile most of the proto files in the TensorFlow repository.

Actually, we have to construct TensorProto by ourselves in Java. You can refer to this example code in InceptionPredictClient.java.

We can also generate all the proto files with protoc or grpcio. Anyway, I don't think it's necessary for Python.
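
For a numeric input the same idea works with numpy alone. An untested sketch (the field names come from tensor.proto; the helper name is made up and the module paths depend on how the protos were compiled):

# Untested sketch: pack a float32 numpy array into a TensorProto without TensorFlow.
# tensor_content holds the raw bytes; dtype and tensor_shape describe how to read them.
import numpy as np
from tensorflow.core.framework import tensor_pb2, tensor_shape_pb2, types_pb2

def ndarray_to_tensor_proto(array):
    array = np.asarray(array, dtype=np.float32)
    dims = [tensor_shape_pb2.TensorShapeProto.Dim(size=d) for d in array.shape]
    return tensor_pb2.TensorProto(
        dtype=types_pb2.DT_FLOAT,
        tensor_shape=tensor_shape_pb2.TensorShapeProto(dim=dims),
        tensor_content=array.tobytes())

# request.inputs['images'].CopyFrom(ndarray_to_tensor_proto(image_batch))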

@CrowbarKZ

+1 to @simsicon's idea of a lightweight client which could be used without tf dependencies. It would be great to have such an option.

@stianlp

stianlp commented Jul 4, 2017

I'm trying to write a lightweight client myself. I cloned this repo https://www.npmjs.com/package/tf-serving-nodejs-client and created the *_pb2_grpc.py and *_pb2.py files with
python -m grpc.tools.protoc -I./protos --python_out=. --grpc_python_out=. protos/*

I am getting an error after using tf.contrib.util.make_tensor_proto and calling MergeFrom with the return value as a parameter.

Here's the stack trace:

Traceback (most recent call last):
  File "client.py", line 87, in <module>
    tf.app.run()
  File "/Users/slp/anaconda/envs/ml/lib/python3.5/site-packages/tensorflow/python/platform/app.py", line 48, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "client.py", line 83, in main
    FLAGS.concurrency, FLAGS.num_tests)
  File "client.py", line 68, in do_inference
    request.inputs['images'].CopyFrom(tp)
  File "/Users/slp/anaconda/envs/ml/lib/python3.5/site-packages/google/protobuf/message.py", line 118, in CopyFrom
    self.MergeFrom(other_msg)
  File "/Users/slp/anaconda/envs/ml/lib/python3.5/site-packages/google/protobuf/internal/python_message.py", line 1209, in MergeFrom
    'expected %s got %s.' % (cls.__name__, msg.__class__.__name__))
TypeError: Parameter to MergeFrom() must be instance of same class: expected TensorProto got TensorProto.

The problem is that MergeFrom expects <class 'tensorflow.core.framework.tensor_pb2.TensorProto'>, but gets the TensorProto I generated, which is <class 'tensor_pb2.TensorProto'>.

I have checked my .proto files, they don't differ from those used in tfserving and tensorflow. So the only issue seems to be that tf.contrib.util.make_tensor_proto returns a <class 'tensorflow.core.framework.tensor_pb2.TensorProto'> which is not the same type as <class 'tensor_pb2.TensorProto'>.
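
One workaround for the mismatch, since both classes are generated from the same tensor.proto, is to round-trip the message through its serialized form. An untested sketch against the client.py variables above (this still imports tensorflow for make_tensor_proto, it only gets past the MergeFrom error):

# tf, data and request are as in client.py; tensor_pb2 is my locally generated module.
import tensor_pb2

tp = tf.contrib.util.make_tensor_proto(data, shape=[1])
local_tp = tensor_pb2.TensorProto()
local_tp.ParseFromString(tp.SerializeToString())  # same wire format, different class
request.inputs['images'].CopyFrom(local_tp)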

@tobegit3hub Haven't tried your implementation yet, but it looks like your repo has a lot of tensorflow stuff in it. It's getting late here; I'll check out your implementation tomorrow. Would be cool if we could implement a client without tf.

@tobegit3hub
Contributor

Hi @stianlp, you can check out the python_predict_client. There are generated gRPC files which do not depend on TensorFlow.

Actually we use tf.contrib.util.make_tensor_proto to construct TensorProto objects easily. You can check out its source code if you don't want to use it directly.
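
Once the generated modules are importable, a minimal Predict call without importing tensorflow might look roughly like this untested sketch (host/port, model name, signature name and the input/output keys are placeholders for whatever the exported model actually uses):

# Untested sketch: call Predict using only grpcio and the generated modules.
import grpc
from tensorflow.core.framework import tensor_pb2, tensor_shape_pb2, types_pb2
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = 'inception'
request.model_spec.signature_name = 'predict_images'

with open('cat.jpg', 'rb') as f:
    request.inputs['images'].CopyFrom(tensor_pb2.TensorProto(
        dtype=types_pb2.DT_STRING,
        tensor_shape=tensor_shape_pb2.TensorShapeProto(
            dim=[tensor_shape_pb2.TensorShapeProto.Dim(size=1)]),
        string_val=[f.read()]))

response = stub.Predict(request, 10.0)  # 10 second timeout
print(response.outputs['classes'])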

@stianlp

stianlp commented Jul 6, 2017

Tested it last night, works like a charm :) Thank you 🥇

@stianlp

stianlp commented Jan 28, 2018

I just wanted to check in here and tell you that I wrote a client that is not dependent on TensorFlow.

I also wrote two blog posts on Medium, one on how to save a model for TF Serving and one about the client.

https://medium.com/@stianlindpetlund/tensorflow-serving-101-pt-1-a79726f7c103
https://medium.com/@stianlindpetlund/tensorflow-serving-101-pt-2-682eaf7469e7

Feel free to use the client :)

@nuchi

nuchi commented Mar 25, 2018

Chiming in for anyone else following: I wrote a tool to generate a package of tf and tf-serving python protobuf interfaces. https://github.com/nuchi/tf_pb_without_tf

It generates a wheel which contains things like TensorProto, PredictionServiceStub, etc. It downloads TF and TF Serving to generate the wheel, but then you can install the result anywhere and don't need TF or TF serving installed.

@gautamvasudevan
Collaborator

TF Serving now also supports a REST API, which is fairly easy to build a client around given the plethora of libraries supporting REST APIs across various languages.
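
For example, assuming tensorflow_model_server was started with --rest_api_port=8501, a request needs nothing more than an HTTP library; a rough sketch (the model name and the shape of the instances are placeholders):

# Untested sketch: query the REST API with plain HTTP, no tensorflow and no grpc.
import json
import requests

payload = {"instances": [[1.0, 2.0, 5.0]]}  # must match the model's expected input
resp = requests.post("http://localhost:8501/v1/models/my_model:predict",
                     data=json.dumps(payload))
resp.raise_for_status()
print(resp.json()["predictions"])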

@alexeckert

Hi, I wouldn't say that the REST API is a good substitute, as gRPC with tensors often offers better performance characteristics. It would be very convenient if tf.contrib.util.make_tensor_proto could live in its own package, to avoid tf and all its dependencies.
