Exporting contrib.learn models for tensorflow_serving #228
Comments
Is there no help for this one? What I am struggling with is transforming the feature columns passed to the tf.contrib.learn.DNNLinearCombinedClassifier constructor, together with the features returned by the training input_fn, into the parameters required by the estimator.export function, and doing the analogous task inside the serving client. I tried to extract this data by running the ops returned by layers.input_from_feature_columns, just as composable_model.build_model creates the inputs for the network, but I run into dependencies that I have not figured out yet. Concretely, I temporarily stored the list of tensors returned by layers.input_from_feature_columns inside composable_model.build_model in a global var (feature_column_ops.stored_tensors) so I could run them later. This list looks like this:
But when I try to run these ops:
I get the following error:
|
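For reference, a minimal sketch of one way to wire the same feature columns into an export without running input_from_feature_columns by hand, using the contrib.learn-era APIs (roughly TF 1.0–1.2). The column names, model_dir and export path are hypothetical, and a trained checkpoint is assumed to exist in model_dir:

```python
import tensorflow as tf
from tensorflow.contrib import layers
from tensorflow.contrib.learn.python.learn.utils import input_fn_utils

# Hypothetical feature columns; in practice, reuse the exact lists that were
# passed to DNNLinearCombinedClassifier at training time.
age = layers.real_valued_column('age')
workclass = layers.sparse_column_with_hash_bucket('workclass', hash_bucket_size=100)
wide_columns = [workclass]
deep_columns = [age, layers.embedding_column(workclass, dimension=8)]

estimator = tf.contrib.learn.DNNLinearCombinedClassifier(
    model_dir='/tmp/wide_and_deep',          # assumed to contain a trained checkpoint
    linear_feature_columns=wide_columns,
    dnn_feature_columns=deep_columns,
    dnn_hidden_units=[100, 50])

# Derive a tf.Example parsing spec from the columns and wrap it in a serving
# input fn that accepts serialized Example protos as its single input.
feature_spec = layers.create_feature_spec_for_parsing(wide_columns + deep_columns)
serving_input_fn = input_fn_utils.build_parsing_serving_input_fn(feature_spec)

export_dir = estimator.export_savedmodel('/tmp/exported_model', serving_input_fn)
```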
@fventer - +1, I would like to know the answer as well. |
Well, I also want to use TensorFlow Serving, by reading more code in
|
+1 |
+1 as well. Need more documentation on the interplay between canned estimators and tensorflow serving. Thanks! |
Hi all,
I could get a canned estimator to export using the new estimator export_savedmodel method, but I am still figuring out what exactly the keys of the signatures should be in the client. TF Serving running the exported model gives me an error in the RPC response: (code=StatusCode.NOT_FOUND, details="FeedInputs: unable to find feed output inputs"). I am focusing on other work now, but as soon as I get time, I will come back to this problem and let you know once I have it figured out.
However, https://github.com/nubbel has done this successfully using a Ruby client. If someone can port his Ruby code to Python, please post your solution!
If we can get an additional tutorial in the TF Serving documentation based on a canned estimator like the deep+wide one, life would be so much easier!
Regards
Fritz Venter
|
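A matching client-side sketch, using the gRPC beta API that TensorFlow Serving shipped at the time; the model name, host/port and feature names are hypothetical, and the signature name must match a key in the exported model's SignatureDef map (for export_savedmodel the default key is 'serving_default'):

```python
from grpc.beta import implementations
from tensorflow_serving.apis import classification_pb2
from tensorflow_serving.apis import prediction_service_pb2

channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

request = classification_pb2.ClassificationRequest()
request.model_spec.name = 'wide_and_deep'            # hypothetical model name
request.model_spec.signature_name = 'serving_default'

# One tf.Example per instance; feature names must match the export parsing spec.
example = request.input.example_list.examples.add()
example.features.feature['age'].float_list.value.append(25.0)
example.features.feature['workclass'].bytes_list.value.append(b'Private')

result = stub.Classify(request, 10.0)  # 10-second timeout
print(result)
```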
Any feedback on this from the @tensorflow_serving team? This is important; no one can fully explain how to use estimators with Serving. |
.export() uses SessionBundle (deprecated export format); .export_savedmodel() is the recommended way to save a SavedModel. For SavedModel related documentation, please see: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md Also, for documentation related to setting up SignatureDefs for TensorFlow Serving, please see: https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/signature_defs.md @davidsoergel, can you add some details too? Thanks! |
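A quick way to see which SignatureDef keys and tensor aliases an export actually contains (and therefore which names a serving client has to use) is to load the SavedModel back in Python; the export path below is a hypothetical timestamped directory produced by export_savedmodel:

```python
import tensorflow as tf

export_dir = '/tmp/exported_model/1494230928'  # hypothetical export path

with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)
    # Print each signature key with its input/output aliases.
    for key, signature in meta_graph.signature_def.items():
        print(key)
        print('  inputs: ', list(signature.inputs.keys()))
        print('  outputs:', list(signature.outputs.keys()))
```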
@sukritiramesh Thanks a lot. I actually figured out how to serve tf.learn models (LinearRegressor, etc.). If you guys are busy, I can easily contribute a tutorial on how to serve these types of models, e.g. as an extension to the TensorFlow tf.learn tutorials (Linear or Wide & Deep). |
Yes please!! Been looking for an answer to this for a while |
I have been trying to figure this out for a week now. @MtDersvan, do you mind sharing how you accomplished this? |
Yes, I'll write a little tutorial this week. |
@MtDersvan, I'm waiting for your tutorial as well :) |
Ok, guys. @sukritiramesh if this is good and proper enough I can send a PR (at least the serving part); if not, I will be glad to hear any feedback on how to improve it. Known issues: if you are using the latest release, you will probably encounter tensorflow/tensorflow#9436 and #421. They are fairly simple to resolve. Big cheers to @fventer, @dale-cooper and @nubbel for their insights. |
Thanks for putting together this tutorial, @MtDersvan! Looping in a few folks who can help review it: @wolffg @dr4b @davidsoergel. |
Hi @MtDersvan! |
@lirchi Yes, it was a typo, you need to use serving_input_fn instead of export_input_fn. Fixed that. As for ReadBinaryProto, I never used it, so I don't know the cause of your error. It might be an exporting issue (using the new SavedModel instead of the old GraphDef). It definitely works with the latest model_servers:tensorflow_model_server. I think someone from the team or Stack Overflow can give a better answer. |
Thanks MtDersvan
This is long overdue!
|
@MtDersvan -- BTW, it is possible to load the model from C++ using |
It looks like your tutorial doesn't work anymore with tf 1.3 @MtDersvan |
@semicolo I see, I'll take time over the next several days to update it to 1.3. |
@MtDersvan I think there's a mixup between tensorflow/python/feature_column/feature_column.py and tensorflow/contrib/layers/python/layers/feature_column.py. |
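That mixup is the usual culprit in 1.3: the core tf.estimator canned estimators and tf.feature_column.make_parse_example_spec expect core (non-contrib) feature columns end to end, so mixing in tf.contrib.layers columns breaks the export path. A minimal TF 1.3 sketch with hypothetical feature names and paths (a trained checkpoint is assumed in model_dir):

```python
import tensorflow as tf

# Core (non-contrib) feature columns.
age = tf.feature_column.numeric_column('age')
workclass = tf.feature_column.categorical_column_with_hash_bucket(
    'workclass', hash_bucket_size=100)
wide_columns = [workclass]
deep_columns = [age, tf.feature_column.embedding_column(workclass, dimension=8)]

estimator = tf.estimator.DNNLinearCombinedClassifier(
    model_dir='/tmp/wide_and_deep',          # assumed to contain a trained checkpoint
    linear_feature_columns=wide_columns,
    dnn_feature_columns=deep_columns,
    dnn_hidden_units=[100, 50])

# Core export path: parsing spec and serving input receiver built from the
# same core columns, then exported as a SavedModel.
feature_spec = tf.feature_column.make_parse_example_spec(wide_columns + deep_columns)
serving_input_receiver_fn = (
    tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec))
export_dir = estimator.export_savedmodel('/tmp/exported_model', serving_input_receiver_fn)
```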
@semicolo I pushed the update. |
Also, I think an official extended tutorial is slowly making its way into the docs - Exporting a Trained Model for Serving. |
On an Android phone (Xiaomi 5s, Android 7.1), ReadBinaryProto crashes here too. It's weird that other Android phones don't crash here. @fventer: if you solved this issue, please tell me how, thanks~ |
@fventer there was a misunderstanding here; I passed a broken .pb file to ReadBinaryProto, so it crashed. Sorry for the noise~ |
Closing due to staleness. If this is still an issue, please file a new updated issue with current steps to reproduce the bug. If this is a question, please ask it on: https://stackoverflow.com/questions/tagged/tensorflow-serving Thanks! |
@MtDersvan Thanks so much for your work and the tutorial. It's super appreciated. =) 👍👍👍💯 |
Is it possible to add a TensorFlow Serving tutorial based on one of the contrib.learn estimators? It would be really nice if the tutorial showed how to export the deep-and-wide estimator from this tutorial: https://www.tensorflow.org/versions/r0.11/tutorials/wide_and_deep/index.html
If this is not possible in the short term, I would appreciate some help:
I have trouble deriving a deep-and-wide estimator for tensorflow_serving based on the provided TensorFlow Serving examples.
My estimator takes several input features and predicts 1 or 0. I have a mixture of numeric and categorical (sparse tensor) input features. However, the MNIST and Inception tutorials only show how to export a model for TensorFlow Serving that takes a single input 'x' (an image) and produces an output 'y'. How do I create a named_graph_signatures structure from a dictionary of possibly multiple types of input values?
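One common way to handle multiple input types is to expose a single string input that carries serialized tf.Example protos, so one named input covers a mix of numeric and sparse categorical features; this is essentially what the parsing serving input functions mentioned elsewhere in this thread do internally. A minimal sketch with hypothetical feature names:

```python
import tensorflow as tf

# One string placeholder carries serialized tf.Example protos.
serialized = tf.placeholder(tf.string, shape=[None], name='input_example_tensor')

# Mixed feature types parsed out of the same serialized input.
feature_spec = {
    'age': tf.FixedLenFeature([1], tf.float32),
    'workclass': tf.VarLenFeature(tf.string),  # parsed as a SparseTensor
}
features = tf.parse_example(serialized, feature_spec)
```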