ServingInputReceiver passes Estimator model_fn only dictionary of features, but model_fn is allowed to take single tensor feature #11674
Comments
Incidentally, there is no type check to ensure that the output of … (Not that there should be one. It would be pretty annoying.)
@fchollet, @martinwicke, do you have any thoughts on how this should be handled?
@nkashy1 From my understanding of export_savedmodel on custom models, you will need to work with feature dicts for the export to work. I came across this same issue, since I just pass tensors to the model. So what I did was check what type of features is being passed in (raw tensors or a dict of features). If it is just tensors, I run the normal code; if it is a features dict, I convert it to tensors using input_from_feature_columns.
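The dict-vs-tensor dispatch described in the comment above can be sketched in plain Python. This is an illustration only: the helper name `unpack_features` and its `key` parameter are made up here, and in a real `model_fn` the dict branch would go through a conversion such as `input_from_feature_columns` rather than a plain key lookup.

```python
# Plain-Python sketch of the workaround described above. `unpack_features`
# and the `key` argument are illustrative names, not TensorFlow API.
def unpack_features(features, key="x"):
    """Return a single feature object whether `features` is a dict or not.

    In a real model_fn, the dict branch would instead convert the feature
    dict to a tensor (e.g. via tf.contrib.layers.input_from_feature_columns).
    """
    if isinstance(features, dict):
        return features[key]  # or convert the whole dict to one tensor
    return features

# Both call styles now reach the same model code:
print(unpack_features([1.0, 2.0]))          # raw "tensor" passed straight through
print(unpack_features({"x": [1.0, 2.0]}))   # dict unpacked by key
```

With a shim like this at the top of `model_fn`, the rest of the model code can be written against a single tensor regardless of how the features arrive.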
@michaelpetruzzellocivicom: That's right, you need feature dictionaries in your custom estimator. Although one can convert to a single tensor the way you did above, and also directly, the point is that the API is inconsistent. Either the API needs to be changed (so that …)
@davidsoergel Can you comment on the ServingInputReceiver issue?
I confirm this is annoying. It is also unnecessary. Yes, you need to name your things properly in communication protocols, but the serving_input_fn function is there precisely so that you can unpackage your data from the comms protocol and transform it into the "features" that your model understands. The two lines where this now breaks seem to have been written recently, and neither needs the features to be a dictionary. The breakage seems to be unintentional:
It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.
@karmel This is another issue you could look at.
@karmel did you fix this? Should I close it?
We have a fix in review right now; will update this thread when that is ready.
I was actually looking into this feature right now. Where can I check out the fix?
…ss raw tensors to model functions. Addresses tensorflow#11674. PiperOrigin-RevId: 187552824
We have added a TensorServingInputReceiver that can accept and pass along raw tensors. You can use this class instead of the normal ServingInputReceiver if you would like to avoid having your model_fn input wrapped in a dictionary. The tests include an example of using this class with estimators and export_savedmodel. This is scheduled to be released with v1.7. Let me know if you have any questions, and thanks @nkashy1 for the report and samples.
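A minimal sketch of the new class in use, based on the description above. This assumes the TF 1.7 API (`tf.estimator.export.TensorServingInputReceiver`); the placeholder name and shape are made up, and the TensorFlow import is done lazily inside the function so the sketch can be defined standalone.

```python
def serving_input_fn():
    # Lazy import so this sketch can be defined without TensorFlow installed.
    # Assumes TensorFlow >= 1.7, where TensorServingInputReceiver was added;
    # the placeholder name and shape here are illustrative.
    import tensorflow as tf

    x = tf.placeholder(dtype=tf.float32, shape=[None, 3], name="x")
    return tf.estimator.export.TensorServingInputReceiver(
        features=x,                 # handed to model_fn as a bare tensor, no dict
        receiver_tensors={"x": x},  # names exposed in the serving signature
    )

# Usage (estimator is a hypothetical tf.estimator.Estimator instance):
# estimator.export_savedmodel("/tmp/export", serving_input_fn)
```

The only change from a ServingInputReceiver-based export is the receiver class; the `features` argument is no longer forced into a dictionary before reaching `model_fn`.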
This is not very well documented; I only saw ServingInputReceiver in the docs and ran into the problem that my features were not a dict.
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes, I have written custom code. There is more discussion of the bug in this Jupyter notebook: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Goobuntu
- TensorFlow installed from (source or binary): PyPI
- TensorFlow version (use command below): 1.2.1
- Python version: 2.7.6
- Bazel version (if compiling from source): Not relevant
- CUDA/cuDNN version: Not relevant
- GPU model and memory: Not relevant
- Exact command to reproduce: Check this notebook: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611
Describe the problem
The tf.estimator.Estimator interface allows users to provide a model_fn which accepts features either as a single tensor or as a dictionary mapping strings to tensors. The Estimator export_savedmodel method requires a serving_input_receiver_fn argument, which is a function of no arguments that produces a ServingInputReceiver. The features tensors from this ServingInputReceiver are passed to the model_fn for serving. Upon instantiation, the ServingInputReceiver wraps single-tensor features into a dictionary. This raises an error for estimators whose model_fn expects a single tensor as its features argument.
Source code / logs
Gist: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611
You can run that notebook to see log messages, etc.
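To make the failure mode concrete, here is a TensorFlow-free sketch of the wrapping behavior described above. The dictionary key `"feature"` is an assumption about ServingInputReceiver's internal default, and `model_fn` here is a trivial stand-in, not real estimator code.

```python
# Illustrative sketch, not TensorFlow code: ServingInputReceiver wraps a bare
# features tensor into a dict, so a model_fn written for a bare tensor breaks.
SINGLE_FEATURE_KEY = "feature"  # assumed default key; check the TF source

def wrap_features(features):
    """Mimic ServingInputReceiver wrapping non-dict features into a dict."""
    if not isinstance(features, dict):
        return {SINGLE_FEATURE_KEY: features}
    return features

def model_fn(features):
    """Stand-in for a model_fn that expects a bare numeric features value."""
    return features * 2.0  # raises TypeError if features is a dict

wrapped = wrap_features(3.0)  # becomes {"feature": 3.0}
try:
    model_fn(wrapped)
except TypeError:
    print("model_fn choked on the wrapped dict")  # the reported bug, in miniature
```

A dict-accepting model_fn is unaffected, which is why the inconsistency only surfaces at export/serving time.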
Misc
Possibly related to this stackoverflow thread: https://stackoverflow.com/questions/42835809/how-to-export-estimator-model-with-export-savedmodel-function