
ServingInputReceiver passes Estimator model_fn only a dictionary of features, but model_fn is allowed to take a single tensor feature #11674

Closed
zomglings opened this issue Jul 21, 2017 · 18 comments
Labels: type:feature Feature requests

zomglings (Contributor) commented Jul 21, 2017

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
    Yes, I have written custom code. I have more discussion about the bug in this Jupyter notebook: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
    Goobuntu

  • TensorFlow installed from (source or binary):
    PyPI

  • TensorFlow version (use command below):
    1.2.1

  • Python version:
    2.7.6

  • Bazel version (if compiling from source):
    Not relevant

  • CUDA/cuDNN version:
    Not relevant

  • GPU model and memory:
    Not relevant

  • Exact command to reproduce:
    Check this notebook: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611

Describe the problem

The tf.estimator.Estimator interface allows users to provide a model_fn that accepts its features argument either as a single tensor or as a dictionary mapping strings to tensors.

The Estimator export_savedmodel method requires a serving_input_receiver_fn argument: a function of no arguments that returns a ServingInputReceiver. The feature tensors from this ServingInputReceiver are passed to model_fn for serving.

Upon instantiation, however, ServingInputReceiver wraps a single-tensor features argument into a dictionary. This raises an error for estimators whose model_fn expects a single tensor as its features argument.
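The wrapping behavior described above can be sketched in plain Python. This is an illustrative stand-in, not TensorFlow's actual source; the default key name "feature" is an assumption for the sketch:

```python
# Illustrative stand-in for ServingInputReceiver's dict-wrapping step.
# Not TensorFlow's actual source; the key name "feature" is an assumption.
def wrap_features(features):
    if not isinstance(features, dict):
        # A bare tensor is silently wrapped into a one-entry dict.
        return {"feature": features}
    return features

# A model_fn written against a bare tensor therefore receives a dict at
# serving time, and tensor operations applied to `features` fail.
```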

Source code / logs

Gist: https://gist.github.com/nkashy1/fc1ec4ee218963216dea3ab5242bf611

You can run that notebook to see log messages, etc.

Misc

Possibly related to this stackoverflow thread: https://stackoverflow.com/questions/42835809/how-to-export-estimator-model-with-export-savedmodel-function

zomglings (Contributor, Author) commented Jul 21, 2017

Incidentally, there is no type check to ensure that the output of serving_input_receiver_fn is a ServingInputReceiver.

(Not that there should be one. It would be pretty annoying.)

aselle (Contributor) commented Jul 26, 2017

@fchollet, @martinwicke, do you have any thoughts on how this should be handled?

@aselle aselle added type:feature Feature requests stat:awaiting response Status - Awaiting response from author stat:awaiting tensorflower Status - Awaiting response from tensorflower and removed stat:awaiting response Status - Awaiting response from author labels Jul 26, 2017
ghost commented Jul 28, 2017

@nkashy1

From my understanding of export_savedmodel on custom models, you need to work with feature dicts for the export to work.

I ran into this same issue, since I just pass tensors to the model. What I did was check what type of features is being passed in (a bare tensor or a dict of features). If it is a bare tensor, run the normal code; if it is a features dict, convert it to tensors using input_from_feature_columns.

logits = features
if isinstance(features, dict):
    # Serving passed a features dict; build the input tensor from feature columns.
    input_layer = tensorflow.contrib.layers.input_from_feature_columns(features, self.feature_columns)
    logits = input_layer

zomglings (Contributor, Author) commented

@michaelpetruzzellocivicom: That's right; given how ServingInputReceiver is implemented, you need feature dictionaries in your custom estimator. But the documentation for input_fn and model_fn claims that features can be either bare tensors or feature dictionaries.

Although one can convert to a single tensor the way you did above (or directly), the point is that the API is inconsistent. Either the API contract needs to change so that input_fn and model_fn are no longer expected to take bare tensors (mostly a documentation change at this point, but an important design decision), or the implementation of ServingInputReceiver needs to change.
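The direct conversion mentioned above (pulling the lone tensor back out inside model_fn) can be sketched as follows. This is a minimal illustration with a hypothetical helper name, not part of the Estimator API:

```python
def unwrap_single_feature(features):
    # If serving wrapped a lone tensor into a one-entry dict, pull it back out;
    # otherwise pass the features through unchanged.  Hypothetical helper for
    # illustration only.
    if isinstance(features, dict) and len(features) == 1:
        return next(iter(features.values()))
    return features
```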

martinwicke (Member) commented

@davidsoergel Can you comment on the ServingInputReceiver issue?

martin-gorner commented Aug 25, 2017

I confirm this is annoying.

The previous contract was clear: whatever the serving_input_fn function returns as "features" is exactly what model_fn will receive as "features". Now there is an additional constraint that the features be a dictionary. This is unexpected, undocumented, and will make a lot of people stumble.

It is also unnecessary. Yes, you need to name things properly in communication protocols, but the serving_input_fn function exists precisely so that you can unpack your data from the wire protocol and transform it into the "features" your model understands.

The two lines where this now breaks appear to have been written recently, and neither needs the features to be a dictionary, so the breakage seems unintentional:

  • saved_model_export_utils.py L163 tests "if features", which fails on a Tensor.
  • estimator.py L1269 accesses features.keys, which fails on a Tensor.
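The two failure modes above can be demonstrated, and fixed, in plain Python. This is an illustrative sketch of the safer checks, not the actual TensorFlow source; FakeTensor only mimics tf.Tensor's refusal to be used as a boolean in graph mode:

```python
class FakeTensor:
    """Mimics tf.Tensor, whose truth value is undefined in graph mode."""
    def __bool__(self):
        raise TypeError("The truth value of a Tensor is undefined.")

def has_features(features):
    # "if features:" would raise on a Tensor; an identity check does not.
    return features is not None

def feature_keys(features):
    # Guard the .keys() access behind an isinstance check.
    return list(features.keys()) if isinstance(features, dict) else []

t = FakeTensor()
# Both checks handle a bare tensor without raising.
print(has_features(t), feature_keys(t))
```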

tensorflowbutler (Member) commented

It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

martinwicke (Member) commented

@karmel This is another issue you could look at.

tensorflowbutler (Member) commented

Nagging Assignee: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

martinwicke (Member) commented

@karmel did you fix this? Should I close it?

@tensorflowbutler tensorflowbutler removed the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jan 23, 2018
tensorflowbutler (Member) commented

A member of the TensorFlow organization has replied after the stat:awaiting tensorflower label was applied.

tensorflowbutler (Member) commented

Nagging Assignee: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@karmel karmel assigned karmel and unassigned davidsoergel Feb 7, 2018
tensorflowbutler (Member) commented

Nagging Assignee: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

karmel commented Feb 22, 2018

We have a fix in review right now; will update this thread when that is ready.

fchollet (Member) commented

I was actually looking into this feature right now. Where can I check out the fix?

jhseu pushed a commit to jhseu/tensorflow that referenced this issue Mar 4, 2018:
"…ss raw tensors to model functions. Addresses tensorflow#11674."
PiperOrigin-RevId: 187552824
tensorflowbutler (Member) commented

Nagging Assignee @karmel: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

karmel commented Mar 9, 2018

We have added a TensorServingInputReceiver that can accept and pass along raw tensors. You can use this class instead of the normal ServingInputReceiver if you would like to avoid having your model_fn input wrapped in a dictionary. The tests include an example of using this class with estimators and export_savedmodel.

This is scheduled to be released with v1.7. LMK if you have any questions, and thanks @nkashy1 for the report and samples.
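A minimal usage sketch of the class described above, written against the TF 1.x-era API. The placeholder shape [None, 3] and the name "inputs" are arbitrary choices for illustration:

```python
import tensorflow as tf

def serving_input_receiver_fn():
    # Placeholder shape [None, 3] is an arbitrary choice for this sketch.
    inputs = tf.placeholder(tf.float32, shape=[None, 3], name="inputs")
    # TensorServingInputReceiver passes `inputs` to model_fn as a raw tensor,
    # without wrapping it into a feature dict.
    return tf.estimator.export.TensorServingInputReceiver(
        features=inputs, receiver_tensors=inputs)

# Then, with an estimator in hand:
# estimator.export_savedmodel(export_dir_base, serving_input_receiver_fn)
```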

iteal commented Mar 11, 2019

This is not very well documented. I only saw ServingInputReceiver in the docs and ran into the problem that my features were not a dict.


10 participants