
DNNClassifier estimator cannot be exported #12508

Closed
mvsusp opened this issue Aug 23, 2017 · 25 comments

Comments

@mvsusp

mvsusp commented Aug 23, 2017



System information

  • tensorflow/tensorflow:latest container
  • Ubuntu Linux
  • Installed from pip
  • TensorFlow version: 'v1.2.0-5-g435cdfc', '1.2.1'
  • Python 2.7

Exact command to reproduce

classifier = DNNClassifier(feature_columns=feature_columns,
                           hidden_units=[10, 20, 10],
                           n_classes=3,
                           model_dir=model_path)

classifier.export_savedmodel(MODEL_PATH, script.serving_input_receiver_fn)

Describe the problem

Trying to export the DNNClassifier model throws this exception:

Exception during training: A default input_alternative must be provided.
 Traceback (most recent call last):
  File "algo.py", line 78, in train
    nn.export_savedmodel(MODEL_PATH, script.serving_input_receiver_fn, default_output_alternative_key=None)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 1280, in export_savedmodel
    actual_default_output_alternative_key)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py", line 259, in build_all_signature_defs
    raise ValueError('A default input_alternative must be provided.')

The problem happens because the DNNClassifier constructor creates a head whose name is None: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/learn/python/learn/estimators/dnn.py#L365

@facaiy
Member

facaiy commented Aug 26, 2017

Hi @mvsusp, could you give a minimal reproducible test case? Something seems wrong with your serving_input_fn.

@mvsusp
Author

mvsusp commented Aug 26, 2017

Hi, @facaiy. Here is a minimal reproducible test case:

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os
import urllib

import numpy as np
import tensorflow as tf

# Data sets
from tensorflow.contrib.learn import DNNClassifier

IRIS_TRAINING = "iris_training.csv"
IRIS_TRAINING_URL = "http://download.tensorflow.org/data/iris_training.csv"

IRIS_TEST = "iris_test.csv"
IRIS_TEST_URL = "http://download.tensorflow.org/data/iris_test.csv"


def main():
    # If the training and test sets aren't stored locally, download them.
    if not os.path.exists(IRIS_TRAINING):
        raw = urllib.urlopen(IRIS_TRAINING_URL).read()
        with open(IRIS_TRAINING, "w") as f:
            f.write(raw)

    if not os.path.exists(IRIS_TEST):
        raw = urllib.urlopen(IRIS_TEST_URL).read()
        with open(IRIS_TEST, "w") as f:
            f.write(raw)

    # Load datasets.
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TRAINING,
        target_dtype=np.int,
        features_dtype=np.float32)

    # Specify that all features have real-value data
    feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]

    # Build 3 layer DNN with 10, 20, 10 units respectively.
    classifier = DNNClassifier(feature_columns=feature_columns,
                               hidden_units=[10, 20, 10],
                               n_classes=3,
                               model_dir="/tmp/iris_model")
    # Define the training inputs
    train_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"x": np.array(training_set.data)},
        y=np.array(training_set.target),
        num_epochs=None,
        shuffle=True)

    # Train model.
    classifier.fit(input_fn=train_input_fn, steps=2000)

    def serving_input_fn():
        inputs = {'x': tf.placeholder(tf.float32, [4])}
        return tf.estimator.export.ServingInputReceiver(inputs, inputs)

    classifier.export_savedmodel(export_dir_base="/tmp/iris_model", serving_input_fn=serving_input_fn)

if __name__ == "__main__":
    main()

The error is:

Traceback (most recent call last):
  File "/reproducible_example.py", line 64, in <module>
    main()
  File "/reproducible_example.py", line 61, in main
    classifier.export_savedmodel(export_dir_base="/tmp/iris_model", serving_input_fn=serving_input_fn)
  File "/Users/mvs/python2.7/lib/python2.7/site-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 1280, in export_savedmodel
    actual_default_output_alternative_key)
  File "/Users/mvs/python2.7/lib/python2.7/site-packages/tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py", line 259, in build_all_signature_defs
    raise ValueError('A default input_alternative must be provided.')
ValueError: A default input_alternative must be provided.

@facaiy
Member

facaiy commented Aug 26, 2017

Thanks for your test case, @mvsusp. It's really concise.

If I understand correctly, learn.DNNClassifier expects an InputFnOps, not a ServingInputReceiver. Hence, you may be better off building an InputFnOps yourself, or using tf.contrib.learn.build_parsing_serving_input_fn for tf.Example input; see contrib.learn#Input_processing for more.

Correct me if I'm wrong, but I believe ServingInputReceiver is intended for tf.estimator.DNNClassifier, which was introduced in 1.3, not 1.2.1. So upgrading your TensorFlow is also an option.

By the way, tf.estimator and tf.contrib.learn are different modules and may be incompatible with each other.
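
For reference, a minimal sketch of that approach (my assumption of how it could look for the iris example above; the exported model then expects serialized tf.Example protos at serving time):

import tensorflow as tf

# A sketch for the contrib.learn export path (TF 1.2.x), assuming the
# same 4-dimensional "x" feature as in the test case above.
feature_spec = {'x': tf.FixedLenFeature(shape=[4], dtype=tf.float32)}

# build_parsing_serving_input_fn returns a function producing an
# InputFnOps, which is what contrib.learn's export_savedmodel expects.
serving_input_fn = tf.contrib.learn.build_parsing_serving_input_fn(feature_spec)

classifier.export_savedmodel("/tmp/iris_model", serving_input_fn)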

@facaiy
Member

facaiy commented Aug 26, 2017

Unfortunately, I don't know yet. Perhaps the API documentation is the most reliable source, apart from the source code itself.

Thanks, happy weekend, @mvsusp .

@mvsusp
Author

mvsusp commented Aug 27, 2017

@facaiy following your suggestion I got stuck on another issue: Classification input must be a single string Tensor; got {'x': <tf.Tensor 'Placeholder:0' shape=(4,) dtype=float32>}

It seems that I cannot feed a float tensor for classification using TF Serving?

My code:

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os
import urllib

import numpy as np
import tensorflow as tf

# Data sets

IRIS_TRAINING = "iris_training.csv"
IRIS_TRAINING_URL = "http://download.tensorflow.org/data/iris_training.csv"

IRIS_TEST = "iris_test.csv"
IRIS_TEST_URL = "http://download.tensorflow.org/data/iris_test.csv"


def main():
    # If the training and test sets aren't stored locally, download them.
    if not os.path.exists(IRIS_TRAINING):
        raw = urllib.urlopen(IRIS_TRAINING_URL).read()
        with open(IRIS_TRAINING, "w") as f:
            f.write(raw)

    if not os.path.exists(IRIS_TEST):
        raw = urllib.urlopen(IRIS_TEST_URL).read()
        with open(IRIS_TEST, "w") as f:
            f.write(raw)

    # Load datasets.
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=IRIS_TRAINING,
        target_dtype=np.int,
        features_dtype=np.float32)

    # Specify that all features have real-value data
    feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]

    # Build 3 layer DNN with 10, 20, 10 units respectively.
    classifier = tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                            hidden_units=[10, 20, 10],
                                            n_classes=3,
                                            model_dir="/tmp/iris_model")
    # Define the training inputs
    train_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"x": np.array(training_set.data)},
        y=np.array(training_set.target),
        num_epochs=None,
        shuffle=True)

    # Train model.
    classifier.train(input_fn=train_input_fn, steps=2000)

    def serving_input_fn():
        inputs = {'x': tf.placeholder(tf.float32, [4])}
        return tf.estimator.export.ServingInputReceiver(inputs, inputs)

    classifier.export_savedmodel(export_dir_base="/tmp/iris_model", serving_input_receiver_fn=serving_input_fn)

if __name__ == "__main__":
    main()

How can I fix that? Thank you for all the support.

@facaiy
Member

facaiy commented Aug 27, 2017

How about using build_raw_serving_input_receiver_fn?

Perhaps tf.estimator.export will be useful for you. Good luck.
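
For instance, a minimal sketch of that approach (assuming the same 4-feature "x" input as above; note the extra batch dimension):

import tensorflow as tf

# A sketch using the raw-tensor receiver from tf.estimator (TF 1.3+),
# assuming the same "x" feature; shape [None, 4] leaves room for a batch.
features = {'x': tf.placeholder(dtype=tf.float32, shape=[None, 4], name='x')}
serving_input_receiver_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(features)

classifier.export_savedmodel("/tmp/iris_model", serving_input_receiver_fn)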

@mvsusp
Author

mvsusp commented Aug 27, 2017 via email

@mvsusp
Author

mvsusp commented Aug 28, 2017

Hi @facaiy

The issue was solved using tf.estimator.export.build_parsing_serving_input_receiver_fn.

Thank you!

@mvsusp mvsusp closed this as completed Aug 28, 2017
@samithaj

@mvsusp Can you please post the fixed code for your example? I'm trying to export and serve a DNNLinearCombinedRegressor model, and I can't find any working example.

@mvsusp
Author

mvsusp commented Aug 28, 2017

Hello @samithaj

Canned estimators don't have a lot of documentation yet. Here is my code:

import os

import numpy as np
import tensorflow as tf

INPUT_TENSOR_NAME = 'inputs'


def estimator(model_path):
    feature_columns = [tf.feature_column.numeric_column(INPUT_TENSOR_NAME, shape=[4])]
    return tf.estimator.DNNClassifier(feature_columns=feature_columns,
                                      hidden_units=[10, 20, 10],
                                      n_classes=3,
                                      model_dir=model_path)


def serving_input_receiver_fn():
    feature_spec = {INPUT_TENSOR_NAME: tf.FixedLenFeature(dtype=tf.float32, shape=[4])}
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()


def train_input_fn(training_dir):
    training_set = tf.contrib.learn.datasets.base.load_csv_with_header(
        filename=os.path.join(training_dir, 'iris_training.csv'),
        target_dtype=np.int,
        features_dtype=np.float32)

    return tf.estimator.inputs.numpy_input_fn(
        x={INPUT_TENSOR_NAME: np.array(training_set.data)},
        y=np.array(training_set.target),
        num_epochs=None,
        shuffle=True)()
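
Tying the pieces above together, training and exporting might look like this (a sketch; model_path and training_dir stand in for real directories):

# A sketch of wiring the functions above together; model_path and
# training_dir are hypothetical placeholders for real directories.
classifier = estimator(model_path)
classifier.train(input_fn=lambda: train_input_fn(training_dir), steps=2000)
classifier.export_savedmodel(model_path, serving_input_receiver_fn)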

Hope it helps you!

@facaiy
Member

facaiy commented Aug 29, 2017

Cool, @mvsusp.

By the way, if tf.feature_column is used, the feature_spec can be generated automatically, like:

    feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)

@samithaj

Thanks @mvsusp.
For anyone looking for a complete example, check here: https://github.com/MtDersvan/tf_playground

@AKhilGarg91

AKhilGarg91 commented Nov 22, 2017

@mvsusp @MtDersvan this doesn't work in version 1.4.0.

@MtDersvan
Contributor

@AKhilGarg91 Do you have any tracebacks?

@AKhilGarg91

@MtDersvan No. When I run this file I get an error during export_savedmodel: too many values to unpack.
https://github.com/AKhilGarg91/tf_playground/blob/master/wide_and_deep_tutorial/wide_and_deep_export_r1.3.ipynb

@AKhilGarg91

@MtDersvan Have you tried running the command below on Windows? Is Serving available for Windows or not?

$ bazel build //tensorflow_serving/model_servers:tensorflow_model_server

@twksos

twksos commented Dec 1, 2017

@AKhilGarg91

I got the too many values to unpack error on 1.4, too.

After a small dig into it, I changed
tf.estimator.export.build_parsing_serving_input_receiver_fn
to
tensorflow.contrib.learn.build_parsing_serving_input_fn

In tensorflow/contrib/learn/python/learn/utils/saved_model_export_utils.py, lines 157-161:

  if isinstance(input_ops, input_fn_utils.InputFnOps):
    features, unused_labels, default_inputs = input_ops  # <- should go here
    input_alternatives[DEFAULT_INPUT_ALTERNATIVE_KEY] = default_inputs
  else:
    features, unused_labels = input_ops  # <- this line fails

The InputFnOps docstring says:

Contents of this file are moved to tensorflow/python/estimator/export.py.
InputFnOps is renamed to ServingInputReceiver.
build_parsing_serving_input_fn is renamed to
  build_parsing_serving_input_receiver_fn.
build_default_serving_input_fn is renamed to
  build_raw_serving_input_receiver_fn.

It seems the new class causes the error.

if isinstance(input_ops, input_fn_utils.InputFnOps):

should change to something like

if isinstance(input_ops, input_fn_utils.InputFnOps) or isinstance(input_ops, export.ServingInputReceiver):

My workaround is to use tensorflow.contrib.learn.build_parsing_serving_input_fn instead for now.
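
For reference, the full workaround might look like this (a sketch; feature_columns and export_dir_base are assumed from your own model):

import tensorflow as tf

# A sketch of the workaround on TF 1.4: the contrib.learn builder returns
# an InputFnOps, which contrib's export path still unpacks correctly.
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_fn = tf.contrib.learn.build_parsing_serving_input_fn(feature_spec)

estimator.export_savedmodel(export_dir_base, serving_input_fn)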

@AKhilGarg91

@twksos Thanks. Yes, that worked for me as well; that's the one I had tried before too.

@Anmol-Sharma
Contributor

Anmol-Sharma commented Dec 6, 2017

Hey everyone, I'm trying to export a CNN model which accepts 200x200 RGB images as inputs. However, while exporting the model I'm getting the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-35-bdf37a7e35ac> in <module>()
----> 1 model.export_savedmodel(export_dir_base="/home/ubuntu/CNNexport/",serving_input_receiver_fn=serving_input_fn)

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/tensorflow/python/estimator/estimator.py in export_savedmodel(self, export_dir_base, serving_input_receiver_fn, assets_extra, as_text, checkpoint_path)
    515           serving_input_receiver.receiver_tensors,
    516           estimator_spec.export_outputs,
--> 517           serving_input_receiver.receiver_tensors_alternatives)
    518 
    519       if not checkpoint_path:

~/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/tensorflow/python/estimator/export/export.py in build_all_signature_defs(receiver_tensors, export_outputs, receiver_tensors_alternatives)
    191     receiver_tensors = {_SINGLE_RECEIVER_DEFAULT_NAME: receiver_tensors}
    192   if export_outputs is None or not isinstance(export_outputs, dict):
--> 193     raise ValueError('export_outputs must be a dict.')
    194 
    195   signature_def_map = {}

ValueError: export_outputs must be a dict.

Here are my feature spec and serving input fn definitions:

feature_spec = {'images': tf.FixedLenFeature([200, 200, 3], tf.float32)}

def serving_input_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[None],
                                           name='input_tensors')
    receiver_tensors = {'inputs': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, feature_spec)
    print(features)
    print(tf.estimator.export.ServingInputReceiver(features, receiver_tensors))
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

and the call to export_savedmodel: model.export_savedmodel(export_dir_base="/home/ubuntu/CNNexport/", serving_input_receiver_fn=serving_input_fn)

I'm not sure what's causing export_outputs not to be a dict. Can anyone help me figure out what I'm doing wrong?

@Anmol-Sharma
Contributor

I have also tried using build_parsing_serving_input_receiver_fn and build_raw_serving_input_receiver_fn, but in the end I always get the same error.

@MtDersvan
Contributor

MtDersvan commented Dec 6, 2017

@AKhilGarg91 @twksos Thanks for the information. Can you open an issue or a PR in the tutorial repo, so we can deal with it there? Also, @twksos, it looks like an r1.4 backward-compatibility issue, as

tf.estimator.export.build_parsing_serving_input_receiver_fn()

is already in the core library, so technically it should work. Maybe the TF core team can verify this? Otherwise, a new issue might need to be created.
Does tensorflow.contrib.learn.build_parsing_serving_input_fn work adequately and as expected for you?

@felicitywang

felicitywang commented Dec 11, 2017

@Anmol-Sharma I guess you need to return export_outputs in your model_fn, like:

        return tf.estimator.EstimatorSpec(
            mode=mode,
            predictions=predictions,
            loss=loss,
            train_op=train_op,
            export_outputs=export_outputs)
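
where export_outputs might be built like this (a sketch; predictions is assumed to be the model's output dict):

# A sketch of building export_outputs inside model_fn; 'serving_default'
# is the signature key used when the client does not name one.
export_outputs = {
    'serving_default': tf.estimator.export.PredictOutput(predictions)
}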

@Anmol-Sharma
Contributor

@felicitywang Did it work for your use case?

@samithaj

TensorServingInputReceiver, which can accept and pass along raw tensors
(#11674 (comment)),
is out in TensorFlow 1.7.0-rc0.
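
Usage might look roughly like this (a sketch, assuming a model that takes a single 4-feature float tensor):

import tensorflow as tf

# A sketch of TensorServingInputReceiver (TF 1.7+): it passes a single
# raw tensor to the model instead of a dict of features.
def serving_input_receiver_fn():
    inputs = tf.placeholder(dtype=tf.float32, shape=[None, 4], name='inputs')
    return tf.estimator.export.TensorServingInputReceiver(inputs, inputs)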

@balzer82

@MtDersvan

Does tensorflow.contrib.learn.build_parsing_serving_input_fn work adequately and as expected for you?

Thanks! You are right: if I trained a model with contrib.learn (like DNNRegressor), I have to use tf.contrib.learn.build_parsing_serving_input_fn.

Working example:

def serving_input_receiver_fn():
    feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
    return tf.contrib.learn.build_parsing_serving_input_fn(feature_spec)()

servable_model_dir = "./DNNRegressors/Servable/"

regressor.export_savedmodel(servable_model_dir, serving_input_receiver_fn)
