
How to save a model for TensorFlow Serving for the LSTM in tensorflow/contrib/timeseries/examples/lstm.py #16590

Closed
yangfengKAUST opened this issue Jan 30, 2018 · 9 comments

@yangfengKAUST

Please go to Stack Overflow for help and support:

https://stackoverflow.com/questions/tagged/tensorflow

If you open a GitHub issue, here is our policy:

  1. It must be a bug or a feature request.
  2. The form below must be filled out.
  3. It shouldn't be a TensorBoard issue. Those go here.

Here's why we have that policy: TensorFlow developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.


System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
  • TensorFlow installed from (source or binary):
  • TensorFlow version (use command below):
  • Python version:
  • Bazel version (if compiling from source):
  • GCC/Compiler version (if compiling from source):
  • CUDA/cuDNN version:
  • GPU model and memory:
  • Exact command to reproduce:

You can collect some of this information using our environment capture script:

https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh

You can obtain the TensorFlow version with

python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"

Describe the problem

When running the LSTM in tensorflow/contrib/timeseries/examples/lstm.py, I tried to add code to export the model as a SavedModel, but it raises an error:

File "/Users/yang/.local/lib/python3.4/site-packages/tensorflow/python/estimator/estimator.py", line 504, in export_savedmodel
serving_input_receiver = serving_input_receiver_fn()
File "/Users/yang/.local/lib/python3.4/site-packages/tensorflow/contrib/timeseries/python/timeseries/estimators.py", line 133, in _serving_input_receiver_fn
self._model.initialize_graph()
TypeError: initialize_graph() missing 1 required positional argument: 'input_statistics'

My guess is that self._model.initialize_graph() is being called with no arguments, but in

def initialize_graph(self, input_statistics):
    """Save templates for components, which can then be used repeatedly.
    This method is called every time a new graph is created. It's safe to start
    adding ops to the current default graph here, but the graph should be
    constructed from scratch.
    Args:
      input_statistics: A math_utils.InputStatistics object.
    """
    super(_LSTMModel, self).initialize_graph(input_statistics=input_statistics)
    with tf.variable_scope("", use_resource=True):
      # Use ResourceVariables to avoid race conditions.
      self._lstm_cell = tf.nn.rnn_cell.LSTMCell(num_units=self._num_units)
      # Create templates so we don't have to worry about variable reuse.
      self._lstm_cell_run = tf.make_template(
          name_="lstm_cell",
          func_=self._lstm_cell,
          create_scope_now_=True)
      # Transforms LSTM output into mean predictions.
      self._predict_from_lstm_output = tf.make_template(
          name_="predict_from_lstm_output",
          func_=functools.partial(tf.layers.dense, units=self.num_features),
          create_scope_now_=True)

the parameter input_statistics is required. How can I fix this issue?

Source code / logs

serving_input_receiver_fn = estimator.build_raw_serving_input_receiver_fn()
estimator.export_savedmodel(
    "../model",
    serving_input_receiver_fn
)
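Stripped of TensorFlow, the mechanics of the error look like this: the serving input receiver calls initialize_graph() with no arguments, but the method requires a positional argument. Giving the parameter a default restores the no-argument call (this is a minimal sketch of the failure mode, not the actual fix):

```python
class Model:
    """Stand-in for _LSTMModel with a required positional argument."""

    def initialize_graph(self, input_statistics):
        return input_statistics


class PatchedModel:
    """Same method, but input_statistics now has a default."""

    def initialize_graph(self, input_statistics=None):
        return input_statistics


# Calling with no arguments, as the serving input receiver does, fails:
try:
    Model().initialize_graph()
except TypeError as err:
    print(err)  # missing 1 required positional argument: 'input_statistics'

# With the default in place, the no-argument call is legal:
print(PatchedModel().initialize_graph())
```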
@allenlavoie
Member

Thank you for the report! It looks like there are a couple of issues with the example. I have a fix in the works which will include exporting in the example.

@yangfengKAUST
Author

Could you please tell me how to fix this issue, or what I can do to get the corrected version? I want to export this model as a SavedModel or SavedBundle.

@allenlavoie
Member

Getting the fix code reviewed is taking a bit longer than I expected. Here's the patch:

diff --git a/tensorflow/contrib/timeseries/examples/lstm.py b/tensorflow/contrib/timeseries/examples/lstm.py
index c834430b9..630f4fc05 100644
--- a/tensorflow/contrib/timeseries/examples/lstm.py
+++ b/tensorflow/contrib/timeseries/examples/lstm.py
@@ -20,12 +20,14 @@ from __future__ import print_function
 
 import functools
 from os import path
+import tempfile
 
 import numpy
 import tensorflow as tf
 
 from tensorflow.contrib.timeseries.python.timeseries import estimators as ts_estimators
 from tensorflow.contrib.timeseries.python.timeseries import model as ts_model
+from tensorflow.contrib.timeseries.python.timeseries import state_management
 
 try:
   import matplotlib  # pylint: disable=g-import-not-at-top
@@ -70,7 +72,7 @@ class _LSTMModel(ts_model.SequentialTimeSeriesModel):
     self._lstm_cell_run = None
     self._predict_from_lstm_output = None
 
-  def initialize_graph(self, input_statistics):
+  def initialize_graph(self, input_statistics=None):
     """Save templates for components, which can then be used repeatedly.
 
     This method is called every time a new graph is created. It's safe to start
@@ -168,12 +170,15 @@ class _LSTMModel(ts_model.SequentialTimeSeriesModel):
 
 
 def train_and_predict(
-    csv_file_name=_DATA_FILE, training_steps=200, estimator_config=None):
+    csv_file_name=_DATA_FILE, training_steps=200, estimator_config=None,
+    export_directory=None):
   """Train and predict using a custom time series model."""
   # Construct an Estimator from our LSTM model.
   estimator = ts_estimators.TimeSeriesRegressor(
       model=_LSTMModel(num_features=5, num_units=128),
-      optimizer=tf.train.AdamOptimizer(0.001), config=estimator_config)
+      optimizer=tf.train.AdamOptimizer(0.001), config=estimator_config,
+      # Set state to be saved across windows.
+      state_manager=state_management.ChainingStateManager())
   reader = tf.contrib.timeseries.CSVReader(
       csv_file_name,
       column_names=((tf.contrib.timeseries.TrainEvalFeatures.TIMES,)
@@ -192,6 +197,28 @@ def train_and_predict(
   predicted_mean = numpy.squeeze(numpy.concatenate(
       [evaluation["mean"][0], predictions["mean"]], axis=0))
   all_times = numpy.concatenate([times, predictions["times"]], axis=0)
+
+  # Export the model in SavedModel format.
+  if export_directory is None:
+    export_directory = tempfile.mkdtemp()
+  input_receiver_fn = estimator.build_raw_serving_input_receiver_fn()
+  export_location = estimator.export_savedmodel(
+      export_directory, input_receiver_fn)
+
+  # Predict using the SavedModel
+  with tf.Graph().as_default():
+    with tf.Session() as session:
+      signatures = tf.saved_model.loader.load(
+          session, [tf.saved_model.tag_constants.SERVING], export_location)
+      saved_model_output = (
+          tf.contrib.timeseries.saved_model_utils.predict_continuation(
+              continue_from=evaluation, signatures=signatures,
+              session=session, steps=100))
+      # The exported model gives the same results as the Estimator.predict()
+      # call above.
+      numpy.testing.assert_allclose(
+          predictions["mean"],
+          numpy.squeeze(saved_model_output["mean"], axis=0))
   return times, observed, all_times, predicted_mean
 
 
diff --git a/tensorflow/contrib/timeseries/examples/lstm_test.py b/tensorflow/contrib/timeseries/examples/lstm_test.py
index 3cace5672..ca56e38ca 100644
--- a/tensorflow/contrib/timeseries/examples/lstm_test.py
+++ b/tensorflow/contrib/timeseries/examples/lstm_test.py
@@ -36,7 +36,8 @@ class LSTMExampleTest(test.TestCase):
   def test_periodicity_learned(self):
     (observed_times, observed_values,
      all_times, predicted_values) = lstm.train_and_predict(
-         training_steps=100, estimator_config=_SeedRunConfig())
+         training_steps=100, estimator_config=_SeedRunConfig(),
+         export_directory=self.get_temp_dir())
     self.assertAllEqual([100], observed_times.shape)
     self.assertAllEqual([100, 5], observed_values.shape)
     self.assertAllEqual([200], all_times.shape)

@yangfengKAUST
Author

Awesome

@yangfengKAUST
Author

I can create a SavedModel now; however, I'm still confused about the SavedModel's inputs and outputs. Suppose someone else saved the model and I don't know exactly what the input and output signatures are — how can I get predictions in that case?

@allenlavoie
Member

The inputs and outputs are exercised in saved_model_utils (called from the LSTM example): https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/timeseries/python/timeseries/saved_model_utils.py

I made a change last week which makes it easier to cold-start from a SavedModel; until that lands in the next push (should be today?) you need the output of an Estimator to get started like in the examples, or to feed in state manually (ugly / not recommended). Is that the issue?
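As an aside, the input and output signatures of an exported model can also be inspected from the command line with the `saved_model_cli` tool that ships with TensorFlow. The directory below is a placeholder; substitute the timestamped path returned by `estimator.export_savedmodel()`:

```shell
# Placeholder export path; saved_model_cli prints every signature's
# input/output tensor names, dtypes, and shapes.
saved_model_cli show --dir /tmp/exported_model/1517300000 --all
```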

@allenlavoie
Member

It's synced: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/timeseries/examples/lstm.py#L239

There's no intermediate state saved with the model, so it does need a sequence as input in order to start making sensible predictions. Alternatively you could save the state spit out by Estimator.evaluate() if you know you'll be predicting starting from the end of the evaluation data.
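Since the evaluation output is a dict of numpy arrays, one way to persist that state across processes is to pickle it and reload it later to seed `predict_continuation()`. A minimal sketch, with illustrative stand-in keys rather than the real evaluation output:

```python
import os
import pickle
import tempfile

import numpy as np

# Stand-in for the dict returned by Estimator.evaluate(); the real dict
# contains the model state needed to continue predicting from the end of
# the evaluation data.
evaluation = {
    "times": np.arange(100),
    "mean": np.zeros((1, 100, 5)),
}

# Persist the state so a later serving process can pick it up.
path = os.path.join(tempfile.mkdtemp(), "eval_state.pkl")
with open(path, "wb") as f:
    pickle.dump(evaluation, f)

# Later, in another process, restore it and pass it as continue_from=.
with open(path, "rb") as f:
    restored = pickle.load(f)

print(np.array_equal(restored["times"], evaluation["times"]))  # True
```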

@tensorflowbutler
Member

Nagging Assignee @allenlavoie: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

@allenlavoie
Member

I think this is resolved, but feel free to follow up if something isn't clear.
