How to save a model for TensorFlow Serving for the LSTM in tensorflow/contrib/timeseries/examples/lstm.py #16590
Comments
Thank you for the report! It looks like there are a couple of issues with the example. I have a fix in the works which will include exporting in the example.
Could you please tell me how to fix this issue, or what I can do to get the correct version? I want to put this model into a SavedModel or SavedModelBundle.
Getting the fix code reviewed is taking a bit longer than I expected. Here's the patch:
Awesome
It's OK to create a SavedModel; however, I'm still confused about the inputs and outputs of the SavedModel. Suppose someone else saved the model: I wouldn't know exactly what the input and output segments are. In that situation, how can I get the result?
The inputs and outputs are exercised in saved_model_utils (called from the LSTM example): https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/timeseries/python/timeseries/saved_model_utils.py I made a change last week which makes it easier to cold-start from a SavedModel; until that lands in the next push (should be today?), you need the output of an Estimator to get started, as in the examples, or to feed in state manually (ugly / not recommended). Is that the issue?
It's synced: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/timeseries/examples/lstm.py#L239 There's no intermediate state saved with the model, so it does need a sequence as input in order to start making sensible predictions. Alternatively, you could save the state emitted by Estimator.evaluate() if you know you'll be predicting starting from the end of the evaluation data.
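The idea in the comment above can be illustrated without TensorFlow: evaluation over a warm-up series produces a model state, and prediction then consumes that state as its starting point instead of re-reading the series. A minimal, library-free sketch (all names here are hypothetical illustrations, not the tf.contrib.timeseries API):

```python
# Library-free sketch of carrying model state from evaluation into
# prediction. The "model" is a toy drift model: state is just the last
# observed value plus the mean step between observations.

def evaluate(series):
    """Consume a warm-up series and return the final model state."""
    steps = [b - a for a, b in zip(series, series[1:])]
    drift = sum(steps) / len(steps)
    return {"last_value": series[-1], "drift": drift}

def predict(state, horizon):
    """Continue the series from a saved state for `horizon` steps."""
    value = state["last_value"]
    out = []
    for _ in range(horizon):
        value += state["drift"]
        out.append(value)
    return out

if __name__ == "__main__":
    warm_up = [1.0, 2.0, 3.0, 4.0]   # evaluation data
    state = evaluate(warm_up)        # save this alongside the model
    print(predict(state, 3))         # continues from the end: [5.0, 6.0, 7.0]
```

Saving `state` next to the exported model is what lets prediction start from the end of the evaluation data without replaying the whole input sequence.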
Nagging Assignee @allenlavoie: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.
I think this is resolved, but feel free to follow up if something isn't clear. |
Please go to Stack Overflow for help and support:
https://stackoverflow.com/questions/tagged/tensorflow
If you open a GitHub issue, here is our policy:
Here's why we have that policy: TensorFlow developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.
System information
You can collect some of this information using our environment capture script:
https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh
You can obtain the TensorFlow version with
python -c "import tensorflow as tf; print(tf.GIT_VERSION, tf.VERSION)"
Describe the problem
When I run the LSTM in tensorflow/contrib/timeseries/examples/lstm.py, I try to add a method to save the model as a SavedModel, but it raises the error below.
File "/Users/yang/.local/lib/python3.4/site-packages/tensorflow/python/estimator/estimator.py", line 504, in export_savedmodel
serving_input_receiver = serving_input_receiver_fn()
File "/Users/yang/.local/lib/python3.4/site-packages/tensorflow/contrib/timeseries/python/timeseries/estimators.py", line 133, in _serving_input_receiver_fn
self._model.initialize_graph()
TypeError: initialize_graph() missing 1 required positional argument: 'input_statistics'
My guess is that self._model.initialize_graph() is called with no arguments, while initialize_graph() requires one positional parameter, input_statistics. But how do I fix this issue?
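The failure mode can be reproduced with a tiny mock, independent of TensorFlow: a method declared with a required positional argument raises exactly this TypeError when called without one, and passing the argument through resolves it. All class and variable names below are illustrative only, not the actual tf.contrib.timeseries code:

```python
# Minimal reproduction of the reported error, with no TensorFlow
# dependency. `Model` stands in for the time-series model;
# `input_statistics` is the required positional argument that the
# broken call site omitted.

class Model:
    def initialize_graph(self, input_statistics):
        self.input_statistics = input_statistics
        return "graph initialized"

model = Model()

# Broken call, as in the reported traceback:
try:
    model.initialize_graph()
except TypeError as err:
    print(err)  # ... missing 1 required positional argument: 'input_statistics'

# Fixed call: thread the statistics through to the model.
stats = {"total_observation_count": 1000}  # illustrative placeholder
print(model.initialize_graph(input_statistics=stats))
```

The upstream fix amounts to making the serving input receiver supply this argument rather than calling the method bare.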
Source code / logs