Pop from empty context_switches when taking outputs of one estimator.predict as inputs of another #20506
Comments
@mrry, could you please take a look? (Are you the right person?)
Thanks for the straightforward reproducer, @miacro. It looks like this might be a bug. If you change this code:

```python
model1_output = model1.predict(input_fn=get_input_fn([1, 2, 3, 4]))
model2_input_fn = get_input_fn(model1_output)
for item in model2.predict(model2_input_fn):
    print(item)
```

...to the following:

```python
model1_output = model1.predict(input_fn=get_input_fn([1, 2, 3, 4]))
model1_output = list(model1_output)
model2_input_fn = get_input_fn(model1_output)
for item in model2.predict(model2_input_fn):
    print(item)
```

...the program behaves correctly. I'll assign this to @ispirmustafa, since he made some changes to context handling.
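Why `list()` helps can be sketched in plain Python. This is an illustrative analogue, not TensorFlow's actual internals; `graph_context`, `predict`, and the `context_switches` list here are all hypothetical names chosen to mirror the error message:

```python
import contextlib

# Simulated context stack, loosely analogous to the internal
# context_switches list named in the error (hypothetical).
context_switches = []

@contextlib.contextmanager
def graph_context(name):
    context_switches.append(name)
    try:
        yield
    finally:
        context_switches.pop()

def predict(name, values):
    # Like Estimator.predict, this yields lazily while holding a
    # context open across every yield.
    with graph_context(name):
        for v in values:
            yield v

# Materializing the first generator with list() runs it to completion,
# so its context is fully popped before the second predict() starts.
model1_output = list(predict("model1", [1, 2, 3, 4]))
assert context_switches == []  # model1's context is closed here
model2_output = list(predict("model2", (v * 2 for v in model1_output)))
```

Without the `list()`, the second generator would start pulling values while the first generator's context was still on the stack, so the two contexts could be popped out of order.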
Nagging Assignee @ispirmustafa: It has been 46 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.
Nagging Assignee @ispirmustafa: It has been 61 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.
Estimator.predict yields predictions under a graph context. The way we handle graph contexts in TF makes yielding an issue; for example, the following code has the same error. I'll update the documentation of predict:
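The "yields under a graph context" behavior can be illustrated with a small pure-Python sketch (all names here are made up for illustration; this is not TensorFlow code):

```python
import contextlib

active = []  # which simulated contexts are currently entered

@contextlib.contextmanager
def graph(name):
    active.append(name)
    try:
        yield
    finally:
        active.remove(name)

def predict(values):
    # Yielding inside the `with` block suspends the generator while
    # the context is still entered.
    with graph("model1"):
        for v in values:
            yield v

gen = predict([1, 2])
first = next(gen)
# Between pulls, model1's context is still open -- anything the caller
# does here (e.g. starting another predict) runs "inside" it.
still_open = list(active)   # ["model1"]
rest = list(gen)            # exhausting the generator closes the context
```

This is why interleaving two such generators is fragile: the consumer's code runs while the producer's context is still active.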
Updated the documentation: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/estimator/estimator.py#L484
Hi, estimator.predict is too slow; I want to run prediction on some text with my model every 2 seconds. The estimator.predict() function always loads the model from the latest checkpoint. I want to load the model only once in estimator.predict to get fast predictions.
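One commonly suggested workaround, sketched here in plain Python with a simulated model (`predict`, `feed`, and `predict_once` are hypothetical names, and the "checkpoint load" is just a counter), is to keep a single `predict()` generator alive and push inputs to it through a queue, so the expensive load step runs only once:

```python
import queue

load_count = 0  # counts simulated checkpoint loads

def predict(input_iter):
    # Stand-in for Estimator.predict: "loads" the model once when the
    # generator starts, then yields one prediction per input.
    global load_count
    load_count += 1
    for x in input_iter:
        yield x * 2

q = queue.Queue()

def feed():
    # Blocks for the next input; a None sentinel ends the stream.
    while True:
        item = q.get()
        if item is None:
            return
        yield item

gen = predict(feed())  # created once, kept alive across requests

def predict_once(x):
    q.put(x)
    return next(gen)

results = [predict_once(i) for i in (1, 2, 3)]
q.put(None)  # lets the feed generator end cleanly
```

Because the generator is never recreated, the load happens exactly once no matter how many `predict_once` calls are made.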
System information
Describe the problem
Currently I am working on chaining predictions across two models (the outputs of model1 become the inputs of model2) with a large amount of input data. I tried to use tf.data.Dataset.from_generator, but it seems there are some problems with the context stack.
Source code / logs
Here is my example
Error log