
Multiple output model Prediction #5331

Closed
fouadb66 opened this issue Feb 8, 2017 · 6 comments

Comments

@fouadb66

fouadb66 commented Feb 8, 2017

I designed a multi-output model with 3 losses. Here are the outputs of my model:

output1 = Dense(outoutClaas, activation='softmax')(BT1)
output2 = Dense(outclassL12, activation='softmax')(BT1)
output3 = Dense(outclassL3, activation='softmax')(BT1)
model = Model(input=inputs, output=[output1, output2, output3])

The model trains well, but I am getting errors during prediction:

model.predict(input)
ValueError: Tensor Tensor("Softmax:0", shape=(?, 7201), dtype=float32) is not an element of this graph.

7201 is the number of classes of the first output.
My question is: how can I get the predictions from output1, output2, and output3 separately?

Thank you for your help...

@unrealwill

Hello,
Your model should work and predict the 3 outputs simultaneously.
If you want, you can create 3 different models which share the same weights, but not necessarily the same inputs:

... your previous code ...
model1 = Model(input=[inputsfor1], output=[output1])
model2 = Model(input=[inputsfor2], output=[output2])
model3 = Model(input=[inputsfor3], output=[output3])
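In the multi-output case, model.predict returns a list with one array per output, in the same order the outputs were passed to Model, so the three predictions can be unpacked directly. A minimal sketch, assuming a hypothetical test batch x_test with whatever shape the model's input expects:

import numpy as np

# x_test is a hypothetical test batch shaped like the model's input
x_test = np.random.random((32, 100))  # assumed input shape (batch, 100)

# predict returns a list with one array per output, in the order given to Model
preds1, preds2, preds3 = model.predict(x_test)
print(preds1.shape)  # e.g. (32, 7201) -- class probabilities for the first output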

@nijianmo

Hi @fouadb66, have you solved your question?

I am not sure how you train your models. Since these 3 models share the same layers, I think training them separately will not lead to a global optimum.
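For joint training, the usual approach is to compile the single multi-output model with one loss per output (and optionally loss weights) and fit it on all three targets at once. A minimal sketch, with hypothetical arrays x_train, y1, y2, y3 (each y_i one-hot encoded with as many columns as that output has classes):

# Joint training of the multi-output model: one loss per output
model.compile(optimizer='adam',
              loss=['categorical_crossentropy',
                    'categorical_crossentropy',
                    'categorical_crossentropy'],
              loss_weights=[1.0, 1.0, 1.0])

# targets are passed as a list, in the same order as the model's outputs
model.fit(x_train, [y1, y2, y3], batch_size=32)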

@shalabhsingh

@fchollet This is still not solved. Any alternative method to predict the outputs in such a case? Thanks in advance.

@fouadb66 Did you find any solution to this problem?

@shalabhsingh

This issue has been handled in #2397. The problem is not with multi-output prediction; it occurs when doing inference in a different thread from the one where you loaded the model.

Hope it helps.

@cshenton

cshenton commented Jul 23, 2017

For reference, since there are two components to the solution.

  1. Calling predict from another thread requires pre-compiling the predict function:
from keras.models import Model

# in the main thread
my_model = Model(inputs = [inputs], outputs = [outputs])
my_model._make_predict_function()
# my_model.compile(... if you're doing training as well

# in another thread
my_model.predict(inputs) # should work now
  2. Calling predict in the context of a callback
    This could be either with asyncio using loop.run_in_executor, or with concurrent.futures directly. Either way, if you only do the above, you'll get:
RuntimeError: The Session graph is empty.  Add operations to the graph before calling run().

So you'll need to manually carry around a copy of the TensorFlow graph that Keras creates and use it as context when you do inference in a thread:

import tensorflow as tf
from keras.models import Model

# in the main thread
my_model = Model(inputs = [inputs], outputs = [outputs])
my_model._make_predict_function()
# my_model.compile(... if you're doing training as well
graph = tf.get_default_graph()

# in another thread
with graph.as_default():
    my_model.predict(inputs) # should work now

It's a pain, and it really should be handled by Keras, but at least it works. For example, you can serve your Keras model with an asynchronous web server to handle requests and a thread pool to do the actual inference.
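A minimal end-to-end sketch of the pattern above, assuming a hypothetical saved model file my_model.h5 with an input of shape (100,); the graph captured in the main thread is re-entered inside the worker thread:

import numpy as np
import tensorflow as tf
from concurrent.futures import ThreadPoolExecutor
from keras.models import load_model

# main thread: load the model, pre-compile predict, and grab the graph
my_model = load_model('my_model.h5')  # hypothetical saved model
my_model._make_predict_function()
graph = tf.get_default_graph()

def predict_in_worker(x):
    # worker thread: re-enter the graph captured in the main thread
    with graph.as_default():
        return my_model.predict(x)

x = np.random.random((1, 100))  # hypothetical input batch
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(predict_in_worker, x)
    print(future.result())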

@stale

stale bot commented Oct 21, 2017

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.
