"<tensor> is not an element of this graph." when loading model. #6462
Comments
I'm having a similar issue (I think) but it occurs when I call the Any suggestions? |
Change the backend to Theano; change it back when the issue is resolved. |
Hi silentsnooc, I'm facing a similar issue: I also need to load a Keras model from an h5 file at the same time as I generate a TF seq2seq model. When I load the models independently it works well; the problem appears when I try to integrate the two. Did you finally manage to solve this issue? Thanks in advance. |
I had a problem similar to that of @piraka9011 which was solved by calling |
Just got this error all of a sudden (it was working fine earlier today). |
Calling |
This worked for me:
While predicting, use the same graph
|
I had the same environment (Flask + TensorFlow + Keras; dev env: Windows 10 Pro) and the same error: ValueError: Tensor Tensor("avg_pool/AvgPool:0", shape=(?, 1, 1, 2048), dtype=float32) is not an element of this graph. This worked for me: make the graph global and, while predicting, use the same graph. Thanks to anujgupta82. |
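The fix described in the comments above (load once, remember the default graph, and re-enter it on every predict) can be sketched as follows. This is a minimal, hedged sketch: the loader and graph getter are injected stand-ins so the pattern reads stand-alone; with Keras on a TF 1.x backend you would pass `keras.models.load_model("model.h5")` and `tf.get_default_graph` (the path is hypothetical).

```python
# Sketch of the "capture the graph at load time, re-enter it while predicting"
# fix. Hedged: load_fn and get_default_graph_fn are injected stand-ins; with
# Keras on TF 1.x you would pass the real load_model and tf.get_default_graph.
import threading

class OneGraphPredictor:
    def __init__(self, load_fn, get_default_graph_fn):
        self.model = load_fn()                # load once, at startup
        self.graph = get_default_graph_fn()   # remember the graph it lives in
        self._lock = threading.Lock()         # Flask serves requests on many threads

    def predict(self, x):
        # Re-enter the load-time graph so the tensors the model references
        # are elements of the current default graph when predict runs.
        with self._lock, self.graph.as_default():
            return self.model.predict(x)
```

With real Keras, the constructor runs once at app startup and every Flask route calls `predictor.predict(...)`, which is exactly the "use the same graph while predicting" advice above.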
Solved it! I made a function that runs without the app.route() decorator, loaded the model inside that function, and then called the function whenever I need a prediction. |
Same issue! |
@minus31 I'm confused. What did you mean? Do you load the model every time you predict? PS. I use |
In my case, I have a flask endpoint that passes a key to a process_algorithm method. |
changing backend to theano worked for me |
@wqp89324 try calling keras.backend.clear_session() on each request |
@wqp89324 I had the same issue. Calling keras.backend.clear_session() each time before loading a model solved the problem. |
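The clear_session-before-load fix from the two comments above can be sketched like this, assuming Keras on a TF 1.x backend (the model path is hypothetical; the imports are deferred so the snippet reads as a stand-alone sketch):

```python
def load_fresh_model(path="app/model.h5"):
    """Clear stale Keras/TF state, then load, so the returned model lives on a
    fresh default graph. Hedged sketch: assumes Keras on a TF 1.x backend, and
    the path is made up for illustration."""
    from keras import backend as K        # deferred import: sketch only
    from keras.models import load_model
    K.clear_session()                     # drop the previous graph/session
    return load_model(path)
```

Note the trade-off raised later in this thread: clearing and reloading on every request is correct but slow, so this suits setups where models are swapped rarely.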
Calling model._make_predict_function() works perfectly within Python 3.6.
|
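A sketch of that pattern: call `_make_predict_function()` immediately after loading, while the load-time graph is still the default, so Keras builds its predict function eagerly instead of lazily inside a Flask worker thread. Hedged: this assumes Keras 2.x on a TF 1.x backend (`_make_predict_function` is a private Keras 2 method that no longer exists in modern Keras), and the path is hypothetical.

```python
def load_model_for_serving(path="model.h5"):
    # Hedged sketch: assumes Keras 2.x on a TF 1.x backend; the deferred
    # import keeps the snippet readable stand-alone.
    from keras.models import load_model
    model = load_model(path)
    model._make_predict_function()  # build the predict graph now, on this thread
    return model
```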
I am doing the same thing you showed, but I am still getting the same error. The first time I get the output, but when I click the link a second time it gives the error. Can you help me please? |
I faced the same issue when I was passing an instance of a loaded model to another thread which was predicting. I changed my execution so that the model loads in the same thread as the one that predicts. I am not sure if this solves anyone's problem, but I hope it helps someone. |
from flask import Flask, request @app.route("/")
if __name__ == "__main__": This is the code... Can you help me? |
@yashmehta14 I was having a similar problem. I fixed it by adding a function which can be called later
|
@joaospinto hero! |
This has fixed it for me as well. What does it do? :) |
|
Ran into the same problem when trying to deploy a CNN to production as a Flask app. To summarize, there seem to be two fixes that worked for me:

1. Switch to a Theano backend.
2. Add the line model._make_predict_function() right after you have loaded your model. |
Thanks sir, it works for me after struggling the whole day looking for an answer. |
I had the same issue while using Flask + Keras. Using with graph.as_default(): worked for me. |
I had the same problem when using ResNet50. Solved the same as @Mrjaggu, with input from @anujgupta82:
|
Thanks to @Mrjaggu , @anujgupta82 , @GerardWalsh and others! |
This works for me |
using
|
@gustavz I have the same problem, could you solve it? |
Works with a clean session before the model load and predict:

from keras import backend as K
from keras.models import load_model
from scipy.io import wavfile as wav
import numpy as np

    @api.expect()
    def post(self):
        K.clear_session()
        loaded_model_f32 = load_model("app/Utils/model.h5")
        file = request.files['file']
        file.save("tmp.wav")
        data = wav.read('tmp.wav')[1]
        input_data = np.zeros((1, 50213))
        size = min(len(data), 50213)  # clip the audio to the model's input length
        input_data[0][:size] = data[:size]
        predicted = loaded_model_f32.predict(input_data)
        result = {"result":
                  {
                      "time_to_load_model": time_load_model,        # timing vars set elsewhere
                      "time_to_predict": time_predict_pattern,
                      "classe": CLASSES[np.argmax(predicted)]
                  }
                  }
        return result |
If it works within a REPL but not in Flask, try running the Flask app with gunicorn and set the mode to |
works perfect |
thank you! Your suggestion was a godsend. I had no idea what was causing the error |
Hello, thank you for the answer, but I am facing this error when using this function: |
Guys, if you are getting this error in a Flask environment, the solution is simple: create a function for the prediction and load the model in that function instead of declaring it globally. It will solve your error. |
@Spidy20 I don't think that is a good solution because it will require the model to be reloaded every time the predict API is called. For the record, I ran into an issue with |
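One way to reconcile the two points above, load inside the request path (so the graph error is avoided) but only once (so the reload cost @shivasuri mentions is paid a single time), is a lazily cached loader. A plain-Python sketch; `loader` is an injected stand-in for the real keras.models.load_model call:

```python
# Hedged sketch: lazy, thread-safe, load-once model cache. `loader` stands in
# for whatever actually loads the model (e.g. keras.models.load_model).
import threading

_model = None
_model_lock = threading.Lock()

def get_model(loader):
    global _model
    if _model is None:
        with _model_lock:        # double-checked so only one thread loads
            if _model is None:
                _model = loader()
    return _model
```

Every request handler calls `get_model(...)`; the first call loads, every later call returns the cached model without touching the loader again.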
Solve it by: Instead of: Use: if __name__ == '__main__': |
@YatinChachra Thanks for it, Actually it is the perfect solution. Yes, you are right @shivasuri my solution was temporary to solve the error. But this solution is perfect. Thanks for correcting me:) |
I was having the same problem as piraka9011; the error occurred when the predict function was called. But it works fine for me when I use piraka9011's single line of code. |
It solved it! But it's too slow: time 38.32983899116516 seconds. |
The TensorFlow version is tf1.X. Then, to predict with the model, you can:
|
Update:
I am using the model which I am loading below just to generate features for another model. Could it be that the problem is because I am trying to load multiple models in one session?
When I load the model in word2vec_25_tar-category.h5.zip using load_model in one of my modules, I get the error above. However, I do not get this in all modules.
This is how I load it:
These are the custom objects:
Why can I use the model in one module but not in another? I am pretty helpless because I don't see what the issue is. I definitely load the same file.
What's wrong here?