
Regarding Deployment on Flask #37

Closed
ianuragbhatt opened this issue Dec 15, 2019 · 13 comments
Labels
user question Further information is requested

Comments

@ianuragbhatt

Hi, I have an issue regarding deployment: I am not able to deploy a ktrain multiclass text classification model. I tried to load the model and the .preproc file, but it does not work.

@Bidek56

Bidek56 commented Dec 18, 2019

Any specific error messages?

@amaiya
Owner

amaiya commented Dec 20, 2019

For those of you who are trying to serve a ktrain model with Flask:

It looks like this is an issue with Flask/TensorFlow, not ktrain. The latest version of Flask causes a "Session graph is empty" error when trying to serve a TensorFlow model on TensorFlow 1.14. See this Keras issue for more information.

It apparently works in TensorFlow 2.0. However, pre-v0.8 versions of ktrain still run in TensorFlow 1.x compatibility mode even on TensorFlow 2, in order to support both TF 1.14 and TF 2.0. This is why you see this error on both TF 1.14 and TF 2.0 when using ktrain.

This will no longer be a problem in ktrain v0.8 (which has not yet been released) because this version of ktrain will only support TensorFlow 2 (not TensorFlow 1.14).

For now, the workaround is to downgrade Flask with pip3 install flask==0.12.2. After doing this, you should be able to use Flask to serve a Keras model or ktrain predictor. For instance, I've verified that the following toy example works:

# file name:  my_server.py
import flask
import ktrain

app = flask.Flask(__name__)
predictor = None

def load_predictor():
    global predictor
    # load a ktrain predictor previously saved with predictor.save('/tmp/mypred')
    predictor = ktrain.load_predictor('/tmp/mypred')
    if hasattr(predictor.model, '_make_predict_function'):
        predictor.model._make_predict_function()

@app.route('/predict', methods=['GET'])
def predict():
    data = {"success": False}
    if flask.request.method == "GET":
        text = flask.request.args.get('text')
        if text is None:
            return flask.jsonify(data)
        data['prediction'] = predictor.predict(text)
        data["success"] = True
    return flask.jsonify(data)

if __name__ == "__main__":
    load_predictor()
    port = 8888
    app.run(host='0.0.0.0', port=port)

After starting the server with python3 my_server.py, you can issue a prediction request to the server by opening your browser and typing: http://0.0.0.0:8888/predict?text=great%20movie

If the model was trained on IMDB, this should display the following in the browser:

{
  "prediction": "pos",
  "success": true
}
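As a side note, the %20 in the request URL is just the percent-encoding of the space in "great movie". If you build request URLs programmatically rather than typing them in a browser, the Python standard library can do the encoding for you; a minimal sketch using the host and port from the example above:

```python
from urllib.parse import urlencode, quote

# Percent-encode the query parameter instead of hand-writing %20.
# quote_via=quote encodes spaces as %20 (the default would use '+').
base_url = "http://0.0.0.0:8888/predict"
query = urlencode({"text": "great movie"}, quote_via=quote)
url = f"{base_url}?{query}"
print(url)  # http://0.0.0.0:8888/predict?text=great%20movie
```

You could then pass this URL to any HTTP client to hit the /predict endpoint.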

@Bidek56

Bidek56 commented Dec 20, 2019

It's a good suggestion, but it does not work for me. I get:
ValueError: Tensor Tensor("dense/Softmax:0", shape=(?, 2), dtype=float32) is not an element of this graph.

@Bidek56

Bidek56 commented Dec 20, 2019

Has anyone tried using TensorFlow Serving with ktrain?

@amaiya
Owner

amaiya commented Dec 20, 2019

I recall seeing that error as well, but I think it went away after making sure the version of TensorFlow used to train the model (TensorFlow 2.0) was the same as the version used to serve it (TensorFlow 2.0). If you're using TF 1.14/1.15, maybe also try upgrading to TF 2.0. Also, make sure there are no calls to standalone Keras anywhere in your code. The example above works with TF 2.0 and ktrain v0.7.2.
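One way to catch the train/serve version mismatch early is to record the TensorFlow version string at training time and compare it at load time. This is a hypothetical helper (ktrain does not store this for you; the function name and the idea of saving the version string alongside the model are assumptions):

```python
def tf_versions_compatible(trained_with: str, serving_with: str) -> bool:
    """Return True if two TensorFlow version strings share major.minor.

    Patch releases (e.g. 2.0.0 vs 2.0.1) are treated as compatible;
    crossing 1.x/2.x or a minor-version boundary is not.
    """
    return trained_with.split(".")[:2] == serving_with.split(".")[:2]

print(tf_versions_compatible("2.0.0", "2.0.1"))   # True
print(tf_versions_compatible("1.14.0", "2.0.0"))  # False
```

At serving time you would compare the saved string against tf.__version__ and fail fast with a clear message instead of a cryptic graph error.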

See also this and this

@Bidek56

Bidek56 commented Dec 20, 2019

I am using: tensorflow==2.0.0 and still get the error.

@amaiya
Owner

amaiya commented Dec 20, 2019

See also this and this for other possible solutions.

@Bidek56

Bidek56 commented Dec 20, 2019

It turns out that DEBUG = True in the code above was causing the error. Thanks for your help!!
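For anyone hitting the same thing: a plausible explanation is that Flask's debug mode starts an auto-reloader that re-imports the server module in a child process, so the model can end up bound to a different TensorFlow graph/session than the one handling requests. A minimal sketch of run options that avoid this (the RUN_OPTS name is just for illustration; debug and use_reloader are real app.run() keyword arguments):

```python
# Options for app.run() when serving a TensorFlow/ktrain model.
# Keeping the debugger and reloader off avoids the module being
# re-imported in a second process with a fresh (empty) TF graph.
RUN_OPTS = {
    "host": "0.0.0.0",
    "port": 8888,
    "debug": False,         # no interactive debugger
    "use_reloader": False,  # no child process re-importing the model
}

# In my_server.py:  app.run(**RUN_OPTS)
```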

@amaiya
Owner

amaiya commented Dec 20, 2019

Great. I've edited the example above to include this and also removed debugging mode.

@ianuragbhatt
Author

Hi guys, thanks for the responses. I have already deployed a ktrain model by converting it into h5 and json format; you can check this Notebook, and if you have any suggestions for improvement, please let us know.

@amaiya amaiya added the user question Further information is requested label Dec 21, 2019
@amaiya amaiya closed this as completed Dec 21, 2019
@Bidek56

Bidek56 commented Dec 23, 2019

Would anyone know why Flask==1.1.1 throws
ValueError: Tensor Tensor("dense/Softmax:0", shape=(?, 2), dtype=float32) is not an element of this graph. but it works with Flask==0.12.2? Thx

@bulioses

bulioses commented Apr 9, 2021

Thank you.
How do I load and use a trained model (.h5 and .preproc files) for facial recognition with ktrain?

@fabulouz1

I just discovered ktrain and it is so unique!
Has anyone tried to deploy image classification model with ktrain on flask?
If yes, please help with the code.
