Using retrain.py to export model for ML Engine #118
Labels
hub
stat:awaiting response
subtype:Image Retraining
type:support
Hi,
I've used retrain.py with the --saved_model_dir flag to export a model, which I've then deployed to ML Engine.
When trying to use it for prediction, I get an error saying that it expects a float32 input but receives a string:
{"error": "Prediction failed: Error processing input: Expected float32, got '\\xff\\xd8\\xff\\xe0 ...' of type 'str' instead."}
My request looks like this:
{"image": {"b64": "/9j/4AAQxh6AP/2Q== ..."}}
Am I missing something? Is the export method from retrain.py not meant for exporting to ML Engine? Can I modify it to do so?
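For context, the mismatch is that ML Engine delivers the `{"b64": ...}` value as raw JPEG bytes (a string tensor), while the graph exported by retrain.py starts from an already-decoded float32 image tensor. One common workaround is to add a serving input layer that decodes the bytes before they reach the model. The sketch below is an assumption about how that decoding step could look (TensorFlow-style, with a hypothetical `IMAGE_SIZE` matching the module's expected input, e.g. 299 for Inception V3); it is not the actual code from retrain.py:

```python
import tensorflow as tf

tf1 = tf.compat.v1  # retrain.py is a TF 1.x-era script

IMAGE_SIZE = 299  # assumption: input size of the retrained module


def build_serving_graph():
    """Build input ops that turn base64-decoded JPEG bytes (what ML Engine
    sends for {"image": {"b64": ...}}) into the float32 batch the
    retrained graph expects."""
    # ML Engine hands the request payload over as a batch of byte strings.
    image_bytes = tf1.placeholder(tf.string, shape=[None], name='image')

    def decode(jpeg_bytes):
        img = tf.image.decode_jpeg(jpeg_bytes, channels=3)
        # Convert uint8 [0, 255] to float32 [0, 1], as hub image modules expect.
        img = tf.image.convert_image_dtype(img, tf.float32)
        img = tf.image.resize(tf.expand_dims(img, 0), [IMAGE_SIZE, IMAGE_SIZE])
        return tf.squeeze(img, [0])

    # Decode each element of the batch independently.
    images = tf.map_fn(decode, image_bytes, dtype=tf.float32)
    return image_bytes, images
```

The `images` tensor would then be wired into the retrained graph in place of its original float32 placeholder before calling the SavedModel export, so the exported signature takes a string input named `image` matching the request JSON.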