
Convert Keras model to TensorFlow #3223

Closed
JozoVilcek opened this issue Jul 14, 2016 · 49 comments

Comments

@JozoVilcek

We have models produced in Keras by our researchers. For production deployment, we want to run pure TensorFlow.
How is it possible to convert a Keras model to TensorFlow? I understand that Keras must be doing this internally, since it supports the TensorFlow backend.

Thanks!

@fchollet
Member

When you are using the TensorFlow backend, your Keras code is actually building a TF graph. You can just grab this graph.

Keras only uses one graph and one session. You can access the session via: K.get_session(). The graph associated with it would then be: K.get_session().graph.
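For illustration, a minimal sketch of that idea, assuming the TensorFlow backend; the model file name below is just an example:

import keras.backend as K
from keras.models import load_model

model = load_model('model.h5')  # building the Keras model also builds the TF graph

sess = K.get_session()          # the single TF session Keras uses
graph = sess.graph              # the tf.Graph holding the model's ops

# the Keras input/output tensors are ordinary TF tensors in that graph
print(model.inputs[0].name, model.outputs[0].name)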

@fchollet
Member

Also this should be relevant to you: http://blog.keras.io/keras-as-a-simplified-interface-to-tensorflow-tutorial.html

@GeorgianaPetria

I have a CNN model built with Keras and I am trying to run it on Android. In order to do this, I need to convert my model to TF.
I've been struggling for the last few days to find the right documentation. Based on your earlier suggestion, I'm trying TensorFlow Serving, but I find it rather complicated and I still don't know if I'm doing the right thing. I'm dealing with problems like where to put my project inside the tensorflow-serving hierarchy (I normally use Docker with TF), or why it takes forever to build with Bazel...
Do you know if there's any step-by-step tutorial for dummies on how to convert Keras models to TF?

Thanks!

@petrtsatsin

The above tutorial is outdated. I made it work by replacing
from tensorflow_serving.session_bundle import exporter
with from tensorflow.contrib.session_bundle import exporter. However, it still didn't solve my problem. I need to save the Keras model in the tf.SavedModel format. Any suggestions?

@amir-abdi

Here is my sample code:
https://github.com/amir-abdi/keras_to_tensorflow

Just set the parameters, and run.

@philnguyenresson

Nice job Amir, I haven't gotten around to trying your method yet.

I'm currently trying to use tf.SavedModel as mentioned above. Saving the meta-graph seems to work fine, but most tutorials stop at the loading part. We save the information about inputs/outputs inside the signature_def of the .pb, however I'm unsure how to read that information back out of the signature_def.
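For what it's worth, a rough sketch of reading the signature back in TF 1.x; it assumes the SavedModel was exported with the default serving signature, and the export directory and the 'input'/'output' keys below are only illustrative:

import numpy as np
import tensorflow as tf
from tensorflow.python.saved_model import tag_constants, signature_constants

export_dir = './export/1'  # hypothetical path to the SavedModel directory

with tf.Session(graph=tf.Graph()) as sess:
    # loader.load returns the MetaGraphDef, which carries the signature_def map
    meta_graph_def = tf.saved_model.loader.load(sess, [tag_constants.SERVING], export_dir)
    sig = meta_graph_def.signature_def[signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]

    # the TensorInfo entries store the tensor names recorded at export time;
    # the keys depend on how the signature was built when exporting
    input_name = sig.inputs['input'].name
    output_name = sig.outputs['output'].name

    dummy_batch = np.zeros((1, 10), dtype='float32')  # placeholder; shape depends on the model
    result = sess.run(output_name, feed_dict={input_name: dummy_batch})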

@avspavan

avspavan commented Jul 6, 2017

Any suggestions on converting a TF model to Keras? Thanks

@jerrypaytm

Does anyone have an example of the reverse direction, TensorFlow to Keras?

@VajiraPrabuddhaka

@amir-abdi Thank you for your post. I tried your script and successfully created a protobuf model. Can you tell me whether it is possible to use this model on an Android device with TF Detect, as described here?

@amir-abdi

@VajiraPrabuddhaka I have successfully deployed the converted TF models on Android. So, yes, the models work on Android.

@VajiraPrabuddhaka

@amir-abdi did you test it with the TF Detect example here?

@amir-abdi

@VajiraPrabuddhaka No, I have not.

@sandys

sandys commented Sep 10, 2017

@VajiraPrabuddhaka @amir-abdi Just wanted to reconfirm: are you able to seamlessly export Keras models and use them in the Android TensorFlow examples? We are debating between TensorFlow and Keras, and this would be very helpful.

@jacknlliu

A very interesting issue! I think there should be example code for this here.

@VajiraPrabuddhaka

@sandys No, it is not. I'm still looking into it. I think this may be helpful; give it a try.

@amir-abdi

@sandys I can confirm that I have converted several Keras models to TensorFlow models using this code and deployed the models on an Android phone.
I have not checked the TensorFlow example code you are referring to.

@anilmaddala

@amir-abdi I tried your sample code. The generated .pb file is valid and I am able to load it into my Android application. However, the results from TensorflowInferenceInterface seem to be wrong; they don't match my Keras testing results.

@amir-abdi

@anilmaddala I wouldn't know what the source of the error might be. However, I can confirm that we have used the same code to convert our models to TensorFlow, deployed them on Android, and the results were OK. Our data are medical images.

@anilmaddala

@amir-abdi I re-did my training and app, and your process works successfully for an MNIST CNN trained in Keras. I am able to port Keras -> Tensorflow -> Android and classify input digits.

However, I am having trouble with LSTM/GRU models; any idea what needs to change? Thanks

@amir-abdi

@anilmaddala My models have LSTMs and they are converted with no problem using the same code (https://github.com/amir-abdi/keras_to_tensorflow).

My only hypothesis is that maybe your models were not generated using the latest version of the library. Try updating your libraries.

@anil80

anil80 commented Dec 15, 2017

@amir-abdi I tried using your script to convert my Keras .h5 model to a TensorFlow .pb file, but it does not seem to work. The script generates a .pb file, but the printout of output nodes does not show any nodes at all, which seems to suggest that the TensorFlow model conversion did not go through correctly. Below is the full output of the script:

UserWarning: No training configuration found in save file: the model was not compiled. Compile it manually.
warnings.warn('No training configuration found in save file: '
('output nodes names are: ', ['output_node0'])
INFO:tensorflow:Froze 378 variables.
Converted 378 variables to const ops.
('saved the freezed graph (ready for inference) at: ', './inception_v3_coreml.pb')

I have already tested the Keras model by calling the model.predict() function and it works, so I know that the .h5 file is correct.

The versions I am using are -
Keras version = 2.0.0
Tensorflow version = 1.4.1

Any help will be greatly appreciated.

@amir-abdi

@anil80 Is your problem resolved?
Check the new version of my code; hope it helps.
Or let me know if the problem persists.

@adityabansal123

Hi @amir-abdi, I tried your script but I couldn't get it to work.

@amir-abdi

@adityabansal123 Are you talking about my code here: https://github.com/amir-abdi/keras_to_tensorflow ?
Can you elaborate? What went wrong in the conversion?
And what versions of tensorflow and keras libraries are you using?

@trangtv57

trangtv57 commented Feb 5, 2018

Hi @amir-abdi, I tried to use your code, but it failed with the error:
No model found in config file
(screenshot of the error omitted)
I trained with Keras 2.1.2 and TensorFlow 1.4.1.

@amir-abdi

amir-abdi commented Feb 5, 2018

@trangtv57
This is already answered in the issues of my GitHub repo. See this and let me know if the problem persists.
amir-abdi/keras_to_tensorflow#18 (comment)

@trangtv57

trangtv57 commented Feb 6, 2018

I already have the model file and the weight file, and I can still load the trained model for prediction with load_model. However, when I try to use your code, @amir-abdi, it shows this error:

File "/usr/local/lib/python3.5/dist-packages/keras/models.py", line 234, in load_model
with h5py.File(filepath, mode='r') as f:
File "/usr/local/lib/python3.5/dist-packages/h5py/_hl/files.py", line 271, in __init__
fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
File "/usr/local/lib/python3.5/dist-packages/h5py/_hl/files.py", line 101, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (/tmp/pip-huypgcah-build/h5py/_objects.c:2840)
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (/tmp/pip-huypgcah-build/h5py/_objects.c:2798)
File "h5py/h5f.pyx", line 78, in h5py.h5f.open (/tmp/pip-huypgcah-build/h5py/h5f.c:2117)
OSError: Unable to open file (File signature not found)

@trangtv57

Thank you @amir-abdi,
I now know how to load a model from the weights and the model JSON in Keras, so I am done with this task. Thank you again.

@mengxiabing

@trangtv57 How did you solve it? I ran into the same error.

@trangtv57

@mengxiabing You should show me the error details.

@mengxiabing

@trangtv57 Please help me analyze this:
D:\ai\keras_to_tensorflow-master>python keras_to_tensorflow.py -input_model_file 'model.h5' -output_model_file 'model.pb'

usage: keras_to_tensorflow.py [-h] [-input_fld INPUT_FLD] [-output_fld OUTPUT_FLD]
                              [-input_model_file INPUT_MODEL_FILE]
                              [-output_model_file OUTPUT_MODEL_FILE]
                              [-output_graphdef_file OUTPUT_GRAPHDEF_FILE]
                              [-num_outputs NUM_OUTPUTS] [-graph_def GRAPH_DEF]
                              [-output_node_prefix OUTPUT_NODE_PREFIX]
                              [-quantize QUANTIZE] [-theano_backend THEANO_BACKEND] [-f F]

set input arguments

optional arguments:
  -h, --help  show this help message and exit
  -input_fld INPUT_FLD
  -output_fld OUTPUT_FLD
  -input_model_file INPUT_MODEL_FILE
  -output_model_file OUTPUT_MODEL_FILE
  -output_graphdef_file OUTPUT_GRAPHDEF_FILE
  -num_outputs NUM_OUTPUTS
  -graph_def GRAPH_DEF
  -output_node_prefix OUTPUT_NODE_PREFIX
  -quantize QUANTIZE
  -theano_backend THEANO_BACKEND
  -f F

input args: Namespace(f=None, graph_def=False, input_fld='.', input_model_file="'model.h5'", num_outputs=1, output_fld='', output_graphdef_file='model.ascii', output_model_file="'model.pb'", output_node_prefix='output_node', quantize=False, theano_backend=False)

Using TensorFlow backend.
Traceback (most recent call last):
  File "keras_to_tensorflow.py", line 114, in <module>
    net_model = load_model(weight_file_path)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\models.py", line 237, in load_model
    with h5py.File(filepath, mode='r') as f:
  File "D:\Program Files\Anaconda3\lib\site-packages\h5py\_hl\files.py", line 272, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, swmr=swmr)
  File "D:\Program Files\Anaconda3\lib\site-packages\h5py\_hl\files.py", line 92, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (C:\Minonda\conda-bld\h5py_1474482825505\work\h5py\_objects.c:2705)
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (C:\Minonda\conda-bld\h5py_1474482825505\work\h5py\_objects.c:2663)
  File "h5py\h5f.pyx", line 76, in h5py.h5f.open (C:\Minonda\conda-bld\h5py_1474482825505\work\h5py\h5f.c:1951)
OSError: Unable to open file (Unable to open file: name = ''model.h5'', errno = 2, error message = 'no such file or directory', flags = 0, o_flags = 0)

@trangtv57

For model.h5, you should provide the absolute path of the file, like /home/yourname/name_of_file, because this error shows that the file-reading function cannot find a file matching the string 'model.h5'.

@amir-abdi

As @trangtv57 mentioned, you either need to set the input folder via the input_fld argument, or run the script in the directory hosting the model file.
Also, make sure to set the model file name correctly using the input_model_file argument.

@mengxiabing

@trangtv57 Thank you. Now I get a new error; it seems the custom layer cannot be converted:
Using TensorFlow backend.
Input file specified (model.h5) only holds the weights, and not the model definition.
Save the model using model.save(filename.h5) which will contain the network architecture as well as its weights.
If the model is saved using model.save_weights(filename.h5), the model architecture is expected to be saved separately in a json format and loaded prior to loading the weights.
Check the keras documentation for more details (https://keras.io/getting-started/faq/)
Traceback (most recent call last):
  File "keras_to_tensorflow.py", line 123, in <module>
    raise err
  File "keras_to_tensorflow.py", line 114, in <module>
    net_model = load_model(weight_file_path)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\models.py", line 243, in load_model
    model = model_from_config(model_config, custom_objects=custom_objects)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\models.py", line 317, in model_from_config
    return layer_module.deserialize(config, custom_objects=custom_objects)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\layers\__init__.py", line 55, in deserialize
    printable_module_name='layer')
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\utils\generic_utils.py", line 143, in deserialize_keras_object
    list(custom_objects.items())))
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2507, in from_config
    process_layer(layer_data)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\engine\topology.py", line 2493, in process_layer
    custom_objects=custom_objects)
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\layers\__init__.py", line 55, in deserialize
    printable_module_name='layer')
  File "D:\Program Files\Anaconda3\lib\site-packages\keras\utils\generic_utils.py", line 137, in deserialize_keras_object
    ': ' + class_name)
ValueError: Unknown layer: BatchNorm

@trangtv57

trangtv57 commented Mar 15, 2018

I wonder whether you saved your model as two files (model config and weights) or as a single .h5 file holding only the weights; the error you posted makes it hard to tell.
If you saved the model as two separate files, model config and weights, you should load the model with model_from_json in Keras rather than loading everything from a single file as you are doing, as sketched below.
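A rough sketch of that two-file loading path (the file names below are only illustrative):

from keras.models import model_from_json

# hypothetical file names for the config-plus-weights case
with open('model.json') as f:
    model = model_from_json(f.read())
model.load_weights('weights.h5')

# optionally re-save as a single .h5 so tools that call load_model can read it
model.save('full_model.h5')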

@yfarjoun

It doesn't work out of the box for Python 2.7, but the code is there and that's enough for me! Thanks!!

@gustavz

gustavz commented May 7, 2018

@yfarjoun could you share your version that's working for Python 2.7?

@yfarjoun

yfarjoun commented May 9, 2018

I don't have this code in 2.7, but the heart of the matter is:

    import keras as K

    from tensorflow.python.framework import graph_util
    from tensorflow.python.framework import graph_io

    from keras.models import Model, load_model

    sess = K.get_session()
    model = load_model("model.hd5")

    # names of the output nodes to keep when freezing, e.g. one per model output
    pred_node_names = [out.op.name for out in model.outputs]

    constant_graph = graph_util.convert_variables_to_constants(sess, sess.graph.as_graph_def(), pred_node_names)

    output = "graph.pb"
    graph_io.write_graph(constant_graph, "/", output, as_text=False)
    print('saved the freezed graph (ready for inference) at: ', output)

@dabercro

@yfarjoun Thanks! I just had to make the minor change of K.get_session() to K.backend.get_session().

@vikitorentino

The above-mentioned methods work with minor changes, but can you suggest a way to convert the .pb back into a .h5 model again?

@hpeiyan

hpeiyan commented Sep 7, 2018

I found it's still hard to convert a Keras or TensorFlow model to the TensorFlow Lite format. There aren't any tutorials about this.
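For reference, newer TF 1.x releases ship a converter API that may cover the Keras-to-TFLite case (older releases had a similar class under tf.contrib.lite); a minimal sketch, with illustrative file names:

import tensorflow as tf

# convert a Keras HDF5 model straight to a TFLite flatbuffer
converter = tf.lite.TFLiteConverter.from_keras_model_file('model.h5')
tflite_model = converter.convert()

with open('model.tflite', 'wb') as f:
    f.write(tflite_model)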

@rafiqhasan

I don't know if this is still useful for anyone, but I referred to standard code provided by Google in the CloudML samples repo for a similar requirement, where they convert a Keras HDF5 model into a TensorFlow Serving model. You can try to reuse it:

https://github.com/GoogleCloudPlatform/cloudml-samples/blob/master/census/keras/trainer/model.py

# imports used by the sample (TF 1.x)
from keras import backend as K
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants, signature_constants
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

def to_savedmodel(model, export_path):
  """Convert the Keras HDF5 model into TensorFlow SavedModel."""

  builder = saved_model_builder.SavedModelBuilder(export_path)

  signature = predict_signature_def(inputs={'input': model.inputs[0]},
                                    outputs={'income': model.outputs[0]})

  with K.get_session() as sess:
    builder.add_meta_graph_and_variables(
        sess=sess,
        tags=[tag_constants.SERVING],
        signature_def_map={
            signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature}
    )
    builder.save()
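A minimal usage sketch, assuming a Keras model saved to disk; the paths below are only examples:

from keras.models import load_model

model = load_model('model.h5')
to_savedmodel(model, export_path='./export/1')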

@sanketwagh06

Here is my sample code:
https://github.com/amir-abdi/keras_to_tensorflow

Just set the parameters, and run.

@amir-abdi Can you please provide the versions of Keras and TensorFlow you used? I am facing the following issue using your repo:

ImportError: cannot import name 'relu6'

@amir-abdi

amir-abdi commented Oct 11, 2018

Firstly, check out the updated keras_to_tensorflow tool.

Moreover, the model you are trying to convert (I assume it is something similar to MobileNet) defines custom layers which are not being recognized by your Keras library. Check this thread.

You can introduce the custom layers to Keras so that your model loads, like this:

import keras
from keras.models import load_model
from keras.utils.generic_utils import CustomObjectScope

with CustomObjectScope({'relu6': keras.applications.mobilenet.relu6,
                        'DepthwiseConv2D': keras.applications.mobilenet.DepthwiseConv2D}):
    model = load_model('weights.hdf5')

@RidaFatima95

RidaFatima95 commented Nov 23, 2018

Good work @amir-abdi. I need help with the SignatureDefs part. I have to deploy my code on Google Cloud; it basically consists of two LSTM models working asynchronously to predict a sentence as output. I built my model in Keras and converted my HDF5 file into a .pb using @amir-abdi's tool. Now I want to know how I can access my SignatureDef parameters, like the inputs, outputs, and model, as mentioned here.
In order to load my model using saved_model.loader like this:

with tf.Session(graph=tf.Graph()) as sess:
  tf.saved_model.loader.load(sess, ["foo-tag"], export_dir)
  ...

I need to first save it using:

builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

with tf.Session(graph=tf.Graph()) as sess:
  builder.add_meta_graph_and_variables(sess,
                                       ["foo-tag"],
                                       signature_def_map=foo_signatures,
                                       assets_collection=foo_assets)

with tf.Session(graph=tf.Graph()) as sess:
  builder.add_meta_graph(["bar-tag", "baz-tag"],
                         assets_collection=bar_baz_assets)

builder.save()

as given here.
But the saving part is already done by your tool, so back to my question: how can I access my SignatureDef parameters?
Kindly help me with this. Also, could someone explain SignatureDefs and their link with the protobuf (.pb) file in simple words?
Thank you!

@ysyyork

ysyyork commented Dec 21, 2018

I know there are a lot of scripts online that can easily convert a Keras model to a TF model, but I'm wondering why the Keras team doesn't want to include this utility function in Keras itself, so that people don't need to look on SO or GitHub to find solutions. Is there any reason for not doing that? If not, I could probably contribute it directly. Let me know. Thanks!

@manupillai308

Got here to search for a solution but fortunately found one myself.

import keras as K
import tensorflow as tf

model_path = '.' # provide the path to .h5 file
model = K.models.load_model(model_path)
sess = K.backend.get_session()
graph = sess.graph
saver = tf.train.Saver()
tensorflow_ckpt_path = '.' # provide the path to save the model
saver.save(sess, tensorflow_ckpt_path)

This seems to work for me.
Hope this helps you also.
😄
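A rough sketch of loading that checkpoint back in plain TensorFlow (the path is illustrative and should match what was passed to saver.save):

import tensorflow as tf

ckpt_path = './tf_model'  # hypothetical checkpoint prefix

with tf.Session(graph=tf.Graph()) as sess:
    # import the graph structure from the .meta file, then restore the variable values
    saver = tf.train.import_meta_graph(ckpt_path + '.meta')
    saver.restore(sess, ckpt_path)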

@jinhshi

jinhshi commented Mar 21, 2019

@amir-abdi Hi~
I tried your tool https://github.com/amir-abdi/keras_to_tensorflow and successfully converted the model.h5 and model.json of an InceptionV3 model to model.pb. But when I use the model.pb with https://github.com/tensorflow/tensorflow/blob/master/tensorflow/java/src/main/java/org/tensorflow/examples/LabelImage.java, the prediction results are confusing. There should be 5 categories, but the results only ever fall into two of them.

The versions I used:
tensorflow:1.13.1
keras:2.2.4

@hashimi1998

I still do not understand how to use this code. Can anyone help me? I have a model.h5 and want to convert it to a model.pb, but I do not know how to do it using this code. What should I change in the code?
