I am building an Android app with a Keras model exported through TensorFlow.
It crashes at runtime because there are DT_BOOL nodes in the graph, and the TensorFlow library I use doesn't have an OpKernel for DT_BOOL registered out of the box. So I would like to try getting rid of all DT_BOOL nodes (if possible) from the model, which I only use for inference on the device.
Is it possible to strip a Keras model of all DT_BOOL nodes that depend on keras_learning_phase?
How can it be done?
I tried this:

```python
K.set_learning_phase(0)  # all new operations will be in test mode from now on

# serialize the model and get its weights, for quick re-building
config = model.get_config()
weights = model.get_weights()

# re-build a model where the learning phase is now hard-coded to 0
new_model = Sequential.from_config(config)
new_model.set_weights(weights)
```

And I also tried a few things on the TensorFlow side (optimize_for_inference, freeze_graph, and (wrongly) turning Switch nodes into Identity, which corrupted my graph, ...)
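Before rewriting anything, it can help to list which nodes actually carry DT_BOOL. Here is a minimal sketch that scans a text-format GraphDef (.pbtxt) dump for such nodes; it is plain text processing, needs no TensorFlow install, and the function names are my own, not a TensorFlow API:

```python
import re

def _node_blocks(pbtxt):
    """Yield the text of each top-level `node { ... }` block,
    using brace counting to find where each block ends."""
    depth = 0
    current = []
    for line in pbtxt.splitlines():
        if depth == 0 and line.strip().startswith("node {"):
            depth = 1
            current = [line]
            continue
        if depth > 0:
            current.append(line)
            depth += line.count("{") - line.count("}")
            if depth == 0:
                yield "\n".join(current)

def find_bool_nodes(pbtxt):
    """Return the names of nodes whose definition mentions DT_BOOL."""
    names = []
    for block in _node_blocks(pbtxt):
        if "DT_BOOL" in block:
            m = re.search(r'name:\s*"([^"]+)"', block)
            if m:
                names.append(m.group(1))
    return names
```

Running this over the exported .pbtxt gives the candidate nodes (e.g. the dropout layers' keras_learning_phase placeholders) to patch or trace dependents from.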
I worked around this issue by exporting the model to .pbtxt (protocol buffer text format) and then replacing

```
node {
  name: "dropout_1/keras_learning_phase"
  op: "Placeholder"
  attr {
    key: "dtype"
    value {
      type: DT_BOOL
    }
  }
  attr {
    key: "shape"
    value {
      shape {
        unknown_rank: true
      }
    }
  }
}
```
with a Const op:

```
node {
  name: "dropout_1/keras_learning_phase"
  op: "Const"
  attr {
    key: "dtype"
    value {
      type: DT_BOOL
    }
  }
  attr {
    key: "value"
    value {
      tensor {
        dtype: DT_BOOL
        tensor_shape {
        }
        bool_val: false
      }
    }
  }
}
```
After that I was able to freeze/transform the model and use it in an Android application.
Sadly, the nodes that depend on this boolean value aren't stripped, though, so this isn't the best solution.
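The manual edit above can be scripted. Here is a minimal sketch, again plain text surgery on the .pbtxt with no TensorFlow dependency, that swaps a Placeholder node of a given name for a DT_BOOL Const pinned to false; the function name and emitted layout are my own choices, mirroring the hand edit above:

```python
# Text emitted in place of the Placeholder node; pins the
# learning phase to a constant false (inference mode).
CONST_TEMPLATE = '''node {{
  name: "{name}"
  op: "Const"
  attr {{
    key: "dtype"
    value {{
      type: DT_BOOL
    }}
  }}
  attr {{
    key: "value"
    value {{
      tensor {{
        dtype: DT_BOOL
        tensor_shape {{
        }}
        bool_val: false
      }}
    }}
  }}
}}'''

def pin_bool_placeholder(pbtxt, name):
    """Replace the `node { ... }` block with the given name by a
    DT_BOOL Const node set to false, leaving all other nodes intact."""
    out = []
    lines = pbtxt.splitlines()
    i = 0
    while i < len(lines):
        line = lines[i]
        if line.strip().startswith("node {"):
            # Collect the whole node block by brace counting.
            depth = 1
            block = [line]
            i += 1
            while i < len(lines) and depth > 0:
                block.append(lines[i])
                depth += lines[i].count("{") - lines[i].count("}")
                i += 1
            text = "\n".join(block)
            if 'name: "%s"' % name in text and '"Placeholder"' in text:
                out.append(CONST_TEMPLATE.format(name=name))
            else:
                out.append(text)
        else:
            out.append(line)
            i += 1
    return "\n".join(out)
```

Note that this only changes the one node; any downstream Switch/Merge nodes that consumed the boolean still remain in the graph, as observed above.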
In a completely new Python process (i.e. not a process that I previously used for training), I do:

```python
K.set_learning_phase(0)
model = buildMyModel()
model.load_weights('myWeights.hdf5')
saver = tf.train.Saver()
saver.save(K.get_session(), modelName)
```

and the model is saved without the nodes for learning/training.