Keras 3 / TF SavedModel export issues #19458

Closed · wamuir opened this issue Apr 8, 2024 · 3 comments
wamuir commented Apr 8, 2024

It seems that TF SavedModel export is not accurately capturing a Keras 3.x model: in some cases the dtype is wrong, and layer names are not exported.
Minimal example:

import tensorflow as tf

x = tf.keras.layers.Input((10,), name="x", dtype="int32")
y = tf.keras.layers.Dense(2, name="y")(x)
m = tf.keras.Model(x, y)
tf.saved_model.save(m, '/path/to/saved_model')

Comparing the saved_model_cli output for a SavedModel exported from this example with TF_USE_LEGACY_KERAS=1 (and tf_keras installed) against the output with TF_USE_LEGACY_KERAS=0 yields the following diff:

-  inputs['x'] tensor_info:
-      dtype: DT_INT32
+  inputs['inputs'] tensor_info:
+      dtype: DT_FLOAT
       shape: (-1, 10)
-      name: serving_default_x:0
+      name: serving_default_inputs:0
 The given SavedModel SignatureDef contains the following output(s):
-  outputs['y'] tensor_info:
+  outputs['output_0'] tensor_info:
       dtype: DT_FLOAT
       shape: (-1, 2)
       name: StatefulPartitionedCall:0
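
The same signature details can also be checked programmatically; here is a rough sketch, assuming the placeholder export path from the example above:

import tensorflow as tf

loaded = tf.saved_model.load('/path/to/saved_model')
sig = loaded.signatures['serving_default']
# Input names/dtypes as seen by the serving signature.
print(sig.structured_input_signature)
# Output names/dtypes as seen by the serving signature.
print(sig.structured_outputs)
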
fchollet (Member) commented Apr 8, 2024

We don't recommend using tf.saved_model.save() for writing a SavedModel from a Keras model.

The recommended way is model.export("path"). If you call it on your model, you will see:

Saved artifact at 'path'. The following endpoints are available:

* Endpoint 'serve'
  args_0 (POSITIONAL_ONLY): TensorSpec(shape=(None, 10), dtype=tf.int32, name='x')
Output Type:
  TensorSpec(shape=(None, 2), dtype=tf.float32, name=None)

The input dtype and the input name are preserved. The output tensor name, however, is not the name of the output layer. If you want to maintain it, you will have to insert a tf.identity(..., name=...) op at the end of your model.
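
A minimal sketch of that workaround on the toy model from this issue, assuming a Lambda layer is used to wrap the named tf.identity op (layer name and export path are illustrative):

import tensorflow as tf
import keras

x = keras.layers.Input((10,), name="x", dtype="int32")
h = keras.layers.Dense(2)(x)
# Wrap the final tensor in a named tf.identity so the exported output carries a meaningful name.
y = keras.layers.Lambda(lambda t: tf.identity(t, name="y"), name="y")(h)
m = keras.Model(x, y)
m.export("path")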

If you need more granular control over the endpoints and their signatures, you should use keras.export.ExportArchive.
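
For example, a sketch of using ExportArchive to define the serving endpoint explicitly, reusing the toy model m and its input spec (endpoint name and path are arbitrary):

import tensorflow as tf
import keras

export_archive = keras.export.ExportArchive()
export_archive.track(m)
# Register a serving endpoint with an explicit input signature.
export_archive.add_endpoint(
    name="serve",
    fn=m.call,
    input_signature=[tf.TensorSpec(shape=(None, 10), dtype=tf.int32, name="x")],
)
export_archive.write_out("path")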

If you want the behavior of tf.saved_model.save() to match model.export(), then you should file an issue with TensorFlow (we have no control over that function).

wamuir (Author) commented Apr 8, 2024

Thanks! That helps a lot.

We don't recommend using tf.saved_model.save() for writing a SavedModel from a Keras model.

Here's what led me down that path:

"Use `tf.saved_model.save()` if you want to export a SavedModel "
"for use with TFLite/TFServing/etc. "

wamuir closed this as completed Apr 8, 2024
fchollet (Member) commented Apr 8, 2024

Good catch, we should update that comment. That path does work most of the time, but it isn't nearly as reliable as the one we actually control (model.export()).
