Use JSON instead of protocol buffers in the TensorFlow saved model converter. #886
Comments
I am having problems inspecting my .pb model.
cc @pyu10055 for advice on how to inspect the model topology proto.
@pyu10055 could you assist in converting .pb to JSON?
@Raza25 The model returned from the loadFrozenModel() call contains a graph attribute of type GraphExecutor, which in turn has a graph attribute that contains all the nodes of the graph.
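A minimal sketch of walking those nodes, assuming the internal attribute names described above (graph, nodes). These are undocumented internals, so the exact property names and the loadFrozenModel signature may differ between versions; the URLs are placeholders.

```ts
import {loadFrozenModel} from '@tensorflow/tfjs-converter';

// Placeholder URLs for an already-converted frozen model.
const MODEL_URL = 'https://example.com/model/tensorflowjs_model.pb';
const WEIGHTS_URL = 'https://example.com/model/weights_manifest.json';

async function dumpTopology() {
  const model = await loadFrozenModel(MODEL_URL, WEIGHTS_URL);
  // `graph` is an internal attribute of type GraphExecutor (per the comment
  // above); its own `graph` field holds the node map. Cast to `any` because
  // these fields are not part of the public API.
  const executor = (model as any).graph;
  const nodes = executor.graph.nodes;
  for (const name of Object.keys(nodes)) {
    console.log(name, nodes[name].op);
  }
}

dumpTopology();
```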
Reopening this issue to track actually deleting all the protocol buffer code and removing the dependency before 1.0.
There are many situations where we want to simply inspect the model topology to debug an issue, but the protocol buffer format makes that extremely hard. The protocol buffer also forces a dependency on protobufjs, which we should be able to remove.
It would be great to convert the model definition over to JSON (even simply with a toJSON() call on the proto during conversion). We can use a TypeScript interface for the JSON object on the other side.
This will also give us parity between the Keras / TensorFlow SavedModel worlds.
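As a rough illustration of what such a typed JSON topology could look like on the JavaScript side, here is a minimal sketch. The field names mirror the GraphDef/NodeDef protos but are illustrative only, not the schema the converter would actually emit.

```ts
// Illustrative sketch of a typed JSON model topology mirroring
// GraphDef/NodeDef. The real schema would be defined by the converter.
interface TensorShapeJson {
  dim: Array<{size: number}>;
}

interface AttrValueJson {
  // A small subset of the proto's oneof fields, for illustration only.
  s?: string;
  i?: number;
  b?: boolean;
  shape?: TensorShapeJson;
}

interface NodeDefJson {
  name: string;
  op: string;
  input: string[];
  attr: {[key: string]: AttrValueJson};
}

interface GraphDefJson {
  node: NodeDefJson[];
  versions?: {producer: number};
}

// Inspecting the topology then becomes a plain JSON.parse, no protobufjs:
// const graph: GraphDefJson = JSON.parse(modelJsonText);
```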