
Use JSON instead of protocol buffers in the TensorFlow saved model converter. #886

Closed
nsthorat opened this issue Nov 7, 2018 · 5 comments · Fixed by tensorflow/tfjs-converter#264

nsthorat (Contributor) commented Nov 7, 2018

There are many situations where we want to simply inspect the model topology to debug an issue, but the protocol buffer format makes that extremely hard. The protocol buffer also makes us depend on protobufjs, a dependency we should be able to remove.

It would be great to convert the model definition over to JSON (even simply with a toJSON() call on the proto during conversion). We can use a TypeScript interface for the JSON object on the other side.

This will also give us parity between the Keras / TensorFlow SavedModel worlds.
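
As a rough illustration, here is a minimal TypeScript sketch of what such an interface could look like, mirroring the main tf.GraphDef fields (the exact field names and shape below are assumptions, not a committed format):

  // Illustrative sketch only: field names mirror tf.GraphDef, but this
  // is not a committed JSON format.
  interface NodeDef {
    name: string;                     // node name, e.g. "conv1/weights"
    op: string;                       // op type, e.g. "Conv2D"
    input: string[];                  // names of the input nodes
    device?: string;                  // optional device placement
    attr?: {[key: string]: unknown};  // op attributes (dtype, strides, ...)
  }

  interface ModelTopology {
    node: NodeDef[];  // flattened list of graph nodes
    versions?: {producer?: number; minConsumer?: number};
  }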

Raza25 commented Nov 14, 2018

I am having problems inspecting a .pb model.

  • I used tfjs-converter to convert the Keras frozen graph to JavaScript format.
  • Loaded the model and weights manifest using tf.loadFrozenModel().
  • However, I am facing difficulties while inspecting it.

Do I now need to convert the .pb to JSON and then use the model further?
How does the toJSON() call work?

nsthorat (Contributor, Author) commented:

cc @pyu10055 for advice on how to inspect the model topology proto.

Raza25 commented Nov 17, 2018

@pyu10055 could you assist in converting the .pb to JSON?
If not, is there any guidance on using and inspecting a loaded .pb model in TensorFlow.js?

pyu10055 (Collaborator) commented Nov 27, 2018

@Raza25 The model returned by the loadFrozenModel() call has an executor attribute of type GraphExecutor, which in turn has a graph attribute containing all the nodes of the graph.
The Keras layer information is lost when saving a Keras model as a graph, but the node names can give you hints about which variable scope each node belongs to.

Basically:

const model = await tf.loadFrozenModel(...);
console.log(model.executor.graph.nodes);
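
For example, to print each node's name and op type (a sketch assuming graph.nodes is a name-to-node map and that each node object exposes name and op fields; treat those field names as assumptions about the internal types):

  // Sketch: iterate the node map and log each node's name and op.
  // Assumes `graph.nodes` is a {[name: string]: Node} map whose values
  // expose `name` and `op` fields.
  for (const node of Object.values(model.executor.graph.nodes)) {
    console.log(`${node.name} (${node.op})`);
  }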

nsthorat (Contributor, Author) commented Feb 6, 2019

Reopening this issue to track actually deleting all the protocol buffer code and removing the dependency before 1.0.

@nsthorat nsthorat moved this from To do to In progress in TensorFlow.js 1.0 Feb 7, 2019
@dsmilkov dsmilkov closed this as completed Mar 5, 2019
TensorFlow.js 1.0 automation moved this from In progress to Done Mar 5, 2019