
Export Tensorflow SavedModel? #2991

Closed
AidanNelson opened this issue Nov 27, 2019 · 5 comments
@AidanNelson commented Nov 27, 2019

Hi There,

I am exploring the possibility of training a model in Unity using the ML-Agents package, then exporting that model for use with Tensorflow.js in the browser. Eventually, I'd like to incorporate that model into ml5.js, an approachable wrapper library around tf.js. I think this workflow would be an exciting use of the training / curriculum learning structure of Unity, and allow people to create models which could be used in online demonstrations of reinforcement learning (...or maybe agent-based AI for online games?)

Currently, it is possible to convert Tensorflow models for use with Tensorflow.js as long as they are in the SavedModel, Keras model, or TensorFlow Hub module format.

Describe the solution you'd like
I would like to be able to export a trained model from Unity in one of the above Tensorflow formats (SavedModel, Keras model, or TensorFlow Hub module), for later conversion to Tensorflow.js using existing solutions.
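For context, the existing SavedModel-to-TF.js path is a single CLI step. A minimal sketch of that conversion (the directory paths here are placeholders, not from this project):

```shell
# Install the converter (ships with the tensorflowjs pip package).
pip install tensorflowjs

# Convert a TensorFlow SavedModel directory to a TF.js web model.
# --signature_name selects which SignatureDef to export
# (serving_default is the conventional serving signature).
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --saved_model_tags=serve \
    --signature_name=serving_default \
    ./saved_model \
    ./web_model
```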

I see that in the current release of ML-Agents, Barracuda is used to store models, and that a TensorFlow-to-Barracuda conversion script exists, but I am unsure whether converting a model from Barracuda back to TensorFlow is feasible, or how much work it would involve.

Any information about this would be most appreciated!

Aidan

A (vaguely) related issue:
#1802

UPDATE (2019.12.2)
I looked further and realized that Unity ML-Agents still uses TensorFlow for training, and only converts the models to Barracuda '.nn' files on export for inference. Now I am trying to figure out how to export the SavedModel format in this export_model function, for later conversion to a TFJS-compatible model. Does anyone have experience with the saved model builder and a sense of how it could be implemented here? I've altered the export_model function to use tf.compat.v1.saved_model.Builder in the following code:

def export_model(self):
        """
        Exports latest saved model to .nn format for Unity embedding.
        """
        with self.graph.as_default():
  
            # MY ADDITIONAL CODE:
            builder = tf.compat.v1.saved_model.Builder(self.model_path + "/SavedModel/")
            builder.add_meta_graph_and_variables(self.sess,
                                                [tf.saved_model.tag_constants.TRAINING],
                                                strip_default_attrs=True)
            builder.add_meta_graph([tf.saved_model.tag_constants.SERVING], strip_default_attrs=True)
            builder.save()
            # END OF ADDITIONAL CODE

            target_nodes = ",".join(self._process_graph())
            graph_def = self.graph.as_graph_def()
            
            output_graph_def = graph_util.convert_variables_to_constants(
                self.sess, graph_def, target_nodes.replace(" ", "").split(",")
            )
            frozen_graph_def_path = self.model_path + "/frozen_graph_def.pb"
            with gfile.GFile(frozen_graph_def_path, "wb") as f:
                f.write(output_graph_def.SerializeToString())
            tf2bc.convert(frozen_graph_def_path, self.model_path + ".nn")
            logger.info("Exported " + self.model_path + ".nn file")

This runs without error, but produces a SavedModel without any Signature Definitions, as seen when I run python saved_model_cli.py show --dir ./ --all:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

MetaGraphDef with tag-set: 'train' contains the following SignatureDefs:

This then throws an error when converting to TFJS using the tfjs-converter:

ValueError: Signature 'serving_default' does not exist. The following signatures are available: KeysView(_SignatureMap({}))

along with many warnings like this:

WARNING:tensorflow:Unable to create a python object for variable <tf.Variable 'beta2_power:0' shape=() dtype=float32_ref> because it is a reference variable. It may not be visible to training APIs. If this is a problem, consider rebuilding the SavedModel after running tf.compat.v1.enable_resource_variables().

If anyone can shed light on how to include SignatureDefs in the exported model, and export a valid SavedModel from Unity ML-Agents, that would be awesome!

Thanks again.

@AidanNelson AidanNelson added the request label Nov 27, 2019
@AidanNelson AidanNelson changed the title Export Barracuda Model to Tensorflow Model Export Tensorflow SavedModel? Dec 2, 2019
@unityjeffrey unityjeffrey self-assigned this Dec 2, 2019
@unityjeffrey (Collaborator) commented Dec 2, 2019

Thank you for your comments. We’ve documented your feedback and will prioritize when appropriate.

@unityjeffrey unityjeffrey added discussion and removed request labels Dec 2, 2019
@unityjeffrey (Collaborator) commented Dec 2, 2019

Also, opening this up as a more general discussion to see if folks in the community have attempted what you are describing above.

@AidanNelson (Author) commented Dec 9, 2019

For anyone trying to use models from Unity in TensorFlow.js, here is the approach I took.

  • Install Unity ML-Agents for development by following this guide
  • In the ml-agents/mlagents/trainers/tf_policy.py script, add the following lines to the export_model function to export a TensorFlow SavedModel. Note that the required graph nodes differ between a continuous action space (the agent acts on float values) and a discrete action space (the agent acts on integer values), so uncomment the respective lines accordingly:
def export_model(self):
        """
        Exports latest saved model to .nn format for Unity embedding.
        """

        with self.graph.as_default():
            graph_def = self.graph.as_graph_def()

            # BEGINNING OF ADDED CODE:
            # To learn more about nodes of the graph, uncomment the following lines:
            # for node in graph_def.node:
            #     print("-------")
            #     print(node.name)
            #     print(node.input)
            #     print(node.attr)
 
            # Uncomment for discrete vector action space:
            # vectorInputNode = self.graph.get_tensor_by_name("vector_observation:0")
            # actionMaskInput = self.graph.get_tensor_by_name("action_masks:0")
            # actionOutputNode = self.graph.get_tensor_by_name("action:0")
            # sigs = {}
            # sigs[tf.compat.v1.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
            #     tf.saved_model.signature_def_utils.predict_signature_def( \
            #         {"in": vectorInputNode, "actionMask": actionMaskInput}, {"out": actionOutputNode})

            # Uncomment for continuous vector action space:
            # vectorInputNode = self.graph.get_tensor_by_name("vector_observation:0")
            # epsilonInputNode = self.graph.get_tensor_by_name("epsilon:0")
            # actionOutputNode = self.graph.get_tensor_by_name("action:0")
            # sigs = {}
            # sigs[tf.compat.v1.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY] = \
            #     tf.saved_model.signature_def_utils.predict_signature_def( \
            #         {"in": vectorInputNode, "epsilon": epsilonInputNode}, {"out": actionOutputNode})

            builder = tf.compat.v1.saved_model.Builder(self.model_path + "/SavedModel/")
            builder.add_meta_graph_and_variables( \
                self.sess, \
                [tf.saved_model.tag_constants.SERVING], \
                signature_def_map=sigs, \
                strip_default_attrs=True)
            builder.save()
            # END OF ADDED CODE

            target_nodes = ",".join(self._process_graph())
            
            output_graph_def = graph_util.convert_variables_to_constants(
                self.sess, graph_def, target_nodes.replace(" ", "").split(",")
            )
            frozen_graph_def_path = self.model_path + "/frozen_graph_def.pb"
            with gfile.GFile(frozen_graph_def_path, "wb") as f:
                f.write(output_graph_def.SerializeToString())
            tf2bc.convert(frozen_graph_def_path, self.model_path + ".nn")
            logger.info("Exported " + self.model_path + ".nn file")
     
  • The exported SavedModel can then be converted to a web model by the latest tensorflowjs-converter (tfjs version 1.4.0, currently)

If anyone can clarify the difference between the output nodes (action vs. action_probs), please let me know!
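To sanity-check the export before converting, the serving signature can be inspected with saved_model_cli, then the directory handed to the tfjs-converter. A sketch of those two steps (the `<run-id>` path is a placeholder for wherever ML-Agents wrote the model on your machine):

```shell
# Confirm the serving signature exists and lists the expected inputs/outputs.
saved_model_cli show \
    --dir ./models/<run-id>/SavedModel \
    --tag_set serve \
    --signature_def serving_default

# Convert to a TF.js graph model for use in the browser.
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --saved_model_tags=serve \
    --signature_name=serving_default \
    ./models/<run-id>/SavedModel \
    ./web_model
```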

@stale stale bot commented Dec 23, 2019

This issue has been automatically marked as stale because it has not had activity in the last 14 days. It will be closed in the next 14 days if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Dec 23, 2019
@stale stale bot commented Jan 6, 2020

This issue has been automatically closed because it has not had activity in the last 28 days. If this issue is still valid, please ping a maintainer. Thank you for your contributions.

@stale stale bot closed this Jan 6, 2020