From 0841e859b0f83ffcd9145e00e0d8f8025e50ec6e Mon Sep 17 00:00:00 2001
From: Chris Elion
Date: Wed, 29 Jul 2020 17:50:34 -0700
Subject: [PATCH 1/2] clarify support for external trainers or inference

---
 docs/Unity-Inference-Engine.md | 22 +++++++++++++++++++---
 1 file changed, 19 insertions(+), 3 deletions(-)

diff --git a/docs/Unity-Inference-Engine.md b/docs/Unity-Inference-Engine.md
index 213136d1a5..69f2b24649 100644
--- a/docs/Unity-Inference-Engine.md
+++ b/docs/Unity-Inference-Engine.md
@@ -7,9 +7,6 @@ your Unity games. This support is possible thanks to the
 [compute shaders](https://docs.unity3d.com/Manual/class-ComputeShader.html)
 to run the neural network within Unity.
 
-**Note**: The ML-Agents Toolkit only supports the models created with our
-trainers.
-
 ## Supported devices
 
 See the Unity Inference Engine documentation for a list of the
@@ -45,3 +42,22 @@ use for Inference.
 **Note:** For most of the models generated with the ML-Agents Toolkit, CPU will
 be faster than GPU. You should use the GPU only if you use the ResNet visual
 encoder or have a large number of agents with visual observations.
+
+# Unsupported systems
+## Externally trained models
+The ML-Agents Toolkit only supports the models created with our trainers. Model
+loading expects certain conventions for constants and tensor names. While it is
+possible to construct a model that follows these conventions, we don't provide
+any additional help for this. More details can be found in
+[TensorNames.cs](https://github.com/Unity-Technologies/ml-agents/blob/release_4_docs/com.unity.ml-agents/Runtime/Inference/TensorNames.cs)
+and
+[BarracudaModelParamLoader.cs](https://github.com/Unity-Technologies/ml-agents/blob/release_4_docs/com.unity.ml-agents/Runtime/Inference/BarracudaModelParamLoader.cs).
+
+If you wish to run inference on an externally trained model, you should use
+Barracuda directly, instead of trying to run it through ML-Agents.
+
+## Model inference outside of Unity
+We do not provide support for inference anywhere outside of Unity. The
+`frozen_graph_def.pb` and `.onnx` files produced by training are open formats
+for TensorFlow and ONNX respectively; if you wish to convert these to another
+format or run inference with them, refer to their documentation.

From 3484a79e6b8a0dafabe716128d27e098729a9410 Mon Sep 17 00:00:00 2001
From: Chris Elion
Date: Wed, 5 Aug 2020 08:52:16 -0700
Subject: [PATCH 2/2] Update Unity-Inference-Engine.md

---
 docs/Unity-Inference-Engine.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/Unity-Inference-Engine.md b/docs/Unity-Inference-Engine.md
index 69f2b24649..a4adfee9a0 100644
--- a/docs/Unity-Inference-Engine.md
+++ b/docs/Unity-Inference-Engine.md
@@ -43,7 +43,7 @@ use for Inference.
 be faster than GPU. You should use the GPU only if you use the ResNet visual
 encoder or have a large number of agents with visual observations.
 
-# Unsupported systems
+# Unsupported use cases
 ## Externally trained models
 The ML-Agents Toolkit only supports the models created with our trainers. Model
 loading expects certain conventions for constants and tensor names. While it is
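
To make the "conventions for constants and tensor names" concrete, here is a minimal sketch that dumps the input and output tensor names of a loaded Barracuda model so they can be compared against the constants listed in TensorNames.cs. It assumes Barracuda 1.x inside a Unity project; the `ModelNameDump` class and `modelAsset` field are illustrative names, not part of ML-Agents.

```csharp
using Unity.Barracuda;
using UnityEngine;

// Sketch: print the input/output tensor names of a serialized model so they
// can be checked against the names ML-Agents expects (see TensorNames.cs).
public class ModelNameDump : MonoBehaviour  // hypothetical helper, not an ML-Agents type
{
    public NNModel modelAsset;  // assign the imported model in the Inspector

    void Start()
    {
        Model model = ModelLoader.Load(modelAsset);

        foreach (Model.Input input in model.inputs)
        {
            Debug.Log($"input: {input.name} shape: [{string.Join(",", input.shape)}]");
        }
        foreach (string outputName in model.outputs)
        {
            Debug.Log($"output: {outputName}");
        }
    }
}
```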
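
For the "use Barracuda directly" path, a minimal sketch of running an externally trained model through Barracuda's `IWorker` API with no ML-Agents involvement. The input shape (one batch of 8 floats) and the component name are assumptions; the real shape and input names depend entirely on how the model was exported.

```csharp
using Unity.Barracuda;
using UnityEngine;

// Sketch: inference on an externally trained model via Barracuda alone.
public class ExternalModelRunner : MonoBehaviour  // hypothetical example component
{
    public NNModel modelAsset;  // externally trained model imported as an NNModel
    IWorker worker;

    void Start()
    {
        Model model = ModelLoader.Load(modelAsset);
        // Auto picks a GPU compute or CPU backend based on the platform.
        worker = WorkerFactory.CreateWorker(WorkerFactory.Type.Auto, model);
    }

    void Update()
    {
        // Example input: batch of 1, 8 features. Replace with your model's shape.
        using (var input = new Tensor(1, 8))
        {
            worker.Execute(input);
            Tensor output = worker.PeekOutput();  // owned by the worker; do not dispose
            Debug.Log($"first output value: {output[0]}");
        }
    }

    void OnDestroy()
    {
        worker?.Dispose();
    }
}
```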
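
For inference outside of Unity, the exported `.onnx` file can be loaded by any ONNX-compatible runtime. Below is a sketch using the ONNX Runtime C# package (`Microsoft.ML.OnnxRuntime`); the file name, input name, and shape are placeholders, which is why the code reads them back from `session.InputMetadata` rather than hard-coding ML-Agents-specific names.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Sketch: smoke-test an exported .onnx model with ONNX Runtime, no Unity required.
class OnnxSmokeTest
{
    static void Main()
    {
        using var session = new InferenceSession("model.onnx");  // placeholder path

        // Inspect the model's declared inputs instead of assuming their names.
        foreach (var kv in session.InputMetadata)
            Console.WriteLine($"input: {kv.Key} dims: [{string.Join(",", kv.Value.Dimensions)}]");

        // Example shape: batch of 1, 8 features. Use the dims printed above.
        var input = new DenseTensor<float>(new[] { 1, 8 });
        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor(session.InputMetadata.Keys.First(), input)
        };

        using var results = session.Run(inputs);
        var output = results.First().AsEnumerable<float>().ToArray();
        Console.WriteLine($"output length: {output.Length}");
    }
}
```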