From 89d2d96392832d5be866ca9b39526d12231581eb Mon Sep 17 00:00:00 2001
From: Wolff
Date: Tue, 1 Aug 2017 14:20:44 -0700
Subject: [PATCH 1/4] Add title to pick up TOC

---
 tensorflow_serving/g3doc/index.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/tensorflow_serving/g3doc/index.md b/tensorflow_serving/g3doc/index.md
index d1869e0072f..18d0e28c42d 100644
--- a/tensorflow_serving/g3doc/index.md
+++ b/tensorflow_serving/g3doc/index.md
@@ -1,3 +1,5 @@
+# Introduction
+
 TensorFlow Serving is a flexible, high-performance serving system for machine
 learning models, designed for production environments. TensorFlow Serving
 makes it easy to deploy new algorithms and experiments, while keeping the same

From e1333bfd6c92744291d9579d3e914d4af452d76c Mon Sep 17 00:00:00 2001
From: Wolff Dobson
Date: Thu, 3 Aug 2017 15:46:41 -0700
Subject: [PATCH 2/4] Fix linebreaks in links

---
 tensorflow_serving/g3doc/custom_source.md     | 11 ++++++-----
 tensorflow_serving/g3doc/serving_advanced.md  |  4 ++--
 tensorflow_serving/g3doc/serving_basic.md     |  9 +++++----
 tensorflow_serving/g3doc/serving_inception.md |  4 ++--
 4 files changed, 15 insertions(+), 13 deletions(-)

diff --git a/tensorflow_serving/g3doc/custom_source.md b/tensorflow_serving/g3doc/custom_source.md
index fc8e802cb0c..fcef94d3408 100644
--- a/tensorflow_serving/g3doc/custom_source.md
+++ b/tensorflow_serving/g3doc/custom_source.md
@@ -23,11 +23,12 @@ loaders directly.
 Of course, whatever kind of data your source emits (whether it is POSIX paths,
 Google Cloud Storage paths, or RPC handles), there needs to be accompanying
 module(s) that are able to load servables based on that. Such modules are called
-`SourceAdapters`. Creating a custom one is described in the [Custom
-Servable](custom_servable.md) document. TensorFlow Serving comes with one for
-instantiating TensorFlow sessions based on paths in file systems that TensorFlow
-supports. One can add support for additional file systems to TensorFlow by
-extending the `RandomAccessFile` abstraction (`tensorflow/core/public/env.h`).
+`SourceAdapters`. Creating a custom one is described in the
+[Custom Servable](custom_servable.md) document. TensorFlow Serving
+comes with one for instantiating TensorFlow sessions based on paths
+in file systems that TensorFlow supports. One can add support for
+additional file systems to TensorFlow by extending the `RandomAccessFile`
+abstraction (`tensorflow/core/public/env.h`).
 
 This document focuses on creating a source that emits paths in a
 TensorFlow-supported file system. It ends with a walk-through of how to use your
diff --git a/tensorflow_serving/g3doc/serving_advanced.md b/tensorflow_serving/g3doc/serving_advanced.md
index 1ddf2dc92c5..bb9c862b02c 100644
--- a/tensorflow_serving/g3doc/serving_advanced.md
+++ b/tensorflow_serving/g3doc/serving_advanced.md
@@ -35,8 +35,8 @@ Before getting started, please complete the
 [prerequisites](setup.md#prerequisites).
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
-further optimize the build, refer to the [instructions
-here](setup.md#optimized-build).
+further optimize the build, refer to the
+[instructions here](setup.md#optimized-build).
 
 ## Train And Export TensorFlow Model
 
diff --git a/tensorflow_serving/g3doc/serving_basic.md b/tensorflow_serving/g3doc/serving_basic.md
index e3249f4ed8e..ed45aef1fd5 100644
--- a/tensorflow_serving/g3doc/serving_basic.md
+++ b/tensorflow_serving/g3doc/serving_basic.md
@@ -28,8 +28,8 @@ Before getting started, please complete the
 [prerequisites](setup.md#prerequisites).
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
-further optimize the build, refer to the [instructions
-here](setup.md#optimized-build).
+further optimize the build, refer to the
+[instructions here](setup.md#optimized-build).
 
 ## Train And Export TensorFlow Model
 
@@ -152,8 +152,9 @@ $>rm -rf /tmp/mnist_model
 
 If you would like to install the `tensorflow` and `tensorflow-serving-api` PIP
 packages, you can run all Python code (export and client) using a simple
-`python` command. To install the PIP package, follow the [instructions
-here](setup.md#tensorflow-serving-python-api-pip-package). It's also possible to
+`python` command. To install the PIP package, follow the
+[instructions here](setup.md#tensorflow-serving-python-api-pip-package).
+It's also possible to
 use Bazel to build the necessary dependencies and run all code without
 installing those packages. The rest of the codelab will have instructions for
 both the Bazel and PIP options.
diff --git a/tensorflow_serving/g3doc/serving_inception.md b/tensorflow_serving/g3doc/serving_inception.md
index 81d89660f0a..661a8ce687b 100644
--- a/tensorflow_serving/g3doc/serving_inception.md
+++ b/tensorflow_serving/g3doc/serving_inception.md
@@ -36,8 +36,8 @@ $ docker run --name=inception_container -it $USER/tensorflow-serving-devel
 ### Clone, configure, and build TensorFlow Serving in a container
 
 Note: All `bazel build` commands below use the standard `-c opt` flag. To
-further optimize the build, refer to the [instructions
-here](setup.md#optimized-build).
+further optimize the build, refer to the
+[instructions here](setup.md#optimized-build).
 
 In the running container, we clone, configure and build TensorFlow Serving
 example code.
From 57d8f6f7e2e62fad5aea33dca47b4a635153a0e7 Mon Sep 17 00:00:00 2001
From: Wolff Dobson
Date: Thu, 3 Aug 2017 17:35:21 -0700
Subject: [PATCH 3/4] Fix some formatting issues; slight reordering

---
 tensorflow_serving/g3doc/leftnav_files    |  1 +
 tensorflow_serving/g3doc/serving_basic.md | 59 ++++++++++++-----------
 2 files changed, 31 insertions(+), 29 deletions(-)

diff --git a/tensorflow_serving/g3doc/leftnav_files b/tensorflow_serving/g3doc/leftnav_files
index d7d658c8677..272cbf0eba0 100644
--- a/tensorflow_serving/g3doc/leftnav_files
+++ b/tensorflow_serving/g3doc/leftnav_files
@@ -7,4 +7,5 @@ serving_advanced.md
 serving_inception.md
 custom_servable.md
 custom_source.md
+signature_defs.md
 docker.md
\ No newline at end of file
diff --git a/tensorflow_serving/g3doc/serving_basic.md b/tensorflow_serving/g3doc/serving_basic.md
index ed45aef1fd5..eba23f2b48b 100644
--- a/tensorflow_serving/g3doc/serving_basic.md
+++ b/tensorflow_serving/g3doc/serving_basic.md
@@ -112,35 +112,36 @@ You can add meta graph and variables to the builder using
 
 As an example for how `predict_signature` is defined, the util takes the
 following arguments:
-  * `inputs={'images': tensor_info_x}` specifies the input tensor info.
-
-  * `outputs={'scores': tensor_info_y}` specifies the scores tensor info.
-
-  Note that `tensor_info_x` and `tensor_info_y` have the structure of
-  `tensorflow::TensorInfo` protocol buffer defined [here](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/protobuf/meta_graph.proto).
-  To easily build tensor infos, the TensorFlow SavedModel API also provides
-  [utils.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/utils.py),
-  with [related TensorFlow 1.0 API documentation](https://www.tensorflow.org/api_docs/python/tf/saved_model/utils).
-
-  Also, note that `images` and `scores` are tensor alias names. They can be
-  whatever unique strings you want, and they will become the logical names
-  of tensor `x` and `y` that you refer to for tensor binding when sending
-  prediction requests later.
-
-  For instance, if `x` refers to the tensor with name 'long_tensor_name_foo'
-  and `y` refers to the tensor with name 'generated_tensor_name_bar',
-  `builder` will store tensor logical name to real name mapping
-  ('images' -> 'long_tensor_name_foo') and ('scores' -> 'generated_tensor_name_bar').
-  This allows the user to refer to these tensors with their logical names
-  when running inference.
-
-  * `method_name` is the method used for the inference. For Prediction
-    requests, it should be set to `tensorflow/serving/predict`. For other
-    method names, see [signature_constants.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/signature_constants.py)
-    and related [TensorFlow 1.0 API documentation](https://www.tensorflow.org/api_docs/python/tf/saved_model/signature_constants).
-
-  In addition to the description above, documentation related to signature def
-  structure and how to set up them up can be found [here](https://github.com/tensorflow/serving/blob/master/tensorflow_serving/g3doc/signature_defs.md).
+  * `inputs={'images': tensor_info_x}` specifies the input tensor info.
+
+  * `outputs={'scores': tensor_info_y}` specifies the scores tensor info.
+
+  * `method_name` is the method used for the inference. For Prediction
+    requests, it should be set to `tensorflow/serving/predict`. For other
+    method names, see [signature_constants.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/signature_constants.py)
+    and related [TensorFlow 1.0 API documentation](https://www.tensorflow.org/api_docs/python/tf/saved_model/signature_constants).
+
+
+Note that `tensor_info_x` and `tensor_info_y` have the structure of
+`tensorflow::TensorInfo` protocol buffer defined [here](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/protobuf/meta_graph.proto).
+To easily build tensor infos, the TensorFlow SavedModel API also provides
+[utils.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/utils.py),
+with [related TensorFlow 1.0 API documentation](https://www.tensorflow.org/api_docs/python/tf/saved_model/utils).
+
+Also, note that `images` and `scores` are tensor alias names. They can be
+whatever unique strings you want, and they will become the logical names
+of tensor `x` and `y` that you refer to for tensor binding when sending
+prediction requests later.
+
+For instance, if `x` refers to the tensor with name 'long_tensor_name_foo' and
+`y` refers to the tensor with name 'generated_tensor_name_bar', `builder` will
+store tensor logical name to real name mapping ('images' ->
+'long_tensor_name_foo') and ('scores' -> 'generated_tensor_name_bar'). This
+allows the user to refer to these tensors with their logical names when
+running inference.
+
+Note: In addition to the description above, documentation related to signature
+def structure and how to set them up can be found [here](signature_defs.md).
 
 Let's run it!
 

From 8a883026bd2dcc83e83b136604e88ea4939262f4 Mon Sep 17 00:00:00 2001
From: Wolff Dobson
Date: Fri, 4 Aug 2017 15:02:57 -0700
Subject: [PATCH 4/4] Fix rendering of numbered lists with shell commands

---
 tensorflow_serving/g3doc/setup.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/tensorflow_serving/g3doc/setup.md b/tensorflow_serving/g3doc/setup.md
index b59a61b5eec..cffd48103ed 100644
--- a/tensorflow_serving/g3doc/setup.md
+++ b/tensorflow_serving/g3doc/setup.md
@@ -17,17 +17,17 @@ following steps:
 
     Let's say you downloaded bazel-0.4.5-installer-linux-x86_64.sh. You
     would execute:
 
-    ```shell
+    <pre>
     cd ~/Downloads
     chmod +x bazel-0.4.5-installer-linux-x86_64.sh
     ./bazel-0.4.5-installer-linux-x86_64.sh --user
-    ```
+    </pre>
 
 2. Set up your environment. Put this in your ~/.bashrc.
 
-    ```shell
+    <pre>
     export PATH="$PATH:$HOME/bin"
-    ```
+    </pre>
 
 ### gRPC
 
@@ -91,17 +91,17 @@ sudo apt-get remove tensorflow-model-server
 
 1. Add TensorFlow Serving distribution URI as a package source (one time setup)
 
-    ```shell
+    <pre>
     echo "deb [arch=amd64] http://storage.googleapis.com/tensorflow-serving-apt stable tensorflow-model-server tensorflow-model-server-universal" | sudo tee /etc/apt/sources.list.d/tensorflow-serving.list
 
     curl https://storage.googleapis.com/tensorflow-serving-apt/tensorflow-serving.release.pub.gpg | sudo apt-key add -
-    ```
+    </pre>
 
 2. Install and update TensorFlow ModelServer
 
-    ```shell
+    <pre>
     sudo apt-get update && sudo apt-get install tensorflow-model-server
-    ```
+    </pre>
 
 Once installed, the binary can be invoked using the command
 `tensorflow_model_server`.
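
---

The tensor-alias indirection documented in patch 3 (logical names such as `images` standing in for real tensor names such as `long_tensor_name_foo`) can be sketched in plain Python. This is a conceptual sketch only, not TensorFlow Serving's implementation; the tensor names are the illustrative ones from the patch text:

```python
# Conceptual sketch of the SignatureDef alias mapping described in patch 3.
# Plain Python only; the names are the illustrative ones from the document.

# What the SavedModel builder records: logical alias -> real tensor name,
# plus the method name used for Predict requests.
predict_signature = {
    "inputs": {"images": "long_tensor_name_foo"},        # alias for tensor x
    "outputs": {"scores": "generated_tensor_name_bar"},  # alias for tensor y
    "method_name": "tensorflow/serving/predict",
}

def resolve_feeds(signature, request_inputs):
    """Translate alias-keyed request inputs into real tensor names."""
    return {
        signature["inputs"][alias]: value  # KeyError means an unknown alias
        for alias, value in request_inputs.items()
    }

# A client binds tensors by alias, never by graph-internal name:
feeds = resolve_feeds(predict_signature, {"images": [0.0] * 784})
assert feeds == {"long_tensor_name_foo": [0.0] * 784}
```

The point of the indirection is that graph-internal tensor names can change from one exported model version to the next while clients keep sending `images` and reading back `scores`.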