This demo illustrates working with TensorFlow using an Amazon Deep Learning AMI (DLAMI). It includes:
- src/basics.py - Understand basic operations in TensorFlow
- src/nn.py - Create a small neural network regression model in TensorFlow
- src/nn_with_summaries.py - Show how to augment code with TensorFlow summaries to visualize the graph and learning process in TensorBoard
- src/nn_export.py - Illustrate how to save a TensorFlow model to disk so that it can be served by TensorFlow Serving
- src/nn_client.py - Example of how to consume the model served by TensorFlow Serving
Deploy the CloudFormation stack from the template in infrastructure/. The template creates a user with the following credentials and the minimal permissions required to complete the lab:
- Username: student
- Password: password
- Connect to the instance using the SSH username: ubuntu.
- Run the Jupyter notebook server that comes pre-installed on the Amazon Deep Learning AMI:
jupyter notebook
- SSH tunnel to the notebook server running on port 8888, for example (the key file and hostname below are placeholders for your own):
ssh -i <your-key>.pem -L 8888:localhost:8888 ubuntu@<instance-public-dns>
- Open a browser to the notebook server on localhost. Get the URL, including its access token, from the command
jupyter notebook list
- Create a new Python 2.7 and TensorFlow environment notebook for each file in the src/ directory
- Paste the code from each script in the src/ directory into a cell
- Run the notebooks
- To view the summaries of src/nn_with_summaries.py in TensorBoard, run the command:
tensorboard --logdir /tmp/tensorflow/nn
- To serve the model saved by src/nn_export.py with TensorFlow Serving, run the command:
tensorflow_model_server --port=9000 --model_name=nn --model_base_path=/tmp/nn
Delete the CloudFormation stack to remove all the resources. No resources are created outside of the template.