# Amazon SageMaker Script Mode Examples

This repository contains examples and related resources for Amazon SageMaker's Script Mode. With Script Mode, you can run training scripts similar to those you would use outside SageMaker, using SageMaker's prebuilt containers for deep learning frameworks such as TensorFlow, PyTorch, and Apache MXNet.
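To make this concrete, below is a minimal sketch of what a Script Mode entry point might look like. It is not taken from any notebook in this repository; the hyperparameter names (`--epochs`, `--learning-rate`) are illustrative. SageMaker passes hyperparameters to the script as command-line arguments and supplies data and model locations through `SM_*` environment variables (e.g. `SM_MODEL_DIR`, `SM_CHANNEL_TRAIN`), so a typical script begins by parsing both:

```python
import argparse
import os


def parse_args(argv=None):
    """Parse the arguments SageMaker Script Mode passes to a training script.

    Hyperparameters arrive as command-line flags; input/output paths arrive
    via SM_* environment variables, with local-friendly defaults here so the
    script can also run outside SageMaker.
    """
    parser = argparse.ArgumentParser()
    # Illustrative hyperparameters -- SageMaker forwards whatever you set
    # on the estimator as --name value pairs.
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--learning-rate", type=float, default=0.001)
    # Locations injected by the SageMaker training container.
    parser.add_argument(
        "--model-dir",
        type=str,
        default=os.environ.get("SM_MODEL_DIR", "/opt/ml/model"),
    )
    parser.add_argument(
        "--train",
        type=str,
        default=os.environ.get("SM_CHANNEL_TRAIN", "/opt/ml/input/data/train"),
    )
    return parser.parse_args(argv)


if __name__ == "__main__":
    args = parse_args()
    # ... build and train the model here, then save it under args.model_dir
    # so SageMaker uploads it to S3 when training completes ...
```

Because the script is plain Python with defaults for the `SM_*` variables, the same file can be run locally for debugging and then handed unchanged to a SageMaker framework estimator as its `entry_point`.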

Currently this repository has the following resources:

  • TensorFlow Eager Execution: This example shows how to use Script Mode with TensorFlow's Eager Execution mode. Eager Execution is the future of TensorFlow, and a major paradigm shift. Introduced as a more intuitive and dynamic alternative to the original graph mode of TensorFlow, Eager Execution is the default mode of TensorFlow 2. PREREQUISITES: Be sure to upload all files in the tf-eager-script-mode directory (including the subdirectory train_model) to the directory where you will run the related Jupyter notebook.

  • TensorFlow Sentiment Analysis: Script Mode is used with TensorFlow's implementation of the Keras API for a sentiment analysis task. In addition to demonstrating Local Mode training for testing your code, this example also shows usage of SageMaker Batch Transform for asynchronous, large scale inference, rather than SageMaker Hosted Endpoints for near real-time inference. PREREQUISITES: Be sure to upload all files in the tf-sentiment-script-mode directory to the directory where you will run the related Jupyter notebook.

  • TensorFlow Text Classification with Word Embeddings: In this example, TensorFlow's tf.keras API is used with Script Mode for a text classification task. An important aspect of the example is showing how to load preexisting word embeddings such as GloVe in Script Mode. Other features demonstrated include Local Mode endpoints as well as Local Mode training. PREREQUISITES: (1) Use a GPU-based (P3 or P2) SageMaker notebook instance, and (2) be sure to upload all files in the keras-embeddings-script-mode directory (including subdirectory code) to the directory where you will run the related Jupyter notebook.

  • TensorFlow Distributed Training Options: This example demonstrates distributed training options in Script Mode, including the use of parameter servers and Horovod. PREREQUISITES: Be sure to upload all files in the tf-distribution-options directory (including the subdirectory code and its files) to the directory where you will run the related Jupyter notebook.

  • TensorFlow Highly Performant Batch Inference & Training: The focus of this example is highly performant batch inference using TensorFlow Serving, along with Horovod distributed training. To transform the input image data for inference, a preprocessing script is used with the Amazon SageMaker TensorFlow Serving container. PREREQUISITES: Be sure to upload all files in the tf-batch-inference-script directory (including the subdirectory code and its files) to the directory where you will run the related Jupyter notebook.

  • TensorFlow with Horovod & Inference Pipeline: Script Mode with TensorFlow is used for a computer vision task, demonstrating Horovod distributed training together with batch inference via an Inference Pipeline that transforms image data before it is passed to the model container. This is an alternative to the previous example, which uses a preprocessing script with the Amazon SageMaker TensorFlow Serving container rather than an Inference Pipeline. PREREQUISITES: Be sure to upload all files in the tf-horovod-inference-pipeline directory (including the subdirectory code and its files) to the directory where you will run the related Jupyter notebook.

## License

The contents of this repository are licensed under the Apache 2.0 License except where otherwise noted.
