Amazon SageMaker Examples

Advanced Amazon SageMaker Functionality

These examples showcase unique functionality available in Amazon SageMaker. They cover a broad range of topics and use a variety of methods, but aim to give the user sufficient insight or inspiration to develop within Amazon SageMaker.

  • Data Distribution Types showcases the difference between two methods for sending data from S3 to Amazon SageMaker training instances. This has particular implications for the scalability and accuracy of distributed training (a minimal sketch of the two modes follows this list).
  • Encrypting Your Data shows how to use server-side KMS-encrypted data with Amazon SageMaker training. The IAM role used for S3 access needs permissions to encrypt and decrypt data with the KMS key (see the KMS sketch after this list).
  • Using Parquet Data shows how to bring Parquet data sitting in S3 into an Amazon SageMaker Notebook and convert it into the RecordIO-protobuf format that many SageMaker algorithms consume (a conversion sketch follows this list).
  • Connecting to Redshift demonstrates how to copy data from Redshift to S3 and vice versa without leaving Amazon SageMaker Notebooks (see the UNLOAD sketch after this list).
  • Bring Your Own XGBoost Model shows how to use Amazon SageMaker Algorithms containers to bring a pre-trained model to a real-time hosted endpoint without ever needing to think about REST APIs (a deployment sketch follows this list).
  • Bring Your Own k-means Model shows how to take a model that's been fit elsewhere and use Amazon SageMaker Algorithms containers to host it.
  • Installing the R Kernel shows how to install the R kernel into an Amazon SageMaker Notebook Instance.
  • Bring Your Own R Algorithm shows how to bring your own algorithm container to Amazon SageMaker using the R language.
  • Bring Your Own scikit Algorithm provides a detailed walkthrough of how to package a scikit-learn algorithm for training and production-ready hosting.
  • Bring Your Own MXNet Model shows how to bring a model trained anywhere using MXNet into Amazon SageMaker.
  • Bring Your Own TensorFlow Model shows how to bring a model trained anywhere using TensorFlow into Amazon SageMaker.
  • Inference Pipeline with SparkML and XGBoost shows how to deploy an Inference Pipeline with SparkML for data pre-processing and XGBoost for training on the Abalone dataset. The pre-processing code is written once and reused across training and inference (a PipelineModel sketch follows this list).
  • Inference Pipeline with SparkML and BlazingText shows how to deploy an Inference Pipeline with SparkML for data pre-processing and BlazingText for training on the DBPedia dataset. The pre-processing code is written once and used between training and inference.
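
The sketches below illustrate a few of the techniques referenced above. They assume the SageMaker Python SDK v1 interface that these notebooks were written against; every name in angle brackets is a hypothetical placeholder, not a value taken from the examples.

Data distribution modes: the two ways of handing S3 data to training instances differ only in the distribution argument of the input channel.

```python
# Minimal sketch of the two S3 data distribution modes (SageMaker Python SDK v1).
# The bucket/prefix and the estimator mentioned at the end are placeholders.
from sagemaker.session import s3_input

train_s3_uri = "s3://<your-bucket>/<prefix>/train"

# FullyReplicated: every training instance downloads a complete copy of the data.
replicated = s3_input(train_s3_uri, distribution="FullyReplicated", content_type="text/csv")

# ShardedByS3Key: the S3 objects are split across instances, so each instance
# sees only its own shard; faster to load, but each worker trains on less data.
sharded = s3_input(train_s3_uri, distribution="ShardedByS3Key", content_type="text/csv")

# Either channel is then passed to an estimator, e.g. estimator.fit({"train": sharded})
```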
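
KMS encryption, as in Encrypting Your Data: a sketch that uploads SSE-KMS-encrypted input and asks SageMaker to encrypt the model artifacts with the same key. The key ID, bucket, container image, and role are assumptions.

```python
# Sketch: training with server-side KMS-encrypted data (SageMaker Python SDK v1).
import boto3
from sagemaker.estimator import Estimator

kms_key_id = "<your-kms-key-id>"
bucket = "<your-bucket>"

# Upload training data encrypted with the KMS key (SSE-KMS).
boto3.client("s3").put_object(
    Bucket=bucket,
    Key="train/train.csv",
    Body=open("train.csv", "rb"),
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=kms_key_id,
)

# The training job's IAM role needs kms:Encrypt/kms:Decrypt on this key.
estimator = Estimator(
    image_name="<algorithm-image-uri>",
    role="<sagemaker-execution-role-arn>",
    train_instance_count=1,
    train_instance_type="ml.m5.xlarge",
    output_path="s3://{}/output".format(bucket),
    output_kms_key=kms_key_id,  # encrypt model artifacts with the same key
)
# estimator.fit({"train": "s3://{}/train".format(bucket)})
```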
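
Parquet to RecordIO-protobuf, as in Using Parquet Data: a sketch that reads a Parquet file with pandas and serializes it with the SDK's dense-tensor writer. The file and column names are assumptions.

```python
# Sketch: convert Parquet data into RecordIO-protobuf for SageMaker's built-in algorithms.
import io
import pandas as pd
import sagemaker.amazon.common as smac

df = pd.read_parquet("data.parquet")              # requires pyarrow or fastparquet
labels = df["label"].values.astype("float32")     # assumed label column
features = df.drop(columns=["label"]).values.astype("float32")

buf = io.BytesIO()
smac.write_numpy_to_dense_tensor(buf, features, labels)
buf.seek(0)

# The buffer can then be uploaded to S3, e.g.:
# boto3.resource("s3").Object("<your-bucket>", "train/data.rec").upload_fileobj(buf)
```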
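
Redshift to S3, as in Connecting to Redshift: a sketch that issues a Redshift UNLOAD through psycopg2. All connection details, the table, and the role ARN are placeholders.

```python
# Sketch: copy a query result from Redshift to S3 with UNLOAD, using psycopg2.
import psycopg2

conn = psycopg2.connect(
    host="<cluster-endpoint>", port=5439,
    dbname="<database>", user="<user>", password="<password>",
)

unload_sql = """
    UNLOAD ('SELECT * FROM my_table')
    TO 's3://<your-bucket>/redshift-export/'
    IAM_ROLE '<redshift-role-arn>'
    DELIMITER ',' ALLOWOVERWRITE;
"""
with conn, conn.cursor() as cur:
    cur.execute(unload_sql)  # writes the query result to the S3 prefix above
```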
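
Bringing a pre-trained XGBoost model to a real-time endpoint, as in Bring Your Own XGBoost Model: a sketch that wraps an existing model.tar.gz in the built-in XGBoost container and deploys it. The artifact location, role, and image version are assumptions.

```python
# Sketch: host a pre-trained XGBoost model on a real-time endpoint (SDK v1).
import boto3
from sagemaker.amazon.amazon_estimator import get_image_uri
from sagemaker.model import Model

region = boto3.Session().region_name
container = get_image_uri(region, "xgboost", "0.90-1")  # built-in XGBoost image

model = Model(
    model_data="s3://<your-bucket>/<prefix>/model.tar.gz",  # pre-trained artifact
    image=container,
    role="<sagemaker-execution-role-arn>",
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
# predictor.predict(...) now serves requests; no REST plumbing is written by hand.
```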
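
Inference pipeline, as in the SparkML and XGBoost example: a sketch that chains a SparkML pre-processing model and an XGBoost model behind a single endpoint with PipelineModel. The artifact locations, schema, image URI, and role are placeholders.

```python
# Sketch: SparkML pre-processing followed by XGBoost behind one endpoint (SDK v1).
from sagemaker.model import Model
from sagemaker.pipeline import PipelineModel
from sagemaker.sparkml.model import SparkMLModel

schema_json = "<serialized input/output schema for the SparkML container>"

sparkml_model = SparkMLModel(
    model_data="s3://<your-bucket>/sparkml/model.tar.gz",
    role="<sagemaker-execution-role-arn>",
    env={"SAGEMAKER_SPARKML_SCHEMA": schema_json},
)
xgb_model = Model(
    model_data="s3://<your-bucket>/xgboost/model.tar.gz",
    image="<xgboost-image-uri>",
    role="<sagemaker-execution-role-arn>",
)

# Requests hit the SparkML container first; its output feeds the XGBoost container.
pipeline = PipelineModel(
    name="inference-pipeline-sparkml-xgboost",
    role="<sagemaker-execution-role-arn>",
    models=[sparkml_model, xgb_model],
)
predictor = pipeline.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```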