Update deployment readme for sagemaker and lambda (#352)

* update deploy with serverless guide

* fix typo and update sagemaker deployment readme

* fix typo

* remove list docker image screenshot
yubozhao authored and parano committed Oct 18, 2019
1 parent c70166d commit 84a0b8925e2c9a24f9a846e83d68db6b1020f89f
@@ -291,7 +291,7 @@ def apply(self, deployment_pb, yatai_service, prev_deployment=None):
sagemaker_client = boto3.client('sagemaker', sagemaker_config.region)

with TempDirectory() as temp_dir:
sagemaker_project_dir = os.path.join(
temp_dir, deployment_spec.bento_name
)
init_sagemaker_project(sagemaker_project_dir, bento_path)
@@ -25,20 +25,20 @@ After you exported your model with BentoML, you can invoke `bentoml deployment c
Update `BENTO_NAME` and `BENTO_VERSION` with your saved BentoML service's information and run the following command:

```bash
bentoml deployment create sentiment-sagemaker --bento BENTO_NAME:BENTO_VERSION --platform=aws-sagemaker --region=AWS_REGION --api-name=predict
```
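As a concrete sketch, you can keep the name and version in shell variables and substitute them into the command (the values below are hypothetical, not from a real saved service):

```bash
# Hypothetical bento name and version -- use the values BentoML printed
# when you saved your service.
BENTO_NAME=SentimentLRModel
BENTO_VERSION=20191018120000_1A2B3C
# Print the fully substituted deployment command for review before running it
echo "bentoml deployment create sentiment-sagemaker --bento ${BENTO_NAME}:${BENTO_VERSION} --platform=aws-sagemaker --region=us-west-2 --api-name=predict"
```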


After you invoke the command, BentoML first generates a snapshot of this model archive in your local file system, with additional files for AWS SageMaker.

BentoML then performs several operations for the SageMaker deployment.


BentoML builds a Docker image from the snapshot and pushes the image to AWS ECR (Elastic Container Registry). If you run the `docker images` command in your terminal, you will see the built image. You can also expect to see the same image in your AWS ECR dashboard.


![ScreenShot](./aws-ecr.png)

Based on the Docker image in AWS ECR, BentoML creates a model and an endpoint configuration in AWS SageMaker.
@@ -56,8 +56,6 @@ BentoML will create SageMaker endpoint base on the endpoint configuration we cre

To test the newly deployed model, we can use the AWS CLI to make a prediction. The result will be stored in JSON format in an output file.

You can invoke the example model with the following command:

```bash
@@ -68,16 +66,22 @@ aws sagemaker-runtime invoke-endpoint \
output.json
```
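The output file is plain JSON, so it can be inspected with standard tools. A minimal sketch, using a stand-in `output.json` (the contents below are made up, not a real model's prediction):

```bash
# Create a stand-in output.json so the sketch runs without an AWS account;
# the real file is written by `aws sagemaker-runtime invoke-endpoint`.
echo '[4, 0, 4]' > output.json
# Pretty-print the stored prediction result
python3 -m json.tool output.json
```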

![ScreenShot](./test-prediction.png)

## Check deployment status

```bash
bentoml deployment describe my-sagemaker-deployment
```

![ScreenShot](./describe-deployment.png)

## Delete deployment

Deleting a SageMaker deployment is as easy as creating it.

```bash
bentoml deployment delete my-sagemaker-deployment
```

![ScreenShot](./delete-deployment.png)
@@ -6,12 +6,7 @@ Cloud providers offer serverless computing service to help teams deploy a
scalable services without worrying about hardware configuration and maintenance. The same benefit applies to machine learning as well.

In this example, we will train a sentiment analysis model with scikit-learn, deploy it to AWS Lambda, and then take a closer look at other operations on the deployed service.

## Developing sentiment analysis model

@@ -21,7 +16,6 @@ We will train a scikit-learn model, and then we will use BentoML to package it.
## Prerequisites for deploying model to AWS Lambda

* Install Node.JS. Follow the instructions on [Nodejs.org](https://nodejs.org/en)
* AWS account configured on your machine
1. Install AWS CLI. [Instructions](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html)
2. Configuring with your AWS account. [Instructions](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html)
@@ -31,11 +25,12 @@ We will train a scikit-learn model, and then we will use BentoML to package it.
It is simple to deploy to AWS Lambda with BentoML. After you have saved your model as a BentoML bundle, you only need to invoke a single command.

```bash
bentoml deployment create sentiment-serverless --bento BENTO_NAME:BENTO_VERSION --platform aws-lambda --region us-west-2
```
![ScreenShot](./deploying-to-lambda.png)

#### What happens after the deploy command
BentoML performs several actions under the hood to help data scientists deploy their model services to AWS Lambda.

BentoML will invoke AWS to create several services, which are managed by CloudFormation. When the process completes, you will see the Lambda function in service.
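If you want to inspect those resources yourself, the CloudFormation stack can be queried with the AWS CLI. The sketch below only prints the command so that it runs without an AWS account; the stack name is an assumption based on the deployment name, so check your CloudFormation dashboard for the real one:

```bash
# Assumed stack name -- BentoML derives it from the deployment, so the
# exact value may differ on your account.
STACK_NAME=sentiment-serverless
# Print the inspection command instead of executing it (no AWS account needed)
echo "aws cloudformation describe-stacks --stack-name ${STACK_NAME}"
```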

@@ -44,8 +39,7 @@ BentoML will invoke AWS to create different services. Those services managed by
Finally, you will see the Lambda function show up on your AWS Dashboard.
![ScreenShot](./lambda-dash.png)

To make a prediction request, you can use the `curl` command. Copy and paste the following command, update `data` based on your deployment, and use the endpoint from the deployment result as the `url`.

```bash
curl -i \
@@ -62,9 +56,13 @@ https://URL
bentoml deployment describe my-serverless-deployment
```

![ScreenShot](./describe-deployment.png)


## Delete deployment from AWS lambda
Deleting a deployment from AWS Lambda is as simple as creating it. To delete the deployment, use the `bentoml deployment delete` command.
```bash
bentoml deployment delete my-serverless-deployment
```

![ScreenShot](./delete-deployment.png)
@@ -272,7 +272,7 @@
"## Check deployment status\n",
"\n",
"```\n",
"bentoml deployment describe DEPLOYMENT_NAME --namespace=NAMESPACE \n",
"```\n",
"\n",
"### Arguments:\n",
@@ -288,7 +288,7 @@
"metadata": {},
"outputs": [],
"source": [
"!bentoml deployment describe sentiment-serverless"
]
},
{
@@ -316,13 +316,6 @@
"source": [
"!bentoml deployment delete sentiment-serverless"
]
}
],
"metadata": {
@@ -341,7 +334,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.2"
}
},
"nbformat": 4,
