
Commit 122d729

Added workshop materials
1 parent 740a2bc commit 122d729

625 files changed: +264290 −6 lines changed

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
**/.DS_Store
.ipynb_checkpoints/*
**/package-lock.json
**/node_modules
**/.serverless
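As a rough illustration of how these ignore patterns behave, the sketch below approximates gitignore globbing with Python's `fnmatch`. Note that `fnmatch` is not a real gitignore implementation (no negation, no directory-only rules, and `**/x` must be approximated with two patterns), so this is only a demonstration of which paths the patterns above are meant to cover.

```python
from fnmatch import fnmatch

# Approximation of the five .gitignore patterns above. "**/x" in gitignore
# matches x at any depth, so each such pattern becomes a pair here:
# one for the repository root and one for nested directories.
patterns = [
    "*/.DS_Store", ".DS_Store",                  # **/.DS_Store
    ".ipynb_checkpoints/*",                      # .ipynb_checkpoints/*
    "*/package-lock.json", "package-lock.json",  # **/package-lock.json
    "*/node_modules", "node_modules",            # **/node_modules
    "*/.serverless", ".serverless",              # **/.serverless
]

def is_ignored(path: str) -> bool:
    """Return True if any ignore pattern matches the given repo-relative path."""
    return any(fnmatch(path, p) for p in patterns)

print(is_ignored("notebooks/.DS_Store"))   # macOS metadata anywhere in the tree
print(is_ignored("notebooks/train.ipynb")) # real content is kept
```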
Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
# AWS Serverless for Deep Learning First Steps

![Powered by Jupyter Logo](https://cdn.oreillystatic.com/images/icons/powered_by_jupyter.png)

This project contains the Jupyter Notebooks and supporting files for _AWS Serverless for Deep Learning First Steps_ with Rustem Feyzkhanov.

These notebooks can be run on the O'Reilly Learning Platform [here](https://learning.oreilly.com/jupyter-notebooks/~/9781492088400).

The repository contains the exercises (/notebooks), the solutions where provided (/solutions), and any data or files needed (/data).

This is a public repository, so there is no need to create an account to download its contents. To download the source code from this page, click the 'Cloud' icon at the top right, above where the latest commit is detailed.

To download via git from your preferred terminal application, type:

```git clone https://resources.oreilly.com/binderhub/aws-serverless-for-deep-learning-first-steps```
Lines changed: 218 additions & 0 deletions
@@ -0,0 +1,218 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deep learning training pipeline\n",
"# Deploying AWS Step Functions + AWS Batch + AWS Lambda"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Installing dependencies\n",
"Here we install the dependencies needed to run the Serverless Framework."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install awscli --upgrade --user\n",
"!npm install -g serverless@1.77.0"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setting AWS environment variables\n",
"Here we set the AWS environment variables needed to deploy to our AWS account: an access key ID, a secret access key, and an account ID. Replace AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_ACCOUNT_ID with your own values. Please use a test account and temporary credentials, or deactivate the credentials after use."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env AWS_ACCESS_KEY_ID=<AWS_ACCESS_KEY_ID>\n",
"%env AWS_SECRET_ACCESS_KEY=<AWS_SECRET_ACCESS_KEY>\n",
"%env AWS_ACCOUNT_ID=<AWS_ACCOUNT_ID>\n",
"%env AWS_DEFAULT_REGION=us-east-1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Creating a role for AWS Batch"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!aws iam create-role --role-name AWSBatchServiceRole --assume-role-policy-document file://assume-batch-policy.json\n",
"!aws iam attach-role-policy --role-name AWSBatchServiceRole --policy-arn arn:aws:iam::aws:policy/service-role/AWSBatchServiceRole"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deploying the deep learning pipeline to AWS"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploying the CPU pipeline to AWS\n",
"Here we deploy a stack with AWS Batch (CPU) + AWS Step Functions + AWS Lambda. At the end of the deployment it will produce an endpoint which we can call to trigger AWS Step Functions. AWS Batch will use the publicly available CPU image [ryfeus/serverless-for-deep-learning:cpu](https://hub.docker.com/repository/docker/ryfeus/serverless-for-deep-learning/general)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env IMAGE_NAME=ryfeus/serverless-for-deep-learning:cpu\n",
"%env S3_BUCKET=serverless-for-deep-learning\n",
"%env INSTANCE_TYPE=EC2\n",
"!cd deep-learning-training-cpu; npm install\n",
"!cd deep-learning-training-cpu; serverless deploy"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploying the GPU pipeline to AWS\n",
"Here we deploy a stack with AWS Batch (GPU) + AWS Step Functions + AWS Lambda. At the end of the deployment it will produce an endpoint which we can call to trigger AWS Step Functions. AWS Batch will use the publicly available GPU image [ryfeus/serverless-for-deep-learning:latest](https://hub.docker.com/repository/docker/ryfeus/serverless-for-deep-learning/general)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env IMAGE_NAME=ryfeus/serverless-for-deep-learning:latest\n",
"%env S3_BUCKET=serverless-for-deep-learning\n",
"%env INSTANCE_TYPE=EC2\n",
"!cd deep-learning-training-gpu; npm install\n",
"!cd deep-learning-training-gpu; serverless deploy"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Calling the endpoint from the previous cell\n",
"Here we call the endpoint produced by the previous cell, which will trigger Step Functions with AWS Lambda and AWS Batch. Set ENDPOINT_URL to the endpoint printed by the deployment."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env ENDPOINT_URL=\n",
"!curl $ENDPOINT_URL"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Listing current executions and their state\n",
"Here we list all current executions of the deployed AWS Step Functions state machine. We should see the execution that was created by the request to the endpoint."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env STATE_MACHINE_NAME=DeepLearningTrainingCPU-StepFunction\n",
"!aws stepfunctions list-executions --state-machine-arn arn:aws:states:$AWS_DEFAULT_REGION:$AWS_ACCOUNT_ID:stateMachine:$STATE_MACHINE_NAME --query 'executions[*].[name,status]' --output text"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Checking a specific execution's state\n",
"Based on the results from the previous cell we can choose an execution ID and get the current state of its graph. Set EXECUTION_ID to the execution whose state you want to retrieve."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%env STATE_MACHINE_NAME=DeepLearningTrainingCPU-StepFunction\n",
"%env EXECUTION_ID=\n",
"!aws stepfunctions describe-state-machine-for-execution --execution-arn arn:aws:states:$AWS_DEFAULT_REGION:$AWS_ACCOUNT_ID:execution:$STATE_MACHINE_NAME:$EXECUTION_ID --output text --query 'definition'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Removing the CPU or GPU application\n",
"Finally, we run the following commands to remove the infrastructure we've just created."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!cd deep-learning-training-cpu; serverless remove"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!cd deep-learning-training-gpu; serverless remove"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.7"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
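Two details of the notebook above are worth making explicit. The `aws stepfunctions` cells assemble state machine and execution ARNs from shell variables, and the `create-role` cell reads a trust policy from `assume-batch-policy.json`, whose contents are not included in this diff. The Python sketch below mirrors both; the account ID, execution ID, and the policy body are assumptions for illustration (the trust policy shown is the standard one that lets the AWS Batch service assume a role, which is a plausible but unverified guess at the file's contents).

```python
import json

# Placeholder values standing in for the notebook's %env settings.
AWS_DEFAULT_REGION = "us-east-1"
AWS_ACCOUNT_ID = "123456789012"        # hypothetical account ID
STATE_MACHINE_NAME = "DeepLearningTrainingCPU-StepFunction"
EXECUTION_ID = "example-execution"     # hypothetical execution ID

# ARNs assembled exactly as the list-executions and
# describe-state-machine-for-execution commands above assemble them.
state_machine_arn = (
    f"arn:aws:states:{AWS_DEFAULT_REGION}:{AWS_ACCOUNT_ID}"
    f":stateMachine:{STATE_MACHINE_NAME}"
)
execution_arn = (
    f"arn:aws:states:{AWS_DEFAULT_REGION}:{AWS_ACCOUNT_ID}"
    f":execution:{STATE_MACHINE_NAME}:{EXECUTION_ID}"
)

# A plausible assume-batch-policy.json: the trust policy allowing the
# AWS Batch service to assume the AWSBatchServiceRole created above.
# The actual file is not shown in this commit, so treat this as a guess.
assume_batch_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "batch.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(state_machine_arn)
print(execution_arn)
print(json.dumps(assume_batch_policy, indent=2))
```

Constructing the ARNs in code like this makes it easy to see why both `AWS_ACCOUNT_ID` and `AWS_DEFAULT_REGION` must be set before the listing and describe cells will work.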
