4 changes: 4 additions & 0 deletions content/authors.en.md
@@ -14,6 +14,10 @@ weight: 100
1. Daniel Yoder ([danielsyoder](https://github.com/danielsyoder)) - The brains behind amazon-dynamodb-labs.com and the co-creator of the design scenarios

### 2025 additions
zETL Workshop update with OS pipeline changes (October 2025):
1. John Terhune ([@terhunej](https://github.com/terhunej)) - Primary author
2. Esteban Serna ([@tebanieo](https://github.com/tebanieo)) - Editor, tech reviewer, and merger

Removing Cloud9 due to End of Life from all the workshops (October 2025):
1. Esteban Serna ([@tebanieo](https://github.com/tebanieo)) - Primary author, and merger

13 changes: 6 additions & 7 deletions content/change-data-capture/overview/create-tables.en.md
@@ -9,12 +9,12 @@ In this section you create the DynamoDB tables you will use during the labs for

In the commands below, the **create-table** AWS CLI command is used to create two new tables called Orders and OrdersHistory.

It will create the Orders table in provisioned capacity mode to have 5 read capacity units (RCU), 5 write capacity units (WCU) and a partition key named `id`.
It will create the Orders table in on-demand capacity mode with a partition key named `id`.

It will also create the OrdersHistory table in provisioned capacity mode to have 5 RCU, 5 WCU, a partition key named `pk` and a sort key named `sk`.
It will also create the OrdersHistory table in on-demand capacity mode with a partition key named `pk` and a sort key named `sk`.

* Copy the **create-table** commands below and paste them into your command terminal.
* Execute the commands to create two tables named Orders and OrdersHistory.
* Execute the commands to create two tables named `Orders` and `OrdersHistory`.

```bash
aws dynamodb create-table \
@@ -23,8 +23,7 @@
AttributeName=id,AttributeType=S \
--key-schema \
AttributeName=id,KeyType=HASH \
--provisioned-throughput \
ReadCapacityUnits=5,WriteCapacityUnits=5 \
--billing-mode PAY_PER_REQUEST \
--query "TableDescription.TableStatus"

aws dynamodb create-table \
@@ -35,9 +34,9 @@
--key-schema \
AttributeName=pk,KeyType=HASH \
AttributeName=sk,KeyType=RANGE \
--provisioned-throughput \
ReadCapacityUnits=5,WriteCapacityUnits=5 \
--billing-mode PAY_PER_REQUEST \
--query "TableDescription.TableStatus"

```

Run the command below to confirm that both tables have been created.
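One way to check, shown here as a minimal sketch that assumes the AWS CLI is configured with your default region, is to wait for both tables to become `ACTIVE` and then list them:

```bash
# Block until each table finishes creating (table names from the steps above)
aws dynamodb wait table-exists --table-name Orders
aws dynamodb wait table-exists --table-name OrdersHistory

# List table names to confirm both exist
aws dynamodb list-tables --query "TableNames"
```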
11 changes: 5 additions & 6 deletions content/change-data-capture/setup/aws-ws-event.en.md
@@ -7,30 +7,29 @@ chapter: true

### Login to AWS Workshop Studio Portal

1. If you are provided a one-click join link, skip to step 3.
1. If you are provided a one-click join link, use it and skip to step 3.

2. Visit [https://catalog.us-east-1.prod.workshops.aws](https://catalog.us-east-1.prod.workshops.aws). If you attended any other workshop earlier on this portal, please log out first. Click **Get Started** on the right-hand side of the window.

![Workshop Studio Landing Page](/static/images/aws-ws-event1.png)

3. On the **Sign in** page that follows, choose **Email One-Time Passcode (OTP)** to sign in to your workshop page.

![Sign in page](/static/images/aws-ws-event2.png)

4. Provide an email address to receive a one-time passcode.

![Email address input](/static/images/aws-ws-event3.png)

5. Enter the passcode that was sent to the email address you provided, and click **Sign in**.

6. Next, in the text box, enter the event access code (e.g., abcd-012345-ef) that you received from the event facilitators. If you are provided a one-click join link, you will be redirected to the next step automatically.

![Event access code](/static/images/aws-ws-event4.png)

7. Select **I agree with the Terms and Conditions** at the bottom of the next page and click **Join event** to continue to the event dashboard.

8. On the event dashboard, click **Open AWS console** to federate into the AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.
![Event dashboard](/static/images/common/workshop-studio-01.png)

9. In addition to the AWS console, you should open your Visual Studio Code server by clicking the `VSCodeServerURL` link, available in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.

![Event dashboard](/static/images/aws-ws-event5.png)
![Event dashboard](/static/images/common/workshop-studio-02.png)

Now that you are set up, continue on to: :link[2. Scenario Overview]{href="/change-data-capture/overview"}.
2 changes: 1 addition & 1 deletion content/change-data-capture/setup/index.en.md
@@ -14,7 +14,7 @@ To run this lab, you will need an AWS account, and a user identity with access t
* Amazon Kinesis
* AWS Lambda
* Amazon Simple Queue Service
* AWS Cloud9 Environment
* Visual Studio Code

You can use your own account, or an account provided through Workshop Studio as part of an AWS organized workshop. Using an account provided by Workshop Studio is the easier path, as you will have full access to all AWS services, and the account will terminate automatically when the event is over.

37 changes: 14 additions & 23 deletions content/change-data-capture/setup/user-account.en.md
@@ -6,40 +6,31 @@ chapter: true
---


::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/event-driven-architecture/setup/start-here/aws-ws-event"}]
::alert[These setup instructions are identical for LADV, LHOL, LBED, LMR, and LGME - all of which use the same Visual Studio Code template. Only complete this section once, and only if you're running it on your own account.]{type="warning"}

## Create a Cloud9 Environment
::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/hands-on-labs/setup/aws-ws-event"}]

To complete the steps in these labs, you need an IAM role that has the privileges to create, update and delete AWS Cloud9 environments, Lambda functions, DynamoDB tables, IAM roles, Kinesis Data Streams and DynamoDB Streams.
## Launch the CloudFormation stack
::alert[During the course of the lab, you will create DynamoDB tables that incur a cost that could approach tens or hundreds of dollars per day. Ensure you delete the DynamoDB tables using the DynamoDB console, and make sure you delete the CloudFormation stack as soon as the lab is complete.]

* Log into the AWS Management Console, go to the AWS Cloud9 service dashboard then select **Create environment**.
1. **[Deprecated]** - Launch the CloudFormation template in US West 2 to deploy the resources in your account: [![CloudFormation](/static/images/cloudformation-launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=DynamoDBID&templateURL=:param{key="design_patterns_s3_lab_yaml"})

![Create Cloud9 environment](/static/images/change-data-capture/setup/cloud9-create-env.png)
1. *Optionally, download [the YAML template](https://github.com/aws-samples/aws-dynamodb-examples/blob/master/workshops/modernizer/modernizer-db.yaml) from our GitHub repository and launch it on your own; a CLI sketch appears after these steps.*

* Give your new environment a name - **DynamoDB Labs** then provide an optional description for the environment.
1. Click *Next* on the first dialog.

![Name Cloud9 environment](/static/images/change-data-capture/setup/cloud9-name-env.png)
1. Provide a CloudFormation stack name.

* Select **t2.small** as your instance type, leave all other fields as the default values then select **Create**.
1. In the Parameters section, note that **AllowedIP** contains a default IP address. If you want to access the instance via SSH, replace it with your own public IP address, making sure to append the `/32` network mask at the end. Do not modify any other parameter and click *Next*.

![Select Cloud9 instance](/static/images/change-data-capture/setup/cloud9-select-ec2.png)
![CloudFormation parameters](/static/images/common/on-your-own-cf-01.png)

* Wait for creation of your Cloud9 environment to complete then select **Open** to launch your Cloud9 environment.
6. Scroll to the bottom and click *Next*, then review the *Template* and *Parameters*. When you are ready to create the stack, scroll to the bottom, check the box acknowledging the creation of IAM resources, and click *Create stack*.

![Launch Cloud9 environment](/static/images/change-data-capture/setup/cloud9-launch-env.png)
![CloudFormation parameters](/static/images/common/on-your-own-cf-02.png)

The stack will create a Visual Studio Code EC2 instance, a role for the instance, and a role for the AWS Lambda function used later on in the lab. The CloudFormation template will also create a set of folders you can use to run each lab module in this guide individually.
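If you chose to launch the downloaded template yourself, a minimal CLI sketch looks like the following; the stack name and the `AllowedIP` value are illustrative placeholders, not fixed values.

```bash
# Deploy the downloaded template; CAPABILITY_NAMED_IAM acknowledges the IAM resources it creates
aws cloudformation create-stack \
    --stack-name dynamodb-labs \
    --template-body file://modernizer-db.yaml \
    --parameters ParameterKey=AllowedIP,ParameterValue=203.0.113.10/32 \
    --capabilities CAPABILITY_NAMED_IAM \
    --region us-west-2
```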

Start a command line terminal in Cloud9 and set up the `Region` and `Account ID` environment variables.

```bash
export REGION={your aws region} &&
export ACCOUNT_ID={your aws account ID}
```

Install jq on your AWS Cloud9 environment using the command below.

```bash
sudo yum install jq -y
```

::alert[*After completing the workshop, remember to complete the :link[Clean Up]{href="/change-data-capture/clean-up"} section to remove AWS resources that you no longer require.*]
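A minimal clean-up sketch from the CLI (the table names below match this lab; the stack name is whatever you chose when creating it):

```bash
# Remove the lab tables, then the stack that provisioned the environment
aws dynamodb delete-table --table-name Orders
aws dynamodb delete-table --table-name OrdersHistory
aws cloudformation delete-stack --stack-name dynamodb-labs
```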

76 changes: 45 additions & 31 deletions content/rdbms-migration/migration-chapter03.en.md
@@ -10,47 +10,57 @@ The dataset has over 106K movies, ratings, votes, and cast/crew information.

The CloudFormation template launched an EC2 Amazon Linux 2 instance with MySQL installed and running.
It created a MySQL database called `imdb`, added 6 new tables (one for each IMDb dataset), downloaded the IMDb TSV files to MySQL server local directory, and loaded the file contents into the 6 tables.
The CloudFormation template also configured a remote MySQL user based on input parameters for the template.
To explore the dataset, follow the instructions below to log in to the EC2 server.

1. Go to [EC2 console](https://console.aws.amazon.com/ec2/v2/home#Instances:instanceState=running).
2. Select the MySQL Instance and click **Connect**.
![Final Deployment Architecture](/static/images/migration9.jpg)
3. Make sure `ec2-user` is in the **User name** field. Click **Connect**.
![Final Deployment Architecture](/static/images/migration10.jpg)
4. Elevate your privileges using the `sudo` command.
```bash
sudo su
```
![Final Deployment Architecture](/static/images/migration11.jpg)
5. Go to the file directory.
```bash
cd /var/lib/mysql-files/
ls -lrt
```
6. You can see all 6 files copied from the IMDb dataset to the local EC2 directory.
![Final Deployment Architecture](/static/images/migration12.jpg)
7. Feel free to explore the files.
8. If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.

On the event dashboard, click on **Open AWS console** to federate into AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions.

![Event dashboard](/static/images/common/workshop-studio-01.png)

In addition to the AWS console, you should open your Visual Studio Code server by clicking the `VSCodeServerURL` link, available in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`.

![Event dashboard](/static/images/common/workshop-studio-02.png)

During the first 60 seconds, the environment will automatically update extensions and plugins. Any startup notification can be safely dismissed.

![VS Code Setup](/static/images/common/common-vs-code-01.png)

If a terminal is not available at the bottom left side of your screen, please open a new one as the following picture indicates.

![VS Code Setup](/static/images/common/common-vs-code-02.png)

In the terminal, type:

```bash
cd LDMS

```

If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**.

::alert[_If you are completing this workshop in your AWS account copy the DbMasterUsername and DbMasterPassword from the CloudFormation stack used to configure the MySQL environment._]

![Final Deployment Architecture](/static/images/migration13.jpg)
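If you prefer the terminal, a hedged shortcut, assuming the stack is named **ddb** as at AWS hosted events, is to read the parameters with the CLI:

```bash
# Print the MySQL credentials stored as stack parameters
aws cloudformation describe-stacks --stack-name ddb \
  --query "Stacks[0].Parameters[?starts_with(ParameterKey,'DbMaster')].[ParameterKey,ParameterValue]" \
  --output table
```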
9. Go back to EC2 Instance console and login to mysql.

Go to the terminal and log in to MySQL.
```bash
mysql -u DbMasterUsername -pDbMasterPassword
mysql -u dbuser -p
```
![Final Deployment Architecture](/static/images/migration14.jpg)
10. Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.

![Final Deployment Architecture](/static/images/LDMS/mysql-connecting.png)

Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets.

```bash
use imdb;
```
![Final Deployment Architecture](/static/images/migration15.jpg)
11. List all the tables created by the CloudFormation stack.

![Final Deployment Architecture](/static/images/LDMS/mysql-use-imdb.png)

List all the tables created by the CloudFormation stack.
```bash
show tables;
```
![Final Deployment Architecture](/static/images/migration16.jpg)

![Final Deployment Architecture](/static/images/LDMS/mysql-show-tables.png)
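Optionally, to gauge how much data was loaded, the query below reads approximate row counts from the information schema (for InnoDB tables these counts are estimates):

```bash
SELECT table_name, table_rows FROM information_schema.tables WHERE table_schema = 'imdb';
```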

For illustration purposes, below is a logical diagram that represents the relationships between the various source tables hosting the IMDb dataset.

@@ -60,10 +70,14 @@ For illustration purposes, below is a logical diagram represents relationship be
- `title_principals` has cast and crew information. It has a 1\:many relationship with the `title_basics` table.
- `title_crew` has writer and director information. It has a 1:1 relationship with the `title_basics` table.
- `name_basics` has cast and crew details. Every member has unique `nconst` value assigned.
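To see the 1:many relationship in practice, you can run a short illustrative query against these tables; `primaryTitle` is assumed from the standard IMDb schema:

```bash
-- Count cast/crew principals per title for a handful of movies (illustrative)
SELECT tb.tconst, tb.primaryTitle, COUNT(*) AS principal_count
FROM title_basics tb
JOIN title_principals tp ON tp.tconst = tb.tconst
GROUP BY tb.tconst, tb.primaryTitle
LIMIT 5;
```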
![Final Deployment Architecture](/static/images/migration31.jpg)

12. We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste into the MySQL command line.

![Final Deployment Architecture](/static/images/migration31.jpg)

We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste it into the MySQL command line.

We will discuss the details around the target data model in the next chapter.

```bash
CREATE VIEW imdb.movies AS\
SELECT tp.tconst,\
6 changes: 3 additions & 3 deletions content/relational-migration/data migration/index2.en.md
@@ -11,23 +11,23 @@ set into S3. We can run this script in preview mode by using the "stdout" parame

1. Run:
```bash
python3 mysql_s3.py Customers stdout
python mysql_s3.py Customers stdout
```
You should see results in DynamoDB JSON format:

![mysql_s3.py output](/static/images/relational-migration/mysql_s3_output.png)

2. Next, run it for our view:
```bash
python3 mysql_s3.py vCustOrders stdout
python mysql_s3.py vCustOrders stdout
```
You should see similar output from the view results.

The script can write these to S3 for us. We just need to omit the "stdout" command-line parameter.

3. Now, run the script without preview mode:
```bash
python3 mysql_s3.py Customers
python mysql_s3.py Customers
```
You should see confirmation that objects have been written to S3:
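To double-check from the terminal, you can also list the objects yourself; the bucket name below is a placeholder, so substitute the one created for the workshop:

```bash
# List exported objects (replace the bucket name with your own)
aws s3 ls s3://your-migration-bucket/ --recursive
```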
