diff --git a/content/authors.en.md b/content/authors.en.md index 4a3aa8fa..a5337a44 100644 --- a/content/authors.en.md +++ b/content/authors.en.md @@ -14,6 +14,10 @@ weight: 100 1. Daniel Yoder ([danielsyoder](https://github.com/danielsyoder)) - The brains behind amazon-dynamodb-labs.com and the co-creator of the design scenarios ### 2025 additions +zETL Workshop update with OS pipeline changes (October 2025): +1. John Terhune - ([@terhunej](https://github.com/terhunej)) - Primary author +2. Esteban Serna - ([@tebanieo](https://github.com/tebanieo)) - Editor, Tech reviewer and merger. + Removing Cloud9 due to End of Life from all the workshops (October 2025): 1. Esteban Serna ([@tebanieo](https://github.com/tebanieo)) - Primary author, and merger diff --git a/content/change-data-capture/overview/create-tables.en.md b/content/change-data-capture/overview/create-tables.en.md index 41f75846..ebae73cd 100644 --- a/content/change-data-capture/overview/create-tables.en.md +++ b/content/change-data-capture/overview/create-tables.en.md @@ -9,12 +9,12 @@ In this section you create the DynamoDB tables you will use during the labs for In the commands below, the **create-table** AWS CLI command is used to create two new tables called Orders and OrdersHistory. -It will create the Orders table in provisioned capacity mode to have 5 read capacity units (RCU), 5 write capacity uints (WCU) and a partition key named `id`. +It will create the Orders table in on-demand capacity mode with a partition key named `id`. -It will also create the OrdersHistory table in provisioned capacity mode to have 5 RCU, 5 WCU, a partition key named `pk` and a sort key named `sk`. +It will also create the OrdersHistory table in on-demand capacity mode with a partition key named `pk` and a sort key named `sk`. * Copy the **create-table** commands below and paste them into your command terminal. -* Execute the commands to to create two tables named Orders and OrdersHistory. 
+* Execute the commands to create two tables named `Orders` and `OrdersHistory`. ```bash aws dynamodb create-table \ @@ -23,8 +23,7 @@ aws dynamodb create-table \ AttributeName=id,AttributeType=S \ --key-schema \ AttributeName=id,KeyType=HASH \ - --provisioned-throughput \ - ReadCapacityUnits=5,WriteCapacityUnits=5 \ + --billing-mode PAY_PER_REQUEST \ --query "TableDescription.TableStatus" aws dynamodb create-table \ @@ -35,9 +34,9 @@ aws dynamodb create-table \ --attribute-definitions \ AttributeName=pk,AttributeType=S \ AttributeName=sk,AttributeType=S \ --key-schema \ AttributeName=pk,KeyType=HASH \ AttributeName=sk,KeyType=RANGE \ - --provisioned-throughput \ - ReadCapacityUnits=5,WriteCapacityUnits=5 \ + --billing-mode PAY_PER_REQUEST \ --query "TableDescription.TableStatus" + ``` Run the command below to confirm that both tables have been created. diff --git a/content/change-data-capture/setup/aws-ws-event.en.md b/content/change-data-capture/setup/aws-ws-event.en.md index ea9fc093..fcc4816c 100644 --- a/content/change-data-capture/setup/aws-ws-event.en.md +++ b/content/change-data-capture/setup/aws-ws-event.en.md @@ -7,30 +7,29 @@ chapter: true ### Login to AWS Workshop Studio Portal -1. If you are provided a one-click join link, skip to step 3. +1. If you are provided a one-click join link, use it and skip to step 3. 2. Visit [https://catalog.us-east-1.prod.workshops.aws](https://catalog.us-east-1.prod.workshops.aws). If you attended any other workshop earlier on this portal, please logout first. Click on **Get Started** on the right hand side of the window. - ![Workshop Studio Landing Page](/static/images/aws-ws-event1.png) 3. On the next, **Sign in** page, choose **Email One-Time Passcode (OTP)** to sign in to your workshop page. - ![Sign in page](/static/images/aws-ws-event2.png) 4. Provide an email address to receive a one-time passcode. - ![Email address input](/static/images/aws-ws-event3.png) 5. Enter the passcode that you received in the provided email address, and click **Sign in**. 6.
Next, in the textbox, enter the event access code (eg: abcd-012345-ef) that you received from the event facilitators. If you are provided a one-click join link, you will be redirected to the next step automatically. - ![Event access code](/static/images/aws-ws-event4.png) 7. Select on **I agree with the Terms and Conditions** on the bottom of the next page and click **Join event** to continue to the event dashboard. 8. On the event dashboard, click on **Open AWS console** to federate into AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions. +![Event dashboard](/static/images/common/workshop-studio-01.png) + +9. In addition to the AWS console, open your Visual Studio Code server by clicking the `VSCodeServerURL` link available in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`. -![Event dashboard](/static/images/aws-ws-event5.png) +![Event dashboard](/static/images/common/workshop-studio-02.png) Now that you are set up, continue on to: :link[2. Scenario Overview]{href="/change-data-capture/overview"}. diff --git a/content/change-data-capture/setup/index.en.md b/content/change-data-capture/setup/index.en.md index 9beea65e..460d2f9c 100644 --- a/content/change-data-capture/setup/index.en.md +++ b/content/change-data-capture/setup/index.en.md @@ -14,7 +14,7 @@ To run this lab, you will need an AWS account, and a user identity with access t * Amazon Kinesis * AWS Lambda * Amazon Simple Queue Service -* AWS Cloud9 Environment +* Visual Studio Code You can use your own account, or an account provided through Workshop Studio as part of an AWS organized workshop. Using an account provided by Workshop Studio is the easier path, as you will have full access to all AWS services, and the account will terminate automatically when the event is over.
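For readers running the labs in their own account, the service access above can be sketched as an IAM policy. The fragment below is illustrative only (the wildcard actions and the exact service set are assumptions, not an official workshop policy; scope permissions down for any long-lived account):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "WorkshopLabAccess",
      "Effect": "Allow",
      "Action": [
        "dynamodb:*",
        "kinesis:*",
        "lambda:*",
        "sqs:*"
      ],
      "Resource": "*"
    }
  ]
}
```

An account provided by Workshop Studio already has equivalent access, so no policy work is needed on that path.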
diff --git a/content/change-data-capture/setup/user-account.en.md b/content/change-data-capture/setup/user-account.en.md index 8a1252ec..e0a6955b 100644 --- a/content/change-data-capture/setup/user-account.en.md +++ b/content/change-data-capture/setup/user-account.en.md @@ -6,40 +6,31 @@ chapter: true --- -::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/event-driven-architecture/setup/start-here/aws-ws-event"}] +::alert[These setup instructions are identical for LADV, LHOL, LBED, LMR, and LGME - all of which use the same Visual Studio Code template. Only complete this section once, and only if you're running it on your own account.]{type="warning"} -## Create a Cloud9 Environment +::alert[Only complete this section if you are running the workshop on your own. If you are at an AWS hosted event (such as re\:Invent, Immersion Day, etc), go to :link[At an AWS hosted Event]{href="/hands-on-labs/setup/aws-ws-event"}] -To complete the steps in these labs, you need an IAM role that has the privileges to create, update and delete AWS Cloud9 environments, Lambda functions, DynamoDB tables, IAM roles, Kinesis Data Streams and DynamoDB Streams +## Launch the CloudFormation stack +::alert[During the course of the lab, you will create DynamoDB tables that will incur a cost that could approach tens or hundreds of dollars per day. Ensure you delete the DynamoDB tables using the DynamoDB console, and make sure you delete the CloudFormation stack as soon as the lab is complete.] -* Log into the AWS Management Console, go to the AWS Cloud9 service dashboard then select **Create environment**. +1.
**[Deprecated]** - Launch the CloudFormation template in US West 2 to deploy the resources in your account: [![CloudFormation](/static/images/cloudformation-launch-stack.png)](https://console.aws.amazon.com/cloudformation/home?region=us-west-2#/stacks/new?stackName=DynamoDBID&templateURL=:param{key="design_patterns_s3_lab_yaml"}) -![Create Cloud9 environment](/static/images/change-data-capture/setup/cloud9-create-env.png) +1. *Optionally, download [the YAML template](https://github.com/aws-samples/aws-dynamodb-examples/blob/master/workshops/modernizer/modernizer-db.yaml) from our GitHub repository and launch it on your own* -* Give your new environment a name - **DynamoDB Labs** then provide an optional description for the environment. +1. Click *Next* on the first dialog. -![Name Cloud9 environment](/static/images/change-data-capture/setup/cloud9-name-env.png) +1. Provide a CloudFormation stack name. -* Select **t2.small** as your instance type, leave all other fields as the default values then select **Create**. +1. In the Parameters section, note that *AllowedIP* contains a default IP address. If you want to access the instance via SSH, obtain your own public IP address and be sure to add the `/32` network mask at the end. Do not modify any other parameter and click *Next*. -![Select Cloud9 instance](/static/images/change-data-capture/setup/cloud9-select-ec2.png) +![CloudFormation parameters](/static/images/common/on-your-own-cf-01.png) -* Wait for creation of your Cloud9 environment to complete then select **Open** to launch your Cloud9 evironment. +6. Scroll to the bottom and click *Next*, and then review the *Template* and *Parameters*. When you are ready to create the stack, scroll to the bottom, check the box acknowledging the creation of IAM resources, and click *Create stack*.
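The *AllowedIP* step above can be sketched in a terminal. This is a hedged example, not part of the template: the address is a stand-in from the TEST-NET-3 documentation range, and `https://checkip.amazonaws.com` is one common way to look up your public IP:

```shell
# Build the AllowedIP parameter value with the required /32 network mask.
# MY_IP is a stand-in; substitute your own public address
# (for example, from https://checkip.amazonaws.com).
MY_IP="203.0.113.25"

case "$MY_IP" in
  */32) ALLOWED_IP="$MY_IP" ;;        # mask already present, use as-is
  *)    ALLOWED_IP="${MY_IP}/32" ;;   # append the /32 mask
esac

echo "$ALLOWED_IP"   # prints 203.0.113.25/32
```

Paste the resulting value into the *AllowedIP* parameter field before clicking *Next*.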
-![Launch Cloud9 environment](/static/images/change-data-capture/setup/cloud9-launch-env.png) +![CloudFormation parameters](/static/images/common/on-your-own-cf-02.png) + + The stack will create a Visual Studio Code EC2 instance, a role for the instance, and a role for the AWS Lambda function used later on in the lab. The CloudFormation template will create a set of folders that can be used to run the lab modules presented in this guide individually. -Start a command line terminal in Cloud9 and set up the `Region` and `Account ID` environment variables. - -```bash -export REGION={your aws region} && -export ACCOUNT_ID={your aws account ID} -``` - -Install jq on your AWS Cloud9 environment using the command below. - -```bash -sudo yum install jq -y -``` ::alert[*After completing the workshop, remember to complete the :link[Clean Up]{href="/change-data-capture/clean-up"} section to remove AWS resources that you no longer require.*] diff --git a/content/rdbms-migration/migration-chapter03.en.md b/content/rdbms-migration/migration-chapter03.en.md index 88cf1841..22bd2f0b 100644 --- a/content/rdbms-migration/migration-chapter03.en.md +++ b/content/rdbms-migration/migration-chapter03.en.md @@ -10,47 +10,57 @@ The dataset has over 106K movies, ratings, votes, and cast/crew information. The CloudFormation template launched an EC2 Amazon Linux 2 instance with MySQL installed and running. It created a MySQL database called `imdb`, added 6 new tables (one for each IMDb dataset), downloaded the IMDb TSV files to MySQL server local directory, and loaded the file contents into the 6 tables. -The CloudFormation template also configured a remote MySQL user based on input parameters for the template. -To explore the dataset, follow the instructions below to log in to the EC2 server. - - 1. Go to [EC2 console](https://console.aws.amazon.com/ec2/v2/home#Instances:instanceState=running). - 2. Select the MySQL Instance and click **Connect**.
- ![Final Deployment Architecture](/static/images/migration9.jpg) - 3. Make sure `ec2-user` is in the **User name** field. Click **Connect**. - ![Final Deployment Architecture](/static/images/migration10.jpg) - 4. Elevate your privileges using the `sudo` command. - ```bash - sudo su - ``` - ![Final Deployment Architecture](/static/images/migration11.jpg) - 5. Go to the file directory. - ```bash - cd /var/lib/mysql-files/ - ls -lrt - ``` - 6. You can see all the 6 files copied from the IMDB dataset to the local EC2 directory. - ![Final Deployment Architecture](/static/images/migration12.jpg) - 7. Feel free to explore the files. - 8. If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**. + +On the event dashboard, click on **Open AWS console** to federate into the AWS Management Console in a new tab. On the same page, click **Get started** to open the workshop instructions. + +![Event dashboard](/static/images/common/workshop-studio-01.png) + +In addition to the AWS console, open your Visual Studio Code server by clicking the `VSCodeServerURL` link available in the "Event Outputs" section. When prompted for a password, use the value from `VSCodeServerPassword`. + +![Event dashboard](/static/images/common/workshop-studio-02.png) + +During the first 60 seconds, the environment will automatically update extensions and plugins. Any startup notification can be safely dismissed. + +![VS Code Setup](/static/images/common/common-vs-code-01.png) + +If a terminal is not available at the bottom left side of your screen, please open a new one as the following picture indicates.
+ +![VS Code Setup](/static/images/common/common-vs-code-02.png) + +In the terminal, type: + +```bash +cd LDMS + +``` + + If you are completing this workshop at an AWS hosted event, go to [AWS CloudFormation Console](https://console.aws.amazon.com/cloudformation/home#/stacks?filteringStatus=active&filteringText=&viewNested=true&hideStacks=false) and select the stack named **ddb**. Go to the **Parameters** tab and copy the username and password listed next to **DbMasterUsername** and **DbMasterPassword**. ::alert[_If you are completing this workshop in your AWS account copy the DbMasterUsername and DbMasterPassword from the CloudFormation stack used to configure the MySQL environment._] ![Final Deployment Architecture](/static/images/migration13.jpg) - 9. Go back to EC2 Instance console and login to mysql. + + Go to the terminal and log in to MySQL. ```bash - mysql -u DbMasterUsername -pDbMasterPassword + mysql -u dbuser -p ``` - ![Final Deployment Architecture](/static/images/migration14.jpg) -10. Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets. + + ![Final Deployment Architecture](/static/images/LDMS/mysql-connecting.png) + +Congratulations! You are now connected to a self-managed MySQL source database on EC2. In the following steps, we will explore the database and tables hosting IMDb datasets. + ```bash use imdb; ``` - ![Final Deployment Architecture](/static/images/migration15.jpg) -11. List all the tables created by the CloudFormation stack. + + ![Final Deployment Architecture](/static/images/LDMS/mysql-use-imdb.png) + +List all the tables created by the CloudFormation stack.
```bash show tables; ``` - ![Final Deployment Architecture](/static/images/migration16.jpg) + + ![Final Deployment Architecture](/static/images/LDMS/mysql-show-tables.png) For illustration purposes, below is a logical diagram represents relationship between various source tables hosting IMDb dataset. @@ -60,10 +70,14 @@ For illustration purposes, below is a logical diagram represents relationship be - `title_principals` has cast and crew information. It has a 1\:many relationship with the `title_basics` table. - `title_crew` has writer and director information. It has a 1:1 relationship with the `title_basics` table. - `name_basics` has cast and crew details. Every member has unique `nconst` value assigned. - ![Final Deployment Architecture](/static/images/migration31.jpg) -12. We will create a denormalized view with 1:1 static information and get it ready for migration to Amazon DynamoDB table. For now, go ahead and copy the code below and paste into the MySQL command line. + +![Final Deployment Architecture](/static/images/migration31.jpg) + +We will create a denormalized view with 1:1 static information and get it ready for migration to an Amazon DynamoDB table. For now, go ahead and copy the code below and paste it into the MySQL command line. + We will discuss the details around the target data model in the next chapter. + ```bash CREATE VIEW imdb.movies AS\ SELECT tp.tconst,\ diff --git a/content/relational-migration/data migration/index2.en.md b/content/relational-migration/data migration/index2.en.md index ca66d86c..bc52c5ff 100644 --- a/content/relational-migration/data migration/index2.en.md +++ b/content/relational-migration/data migration/index2.en.md @@ -11,7 +11,7 @@ set into S3. We can run this script in preview mode by using the "stdout" parame 1. Run: ```bash -python3 mysql_s3.py Customers stdout +python mysql_s3.py Customers stdout ``` You should see results in DynamoDB JSON format: 2.
Next, run it for our view: ```bash -python3 mysql_s3.py vCustOrders stdout +python mysql_s3.py vCustOrders stdout ``` You should see similar output from the view results. @@ -27,7 +27,7 @@ The script can write these to S3 for us. We just need to omit the "stdout" comma 3. Now, run the script without preview mode: ```bash -python3 mysql_s3.py Customers +python mysql_s3.py Customers ``` You should see confirmation that objects have been written to S3: diff --git a/design-patterns/cloudformation/C9.yaml b/design-patterns/cloudformation/vscode.yaml similarity index 93% rename from design-patterns/cloudformation/C9.yaml rename to design-patterns/cloudformation/vscode.yaml index a5bff7ec..a09b1cea 100644 --- a/design-patterns/cloudformation/C9.yaml +++ b/design-patterns/cloudformation/vscode.yaml @@ -1117,132 +1117,7 @@ Resources: exit 1 } - # Setup IMDB database and tables - echo "Setting up IMDB database..." - mysql -u root -p"$DbMasterPassword" -e "CREATE DATABASE IF NOT EXISTS imdb;" || { - echo "ERROR: Failed to create imdb database" - exit 1 - } - - # Download and extract data files - echo "Downloading IMDB data files..." - cd /var/lib/mysql-files/ || { - echo "ERROR: Could not change to mysql-files directory" - exit 1 - } - - # Download with retry - retry_command curl -L -o rdbms-migration.zip https://www.amazondynamodblabs.com/static/rdbms-migration/rdbms-migration.zip || { - echo "ERROR: Failed to download data files" - exit 1 - } - - # Extract files - echo "Extracting data files..." - unzip -q rdbms-migration.zip || { - echo "ERROR: Failed to extract data files" - exit 1 - } - - # Set proper permissions - chmod 644 *.tsv 2>/dev/null || echo "Warning: Could not set permissions on TSV files" - chown mysql:mysql *.tsv 2>/dev/null || echo "Warning: Could not change ownership of TSV files" - - # Create tables - echo "Creating IMDB tables..." 
- mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.title_akas ( - titleId VARCHAR(200), - ordering VARCHAR(200), - title VARCHAR(1000), - region VARCHAR(1000), - language VARCHAR(1000), - types VARCHAR(1000), - attributes VARCHAR(1000), - isOriginalTitle VARCHAR(5), - PRIMARY KEY (titleId, ordering) - );" || echo "Warning: Failed to create title_akas table" - - mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.title_basics ( - tconst VARCHAR(200), - titleType VARCHAR(1000), - primaryTitle VARCHAR(1000), - originalTitle VARCHAR(1000), - isAdult VARCHAR(1000), - startYear VARCHAR(1000), - endYear VARCHAR(1000), - runtimeMinutes VARCHAR(1000), - genres VARCHAR(1000), - PRIMARY KEY (tconst) - );" || echo "Warning: Failed to create title_basics table" - - mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.title_crew ( - tconst VARCHAR(200), - directors VARCHAR(1000), - writers VARCHAR(1000), - PRIMARY KEY (tconst) - );" || echo "Warning: Failed to create title_crew table" - - mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.title_principals ( - tconst VARCHAR(200), - ordering VARCHAR(200), - nconst VARCHAR(200), - category VARCHAR(1000), - job VARCHAR(1000), - characters VARCHAR(1000), - PRIMARY KEY (tconst,ordering,nconst) - );" || echo "Warning: Failed to create title_principals table" - - mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.title_ratings ( - tconst VARCHAR(200), - averageRating FLOAT, - numVotes INTEGER, - PRIMARY KEY (tconst) - );" || echo "Warning: Failed to create title_ratings table" - - mysql -u root -p"$DbMasterPassword" -e " - CREATE TABLE IF NOT EXISTS imdb.name_basics ( - nconst VARCHAR(200), - primaryName VARCHAR(1000), - birthYear VARCHAR(1000), - deathYear VARCHAR(1000), - primaryProfession VARCHAR(1000), - knownForTitles VARCHAR(1000), - PRIMARY KEY (nconst) - );" || echo "Warning: Failed to create name_basics table" 
- - # Load data with error handling - echo "Loading data into tables..." - - # Function to load data with error handling - load_data() { - local file=$1 - local table=$2 - - if [ -f "$file" ]; then - echo "Loading data from $file into $table..." - mysql -u root -p"$DbMasterPassword" -e " - LOAD DATA INFILE '/var/lib/mysql-files/$file' - IGNORE INTO TABLE imdb.$table - FIELDS TERMINATED BY '\t' - LINES TERMINATED BY '\n' - IGNORE 1 LINES;" || echo "Warning: Failed to load data from $file" - else - echo "Warning: File $file not found" - fi - } - - # Load all data files - load_data "title_ratings.tsv" "title_ratings" - load_data "title_basics.tsv" "title_basics" - load_data "title_crew.tsv" "title_crew" - load_data "title_principals.tsv" "title_principals" - load_data "name_basics.tsv" "name_basics" - load_data "title_akas.tsv" "title_akas" + # Note: IMDB database setup is now handled by the VSCode instance SSM document # Verify setup echo "Verifying database setup..." @@ -2229,6 +2104,161 @@ Resources: sudo -u ${VSCodeUser} bash -c 'cd ${VSCodeHomeFolder}/LGAM/backend && source ~/.bashrc && export NVM_DIR="$HOME/.nvm" && [ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" && npm run db:seed' - echo "Database seeding completed successfully." - echo "Database setup completed successfully." + - name: SetupIMDBDatabase + action: aws:runShellScript + inputs: + timeoutSeconds: 1200 + runCommand: + - "#!/bin/bash" + - "set -euo pipefail" + - echo "Setting up IMDB database and tables..." + - !Sub | + # Create IMDB database + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e "CREATE DATABASE IF NOT EXISTS imdb;" || { + echo "ERROR: Failed to create imdb database" + exit 1 + } + - echo "IMDB database created successfully." + - echo "Downloading IMDB data files..." 
+ - !Sub | + # Create temporary directory for IMDB data + mkdir -p /tmp/imdb-data + cd /tmp/imdb-data + - !Sub | + # Download IMDB data files with retry + for i in {1..3}; do + if curl -L -o rdbms-migration.zip https://www.amazondynamodblabs.com/static/rdbms-migration/rdbms-migration.zip; then + echo "Download successful on attempt $i" + break + else + echo "Download failed on attempt $i, retrying..." + sleep 5 + fi + done + - echo "Extracting IMDB data files..." + - !Sub | + cd /tmp/imdb-data + unzip -q rdbms-migration.zip || { + echo "ERROR: Failed to extract data files" + exit 1 + } + - echo "Creating IMDB tables..." + - !Sub | + # Create IMDB tables + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.title_akas ( + titleId VARCHAR(200), + ordering VARCHAR(200), + title VARCHAR(1000), + region VARCHAR(1000), + language VARCHAR(1000), + types VARCHAR(1000), + attributes VARCHAR(1000), + isOriginalTitle VARCHAR(5), + PRIMARY KEY (titleId, ordering) + );" || echo "Warning: Failed to create title_akas table" + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.title_basics ( + tconst VARCHAR(200), + titleType VARCHAR(1000), + primaryTitle VARCHAR(1000), + originalTitle VARCHAR(1000), + isAdult VARCHAR(1000), + startYear VARCHAR(1000), + endYear VARCHAR(1000), + runtimeMinutes VARCHAR(1000), + genres VARCHAR(1000), + PRIMARY KEY (tconst) + );" || echo "Warning: Failed to create title_basics table" + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.title_crew ( + tconst VARCHAR(200), + directors VARCHAR(1000), + writers VARCHAR(1000), + PRIMARY KEY (tconst) + );" || echo "Warning: Failed to create title_crew table" + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.title_principals ( + tconst VARCHAR(200), + ordering VARCHAR(200), + nconst VARCHAR(200), + category 
VARCHAR(1000), + job VARCHAR(1000), + characters VARCHAR(1000), + PRIMARY KEY (tconst,ordering,nconst) + );" || echo "Warning: Failed to create title_principals table" + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.title_ratings ( + tconst VARCHAR(200), + averageRating FLOAT, + numVotes INTEGER, + PRIMARY KEY (tconst) + );" || echo "Warning: Failed to create title_ratings table" + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + CREATE TABLE IF NOT EXISTS imdb.name_basics ( + nconst VARCHAR(200), + primaryName VARCHAR(1000), + birthYear VARCHAR(1000), + deathYear VARCHAR(1000), + primaryProfession VARCHAR(1000), + knownForTitles VARCHAR(1000), + PRIMARY KEY (nconst) + );" || echo "Warning: Failed to create name_basics table" + - echo "Loading IMDB data into tables..." + - !Sub | + # Copy files to MySQL secure file directory + sudo cp /tmp/imdb-data/*.tsv /var/lib/mysql-files/ 2>/dev/null || { + echo "Warning: Could not copy TSV files to MySQL secure directory" + # Try alternative approach - load from tmp directory with LOCAL INFILE + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e "SET GLOBAL local_infile=1;" + } + - !Sub | + # Function to load data with error handling + load_imdb_data() { + local file=$1 + local table=$2 + + if [ -f "/var/lib/mysql-files/$file" ]; then + echo "Loading data from $file into $table..." + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e " + LOAD DATA INFILE '/var/lib/mysql-files/$file' + IGNORE INTO TABLE imdb.$table + FIELDS TERMINATED BY '\t' + LINES TERMINATED BY '\n' + IGNORE 1 LINES;" || echo "Warning: Failed to load data from $file" + elif [ -f "/tmp/imdb-data/$file" ]; then + echo "Loading data from /tmp/imdb-data/$file into $table..." 
+ mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" --local-infile=1 -e " + LOAD DATA LOCAL INFILE '/tmp/imdb-data/$file' + IGNORE INTO TABLE imdb.$table + FIELDS TERMINATED BY '\t' + LINES TERMINATED BY '\n' + IGNORE 1 LINES;" || echo "Warning: Failed to load data from $file" + else + echo "Warning: File $file not found" + fi + } + - !Sub | + # Load all IMDB data files + cd /tmp/imdb-data + load_imdb_data "title_ratings.tsv" "title_ratings" + load_imdb_data "title_basics.tsv" "title_basics" + load_imdb_data "title_crew.tsv" "title_crew" + load_imdb_data "title_principals.tsv" "title_principals" + load_imdb_data "name_basics.tsv" "name_basics" + load_imdb_data "title_akas.tsv" "title_akas" + - echo "Verifying IMDB database setup..." + - !Sub | + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e "USE imdb; SHOW TABLES;" || echo "Warning: Could not show imdb tables" + mysql -u ${DbMasterUsername} -p"${DbMasterPassword}" -e "USE imdb; SELECT COUNT(*) as title_basics_count FROM title_basics;" || echo "Warning: Could not count title_basics records" + - echo "Cleaning up temporary files..." + - rm -rf /tmp/imdb-data + - echo "IMDB database setup completed successfully." 
- name: SetupFrontEnd action: aws:runShellScript inputs: @@ -2962,13 +2992,17 @@ Outputs: Description: Role Arn Value: !GetAtt CodeInstanceRole.Arn Export: - Name: CodeInstanceRole + Name: CodeInstanceRoleArn VSCodeServerURL: Description: VSCode-Server URL Value: !Sub https://${CloudFrontDistribution.DomainName}/?folder=${VSCodeHomeFolder}&tkn=${SecretPlaintext.password} + Export: + Name: VSCodeUrl VSCodeServerPassword: Description: VSCode-Server Password Value: !GetAtt SecretPlaintext.password + Export: + Name: VSCodePassword VSCodeServerURLModernizer: Description: VSCode-Server with Modernizer workspace Value: !Sub https://${CloudFrontDistribution.DomainName}/?folder=${VSCodeHomeFolder}/LGAM&tkn=${SecretPlaintext.password} diff --git a/static/images/LDMS/mysql-connecting.png b/static/images/LDMS/mysql-connecting.png new file mode 100644 index 00000000..98528112 Binary files /dev/null and b/static/images/LDMS/mysql-connecting.png differ diff --git a/static/images/LDMS/mysql-show-tables.png b/static/images/LDMS/mysql-show-tables.png new file mode 100644 index 00000000..86132739 Binary files /dev/null and b/static/images/LDMS/mysql-show-tables.png differ diff --git a/static/images/LDMS/mysql-use-imdb.png b/static/images/LDMS/mysql-use-imdb.png new file mode 100644 index 00000000..09cd2e03 Binary files /dev/null and b/static/images/LDMS/mysql-use-imdb.png differ diff --git a/sync.sh b/sync.sh index 2d65ad26..96b8b8b3 100755 --- a/sync.sh +++ b/sync.sh @@ -51,7 +51,7 @@ dest_dirs=( # Define source and destination file pairs src_files=( - "design-patterns/cloudformation/C9.yaml" + "design-patterns/cloudformation/vscode.yaml" ) dest_files=(