This repository contains a set of Python scripts for managing AWS services (EC2, S3, and DynamoDB) from the command line using the boto3 SDK. It was developed as part of a lab assignment for a cloud infrastructure and computing course.
aws_cli_ec2.py automates EC2-related tasks such as:
- Creating EC2 instances with optional user-data scripts
- Tagging and describing instances
- Listing all running instances
- Starting, stopping, or terminating instances
- Creating and assigning security groups
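The create-instance flow above can be sketched with boto3. This is a minimal illustration, not the repository's actual code; build_run_instances_kwargs is a hypothetical helper name, and the optional user-data script corresponds to the --script flag described below.

```python
def build_run_instances_kwargs(ami_id, instance_type, key_name,
                               sg_name, script_path=None):
    """Assemble the keyword arguments for EC2's run_instances call."""
    kwargs = {
        "ImageId": ami_id,
        "InstanceType": instance_type,
        "KeyName": key_name,
        "SecurityGroups": [sg_name],
        "MinCount": 1,   # launch exactly one instance
        "MaxCount": 1,
    }
    if script_path:
        # An optional user-data script runs once at the instance's first boot.
        with open(script_path) as f:
            kwargs["UserData"] = f.read()
    return kwargs

# The actual launch would then be (requires boto3 and AWS credentials):
# import boto3
# ec2 = boto3.client("ec2")
# ec2.run_instances(**build_run_instances_kwargs(
#     "ami-0abcdef", "t2.micro", "my-key", "my-sg"))
```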
aws_cli_s3.py handles typical operations on Amazon S3:
- Creating buckets in any region
- Uploading files and folders
- Listing objects
- Making files public
- Deleting objects and buckets
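Uploading a folder means walking the local tree and mapping each file to an S3 object key. A sketch of that mapping (object_key_for is a hypothetical helper; note that S3 keys always use forward slashes, regardless of the local OS):

```python
import os

def object_key_for(folder, file_path):
    """Map a local file path to an S3 object key relative to the folder root."""
    rel = os.path.relpath(file_path, folder)
    return rel.replace(os.sep, "/")  # S3 keys use "/" on every platform

# The upload loop would then be (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# for root, _dirs, files in os.walk(folder):
#     for name in files:
#         path = os.path.join(root, name)
#         s3.upload_file(path, bucket, object_key_for(folder, path))
```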
aws_cli_dynamodb.py performs operations on Amazon DynamoDB:
- Creating tables with a primary key
- Loading JSON data from S3
- Printing all table contents
- Deleting tables
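Creating a table with a single partition key reduces to building the schema arguments for DynamoDB's create_table call. A hedged sketch (build_table_spec is a hypothetical name; the key type S/N/B mirrors the --type flag used below, and on-demand billing is an assumption):

```python
def build_table_spec(table_name, partition_key, key_type="S"):
    """Build the schema arguments for DynamoDB's create_table call."""
    if key_type not in ("S", "N", "B"):  # string, number, or binary
        raise ValueError("key type must be S, N, or B")
    return {
        "TableName": table_name,
        "KeySchema": [{"AttributeName": partition_key, "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": partition_key, "AttributeType": key_type}
        ],
        "BillingMode": "PAY_PER_REQUEST",  # on-demand; no provisioned throughput
    }

# The actual call would be (requires boto3 and AWS credentials):
# import boto3
# dynamodb = boto3.client("dynamodb")
# dynamodb.create_table(**build_table_spec("Movies", "title"))
```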
Each script can be run from the terminal with specific commands. Use the following CLI commands to manage your Amazon EC2 instances via aws_cli_ec2.py.
# Create a new EC2 instance
python aws_cli_ec2.py create-instance --ami-id <AMI_ID> --type <INSTANCE_TYPE> --key <KEY_NAME> --sg <SECURITY_GROUP_NAME> [--script <PATH_TO_SCRIPT>]
# Modify instance tags (e.g., Name and Description)
python aws_cli_ec2.py tag --id <INSTANCE_ID> --tag Name=MyInstance --tag Description=Test
# Get detailed information about an instance
python aws_cli_ec2.py describe --id <INSTANCE_ID>
# List all instances
python aws_cli_ec2.py list
# Change the security group of an instance (e.g., open ports 22 and 80)
python aws_cli_ec2.py change-sg --id <INSTANCE_ID> --sg <SECURITY_GROUP_NAME> --ports 22,80
# Stop a running instance
python aws_cli_ec2.py stop --id <INSTANCE_ID>
# Start a stopped instance
python aws_cli_ec2.py start --id <INSTANCE_ID>
# Terminate (delete) an instance
python aws_cli_ec2.py terminate --id <INSTANCE_ID>
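The EC2 commands above suggest a subcommand-style interface. A minimal sketch of how aws_cli_ec2.py might wire them up with argparse (the structure is an assumption, not taken from the actual script; only a subset of subcommands is shown, and the repeatable --tag flag matches the tag example above):

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(prog="aws_cli_ec2.py")
    sub = parser.add_subparsers(dest="command", required=True)

    create = sub.add_parser("create-instance")
    create.add_argument("--ami-id", required=True)
    create.add_argument("--type", required=True)
    create.add_argument("--key", required=True)
    create.add_argument("--sg", required=True)
    create.add_argument("--script")  # optional user-data script path

    tag = sub.add_parser("tag")
    tag.add_argument("--id", required=True)
    tag.add_argument("--tag", action="append", default=[],
                     help="repeatable Key=Value pair")

    # Commands that only need an instance id share the same option.
    for name in ("describe", "start", "stop", "terminate"):
        sub.add_parser(name).add_argument("--id", required=True)

    sub.add_parser("list")
    return parser
```

Each parsed namespace would then dispatch to the corresponding boto3 call (run_instances, create_tags, describe_instances, and so on).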
# Create a new S3 bucket
python aws_cli_s3.py create-bucket --name <BUCKET_NAME> --region <REGION>
# Upload a file to the bucket
python aws_cli_s3.py upload-file --bucket <BUCKET_NAME> --file <FILE_PATH> [--key <OBJECT_KEY>]
# Upload an entire folder to the bucket
python aws_cli_s3.py upload-folder --bucket <BUCKET_NAME> --folder <FOLDER_PATH>
# List all files in the bucket
python aws_cli_s3.py list --bucket <BUCKET_NAME>
# Make a file publicly accessible
python aws_cli_s3.py make-public --bucket <BUCKET_NAME> --key <OBJECT_KEY>
# Delete a specific file from the bucket
python aws_cli_s3.py delete-object --bucket <BUCKET_NAME> --key <OBJECT_KEY>
# Delete the bucket (removes all contents first)
python aws_cli_s3.py delete-bucket --bucket <BUCKET_NAME>
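After make-public, an object is reachable over HTTPS at the standard virtual-hosted-style S3 URL. A small helper to compute it (public_url is a hypothetical name; the make-public step itself would set the object's ACL to "public-read"):

```python
def public_url(bucket, key, region):
    """Return the virtual-hosted-style HTTPS URL for a public S3 object."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"
```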
# Create a new DynamoDB table
python aws_cli_dynamodb.py create --name <TABLE_NAME> --key <PARTITION_KEY> --type S
# Load JSON data from an S3 file into the DynamoDB table
python aws_cli_dynamodb.py load --table <TABLE_NAME> --bucket <BUCKET_NAME> --key <S3_OBJECT_KEY>
# Retrieve all items from the table
python aws_cli_dynamodb.py get --table <TABLE_NAME>
# Delete the DynamoDB table
python aws_cli_dynamodb.py delete --table <TABLE_NAME>
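Loading JSON from S3 typically means reading the object, parsing a list of items, and writing them in batches; DynamoDB's batch_write_item accepts at most 25 items per request, so a chunking step is needed. A sketch under that assumption (chunked is a hypothetical helper; boto3's table.batch_writer() would handle the limit automatically):

```python
import json

def chunked(items, size=25):
    """Split items into lists of at most `size` (DynamoDB's batch-write limit)."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# The load step would then be (requires boto3 and AWS credentials):
# import boto3
# body = boto3.resource("s3").Object(bucket, s3_key).get()["Body"].read()
# table = boto3.resource("dynamodb").Table(table_name)
# with table.batch_writer() as writer:  # batches and retries for us
#     for item in json.loads(body):
#         writer.put_item(Item=item)
```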
Note: Before running these scripts, ensure AWS credentials are properly configured (via aws configure or ~/.aws/credentials), and that the required permissions are granted for EC2, S3, and DynamoDB.