This project demonstrates a complete serverless CRUD microservices architecture using 8 core AWS services: Lambda, API Gateway, DynamoDB, Cognito, S3, SNS, SQS, and CloudWatch. It includes comprehensive local development support with LocalStack integration and in-memory mocking for quick testing.
| AWS Service | Purpose | Implementation | Local Development |
|---|---|---|---|
| AWS Lambda | Serverless compute functions | 8 handler functions for auth, users, products, orders, files, notifications | Serverless-offline |
| AWS API Gateway | REST API endpoints & routing | HTTP events, CORS, path parameters, Cognito authorizers | Express.js server |
| AWS DynamoDB | NoSQL database | 3 tables (Users, Products, Orders) with PAY_PER_REQUEST billing | In-memory Map storage |
| AWS Cognito | User authentication & authorization | User Pool + Client with JWT token validation | Mock JWT with HMAC-SHA256 |
| AWS S3 | Object storage for files | File upload/download/list/delete with CORS | LocalStack + in-memory fallback |
| AWS SNS | Event publishing | Event notifications for CRUD operations | Console logging mock |
| AWS SQS | Message queuing | Async message processing for notifications | Console logging mock |
| AWS CloudWatch | Monitoring & logging | Custom metrics and structured logging | Local console output |
- LocalStack: Local AWS service emulation
- Serverless Framework: Infrastructure as Code
- Node.js 18.x: Runtime environment
- Jest: Unit testing framework
- Joi: Input validation
- JWT: Token-based authentication
- User, Product, Order, and Notification microservices
- AWS S3 File Storage - Upload, download, list, and delete files with LocalStack support
- JWT Authentication & Authorization - Secure API access with user-scoped permissions
- Input validation with Joi (see the schema sketch after this list)
- Messaging with SNS and SQS (mocked for local development)
- Local development with in-memory database mocking
- LocalStack Integration - Full local AWS service emulation
- In-Memory Fallback - Mock storage when LocalStack unavailable
- Unit tests with Jest
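To make the Joi validation listed above concrete, here is a minimal sketch of what a schema for the user payload used later in this guide might look like. The exact rules live in utils/validation.js; the field constraints shown here are assumptions.

```javascript
// Hypothetical sketch of a Joi schema for the user payload used in the
// examples below; the real rules are defined in utils/validation.js.
const Joi = require('joi');

const createUserSchema = Joi.object({
  name: Joi.string().min(1).max(100).required(),
  email: Joi.string().email().required(),
  age: Joi.number().integer().min(0),
  phone: Joi.string().pattern(/^\+?[0-9]{7,15}$/),
});

// validate() returns { value, error }; handlers can turn `error`
// into a 400 response before touching the database.
const { error, value } = createUserSchema.validate({
  name: 'John Doe',
  email: 'john@example.com',
  age: 30,
  phone: '+1234567890',
});
if (error) {
  console.error('Validation failed:', error.details.map((d) => d.message));
} else {
  console.log('Validated payload:', value);
}
```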
handlers/ # Lambda function handlers for each microservice
users.js # User CRUD operations
products.js # Product CRUD operations
orders.js # Order CRUD operations
notifications.js # Event processing
auth.js # Authentication (signup, signin, profile)
files.js # File storage operations (S3 integration)
utils/ # Shared utility modules
dynamodb.js # Database operations (with local mocking)
messaging.js # SNS/SQS operations (with local mocking)
mockdb.js # In-memory database for local development
mockAuth.js # In-memory authentication for local development
validation.js # Joi validation schemas
response.js # HTTP response utilities
extractUser.js # User extraction utilities
auth.js # JWT token handling
s3.js # S3 operations with LocalStack support
tests/ # Jest test files
serverless.yml # Serverless Framework configuration
package.json # Project dependencies and scripts
docker-compose.yml # Local AWS service emulation (optional)
S3_INTEGRATION.md # Detailed S3 integration documentation
- Node.js 20.x (use nvm-windows to manage versions)
- Serverless Framework (npm install -g serverless)
- Clone and install dependencies:
  git clone <your-repo>
  cd aws-tutorial-1
  npm install
- Set environment variables for local development:
  # For Git Bash/WSL
  export AWS_ACCESS_KEY_ID=test
  export AWS_SECRET_ACCESS_KEY=test
  export IS_OFFLINE=true
  # For Windows Command Prompt
  set AWS_ACCESS_KEY_ID=test
  set AWS_SECRET_ACCESS_KEY=test
  set IS_OFFLINE=true
  # For PowerShell
  $env:AWS_ACCESS_KEY_ID="test"
  $env:AWS_SECRET_ACCESS_KEY="test"
  $env:IS_OFFLINE="true"
- Start the local server:
  serverless offline
- API will be available at:
  http://localhost:3000
- Sign Up: POST http://localhost:3000/dev/auth/signup
  curl -X POST http://localhost:3000/dev/auth/signup \
    -H "Content-Type: application/json" \
    -d '{"email":"user@example.com","password":"TestPass123","name":"John Doe"}'
- Sign In: POST http://localhost:3000/dev/auth/signin
  curl -X POST http://localhost:3000/dev/auth/signin \
    -H "Content-Type: application/json" \
    -d '{"email":"user@example.com","password":"TestPass123"}'
- Get Profile: GET http://localhost:3000/dev/auth/profile (Requires JWT token)
  curl -H "Authorization: Bearer YOUR_JWT_TOKEN" http://localhost:3000/dev/auth/profile
- Create User: POST http://localhost:3000/dev/users
  curl -H "Authorization: Bearer YOUR_JWT_TOKEN" \
    -X POST http://localhost:3000/dev/users \
    -H "Content-Type: application/json" \
    -d '{"name":"John Doe","email":"john@example.com","age":30,"phone":"+1234567890"}'
- Get User: GET http://localhost:3000/dev/users/{id} (Requires Authentication*)
- Update User: PUT http://localhost:3000/dev/users/{id} (Requires Authentication*)
- Delete User: DELETE http://localhost:3000/dev/users/{id} (Requires Authentication*)
- Create Product: POST http://localhost:3000/dev/products (Requires Authentication*)
  curl -H "Authorization: Bearer YOUR_JWT_TOKEN" \
    -X POST http://localhost:3000/dev/products \
    -H "Content-Type: application/json" \
    -d '{"name":"Test Product","description":"A test product","price":29.99,"category":"Electronics"}'
- Get All Products: GET http://localhost:3000/dev/products (Public)
- Get Product: GET http://localhost:3000/dev/products/{id} (Public)
- Update Product: PUT http://localhost:3000/dev/products/{id} (Requires Authentication*)
- Delete Product: DELETE http://localhost:3000/dev/products/{id} (Requires Authentication*)
- Create Order: POST http://localhost:3000/dev/orders
- Get All Orders: GET http://localhost:3000/dev/orders
- Upload File: POST http://localhost:3000/dev/files/upload
  curl -H "Authorization: Bearer YOUR_JWT_TOKEN" \
    -X POST http://localhost:3000/dev/files/upload \
    -H "Content-Type: application/json" \
    -d '{"fileName":"document.txt","fileContent":"SGVsbG8gV29ybGQ=","contentType":"text/plain"}'
- List Files: GET http://localhost:3000/dev/files?userOnly=true
  curl -H "Authorization: Bearer YOUR_JWT_TOKEN" \
    "http://localhost:3000/dev/files?userOnly=true"
- Download File: GET http://localhost:3000/dev/files/{key}
- Delete File: DELETE http://localhost:3000/dev/files/{key}
- Generate Upload URL: POST http://localhost:3000/dev/files/upload-url
See detailed S3 integration guide in S3_INTEGRATION.md
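As a rough sketch of how the file handlers could organize objects, the snippet below builds a user-scoped key (uploads/{userId}/{timestamp}-{filename}) and points the S3 client at LocalStack's default endpoint when running offline. The bucket-name fallback, endpoint, and helper name are assumptions; see utils/s3.js and S3_INTEGRATION.md for the actual implementation.

```javascript
// Hypothetical upload helper; see utils/s3.js for the real code.
const AWS = require('aws-sdk');

const isOffline = process.env.IS_OFFLINE === 'true';

// When offline, target LocalStack (default edge port 4566).
const s3 = new AWS.S3(
  isOffline ? { endpoint: 'http://localhost:4566', s3ForcePathStyle: true } : {}
);

async function uploadUserFile(userId, fileName, base64Content, contentType) {
  // Keys are grouped per user: uploads/{userId}/{timestamp}-{filename}
  const key = `uploads/${userId}/${Date.now()}-${fileName}`;
  await s3.putObject({
    Bucket: process.env.S3_BUCKET || 'crud-microservices-files-dev', // assumed default
    Key: key,
    Body: Buffer.from(base64Content, 'base64'),
    ContentType: contentType,
  }).promise();
  return key;
}

module.exports = { uploadUserFile };
```

The upload-url endpoint presumably wraps s3.getSignedUrlPromise('putObject', ...) in the same way, returning a time-limited URL instead of writing the object directly.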
*Note: Authentication is enforced in production but bypassed in local development for easier testing.
- Local Development: Uses JWT tokens with HMAC-SHA256 signing
- Production: Uses AWS Cognito User Pools with RS256 signing
- Mock Authentication: In-memory user storage for local testing
- Seamless Switching: Automatically detects environment and switches auth methods
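A minimal sketch of that switch, assuming the IS_OFFLINE flag and the local JWT_SECRET mentioned elsewhere in this README, could look like the following; the actual logic lives in utils/auth.js and utils/mockAuth.js.

```javascript
// Hypothetical sketch of the local/production auth switch; names and
// defaults are assumptions, not the project's exact implementation.
const jwt = require('jsonwebtoken');

const isOffline = process.env.IS_OFFLINE === 'true';
const LOCAL_SECRET = process.env.JWT_SECRET || 'local-development-secret';

function issueToken(user) {
  if (isOffline) {
    // Local development: HMAC-SHA256 (HS256) token signed with a dev secret.
    return jwt.sign(
      { sub: user.id, email: user.email, name: user.name },
      LOCAL_SECRET,
      { algorithm: 'HS256', expiresIn: '24h' }
    );
  }
  // Production: Cognito issues RS256 tokens, so handlers never sign locally.
  throw new Error('Tokens are issued by Cognito in production');
}

function verifyToken(token) {
  if (isOffline) {
    return jwt.verify(token, LOCAL_SECRET);
  }
  // In production the API Gateway Cognito authorizer has already validated
  // the token; handlers read claims from event.requestContext.authorizer.
  return null;
}

module.exports = { issueToken, verifyToken };
```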
- Uses utils/mockdb.js for local development
- Data persists during the serverless offline session
- Automatically switches between mock (local) and real AWS (production)
- DynamoDB: In-memory JavaScript Map storage
- SNS/SQS: Console logging instead of actual messaging
- Cognito: JWT token generation and validation for local development
The application automatically detects local development through:
- the IS_OFFLINE=true environment variable
- the NODE_ENV=development environment variable
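A simplified sketch of that switch, with a Map per table standing in for DynamoDB, might look like this; function names are illustrative, and the real code lives in utils/dynamodb.js and utils/mockdb.js.

```javascript
// Hypothetical sketch of the mock/real database switch; see
// utils/dynamodb.js and utils/mockdb.js for the actual implementation.
const AWS = require('aws-sdk');

const isOffline =
  process.env.IS_OFFLINE === 'true' || process.env.NODE_ENV === 'development';

// One Map per table; data lives only for the serverless offline session.
const tables = new Map();
const mockTable = (name) => {
  if (!tables.has(name)) tables.set(name, new Map());
  return tables.get(name);
};

const docClient = isOffline ? null : new AWS.DynamoDB.DocumentClient();

async function putItem(tableName, item) {
  if (isOffline) {
    mockTable(tableName).set(item.id, item);
    return item;
  }
  await docClient.put({ TableName: tableName, Item: item }).promise();
  return item;
}

async function getItem(tableName, id) {
  if (isOffline) return mockTable(tableName).get(id) || null;
  const result = await docClient
    .get({ TableName: tableName, Key: { id } })
    .promise();
  return result.Item || null;
}

module.exports = { putItem, getItem };
```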
Run the unit tests:
npm test
Set these in your environment or a .env file:
- USERS_TABLE - DynamoDB table name for users
- PRODUCTS_TABLE - DynamoDB table name for products
- "Missing credentials in config" Error
  # Set dummy AWS credentials before starting serverless offline
  export AWS_ACCESS_KEY_ID=test
  export AWS_SECRET_ACCESS_KEY=test
  export IS_OFFLINE=true
- Port Already in Use
  # Kill existing Node.js processes (Windows)
  taskkill //F //IM node.exe
  # Or use a different port
  serverless offline --httpPort 3001
- Java Not Found (DynamoDB Local)
- This project uses in-memory mocking instead of DynamoDB Local
- No Java installation required for local development
- Empty Results from GET Endpoints
- Data is stored in-memory during the serverless offline session
- Create some data first using POST endpoints
- Data is lost when serverless offline is restarted
If you need to switch Node.js versions:
- Install nvm-windows:
  - Download from: https://github.com/coreybutler/nvm-windows/releases
  - Install the .exe file
- Use Node.js 20.x:
  nvm install 20.19.3
  nvm use 20.19.3
- Verify installation:
  node --version   # Should show v20.x.x
  npm --version
# Navigate to project directory
cd aws-tutorial-1
# Set environment variables (choose your platform)
# For Git Bash/WSL:
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export IS_OFFLINE=true
# For Windows Command Prompt:
set AWS_ACCESS_KEY_ID=test
set AWS_SECRET_ACCESS_KEY=test
set IS_OFFLINE=true
# For PowerShell:
$env:AWS_ACCESS_KEY_ID="test"
$env:AWS_SECRET_ACCESS_KEY="test"
$env:IS_OFFLINE="true"
# Start the serverless offline server
serverless offline
Wait for the server to start. You should see output like:
Server ready: http://localhost:3003
Step 2.1: Sign Up a New User
curl -X POST http://localhost:3003/dev/auth/signup \
-H "Content-Type: application/json" \
-d '{
"email": "testuser@example.com",
"password": "TempPass123!",
"name": "Test User"
}'
Expected Response:
{
"success": true,
"data": {
"message": "User created successfully",
"user": {
"id": "49b38958-0c49-4bb3-919b-6d40b4c66177",
"email": "testuser@example.com",
"name": "Test User"
},
"token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
"needsConfirmation": false
}
}
Step 2.2: Save the JWT Token
Copy the token from the response above and save it as an environment variable:
# Save the token for future requests
export JWT_TOKEN="eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9..."
Step 2.3: Test Authentication with Profile
curl -X GET http://localhost:3003/dev/auth/profile \
-H "Authorization: Bearer $JWT_TOKEN"Expected Response:
{
"success": true,
"data": {
"user": {
"userId": "49b38958-0c49-4bb3-919b-6d40b4c66177",
"email": "testuser@example.com",
"name": "Test User"
}
}
}
Step 3.1: Create a User Profile
curl -X POST http://localhost:3003/dev/users \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"name": "John Doe",
"email": "john@example.com",
"age": 30,
"phone": "+1234567890"
}'
Expected Response:
{
"success": true,
"data": {
"id": "user-123-456",
"name": "John Doe",
"email": "john@example.com",
"age": 30,
"phone": "+1234567890",
"createdAt": "2025-01-05T20:30:00.000Z",
"updatedAt": "2025-01-05T20:30:00.000Z"
}
}
Step 3.2: Get User Profile (save the user ID from step 3.1)
# Replace USER_ID with the actual ID from the create response
export USER_ID="user-123-456"
curl -X GET http://localhost:3003/dev/users/$USER_ID \
-H "Authorization: Bearer $JWT_TOKEN"Step 3.3: Update User Profile
curl -X PUT http://localhost:3003/dev/users/$USER_ID \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"name": "John Smith",
"email": "johnsmith@example.com",
"age": 31,
"phone": "+1234567891"
}'
Step 4.1: Create Products
# Create first product
curl -X POST http://localhost:3003/dev/products \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"name": "Laptop",
"description": "High-performance laptop for developers",
"price": 1299.99,
"category": "Electronics"
}'
# Create second product
curl -X POST http://localhost:3003/dev/products \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"name": "Wireless Mouse",
"description": "Ergonomic wireless mouse",
"price": 29.99,
"category": "Electronics"
}'
Step 4.2: List All Products (Public - No Authentication Required)
curl -X GET http://localhost:3003/dev/products
Step 4.3: Get Specific Product (save product ID from step 4.1)
export PRODUCT_ID="product-123-456"
curl -X GET http://localhost:3003/dev/products/$PRODUCT_ID
Step 4.4: Update Product
curl -X PUT http://localhost:3003/dev/products/$PRODUCT_ID \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"name": "Gaming Laptop",
"description": "High-performance gaming laptop",
"price": 1499.99,
"category": "Gaming"
}'
Step 5.1: Create Orders
# Create first order
curl -X POST http://localhost:3003/dev/orders \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"userId": "'$USER_ID'",
"productId": "'$PRODUCT_ID'",
"quantity": 1,
"totalAmount": 1499.99
}'
# Create second order
curl -X POST http://localhost:3003/dev/orders \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"userId": "'$USER_ID'",
"productId": "another-product-id",
"quantity": 2,
"totalAmount": 59.98
}'
Step 5.2: List All Orders
curl -X GET http://localhost:3003/dev/orders \
-H "Authorization: Bearer $JWT_TOKEN"Step 6.1: Upload a File
# Create base64 encoded content
echo "Hello from S3 test file!" | base64 > /tmp/content.txt
CONTENT=$(cat /tmp/content.txt)
# Upload the file
curl -X POST http://localhost:3003/dev/files/upload \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"fileName": "test-document.txt",
"fileContent": "'$CONTENT'",
"contentType": "text/plain",
"metadata": {
"description": "Test file upload",
"category": "documents"
}
}'
Expected Response:
{
"success": true,
"data": {
"message": "File uploaded successfully",
"file": {
"key": "uploads/49b38958-0c49-4bb3-919b-6d40b4c66177/1751732678214-test-document.txt",
"originalName": "test-document.txt",
"location": "http://localhost:3000/mock-s3/uploads/...",
"contentType": "text/plain",
"uploadedBy": "49b38958-0c49-4bb3-919b-6d40b4c66177",
"uploadedAt": "2025-01-05T20:30:00.000Z"
}
}
}
Step 6.2: List User's Files
curl -X GET "http://localhost:3003/dev/files?userOnly=true&maxKeys=10" \
-H "Authorization: Bearer $JWT_TOKEN"Step 6.3: Download a File (save the file key from step 6.1)
export FILE_KEY="uploads/49b38958-0c49-4bb3-919b-6d40b4c66177/1751732678214-test-document.txt"
curl -X GET "http://localhost:3003/dev/files/$FILE_KEY" \
-H "Authorization: Bearer $JWT_TOKEN"Step 6.4: Generate Presigned Upload URL
curl -X POST http://localhost:3003/dev/files/upload-url \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{
"fileName": "large-document.pdf",
"contentType": "application/pdf",
"expiresIn": 3600
}'
Step 6.5: Delete a File
curl -X DELETE "http://localhost:3003/dev/files/$FILE_KEY" \
-H "Authorization: Bearer $JWT_TOKEN"Create a file called test-api.sh with all the commands:
#!/bin/bash
# Set base URL
BASE_URL="http://localhost:3003/dev"
echo "π Starting API Test Suite..."
# 1. Sign up
echo "π 1. Creating user account..."
SIGNUP_RESPONSE=$(curl -s -X POST $BASE_URL/auth/signup \
-H "Content-Type: application/json" \
-d '{"email":"demo@example.com","password":"DemoPass123!","name":"Demo User"}')
echo "β
Signup Response: $SIGNUP_RESPONSE"
# Extract token
JWT_TOKEN=$(echo $SIGNUP_RESPONSE | jq -r '.data.token')
echo "π JWT Token extracted: ${JWT_TOKEN:0:50}..."
# 2. Create user profile
echo "π€ 2. Creating user profile..."
USER_RESPONSE=$(curl -s -X POST $BASE_URL/users \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{"name":"Demo User","email":"demo@example.com","age":25,"phone":"+1234567890"}')
echo "β
User Created: $USER_RESPONSE"
# Extract user ID
USER_ID=$(echo $USER_RESPONSE | jq -r '.data.id')
# 3. Create product
echo "π¦ 3. Creating product..."
PRODUCT_RESPONSE=$(curl -s -X POST $BASE_URL/products \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d '{"name":"Demo Product","description":"A demo product","price":99.99,"category":"Demo"}')
echo "β
Product Created: $PRODUCT_RESPONSE"
# Extract product ID
PRODUCT_ID=$(echo $PRODUCT_RESPONSE | jq -r '.data.id')
# 4. List products
echo "π 4. Listing all products..."
curl -s -X GET $BASE_URL/products | jq .
# 5. Create order
echo "π 5. Creating order..."
ORDER_RESPONSE=$(curl -s -X POST $BASE_URL/orders \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d "{\"userId\":\"$USER_ID\",\"productId\":\"$PRODUCT_ID\",\"quantity\":1,\"totalAmount\":99.99}")
echo "β
Order Created: $ORDER_RESPONSE"
# 6. Upload file
echo "π 6. Uploading file..."
CONTENT=$(echo "Hello from API test!" | base64)
FILE_RESPONSE=$(curl -s -X POST $BASE_URL/files/upload \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $JWT_TOKEN" \
-d "{\"fileName\":\"test.txt\",\"fileContent\":\"$CONTENT\",\"contentType\":\"text/plain\"}")
echo "β
File Uploaded: $FILE_RESPONSE"
# 7. List files
echo "π 7. Listing user files..."
curl -s -X GET "$BASE_URL/files?userOnly=true" \
-H "Authorization: Bearer $JWT_TOKEN" | jq .
echo "π API Test Suite Completed!"Make it executable and run:
chmod +x test-api.sh
./test-api.sh
You can also test GET endpoints directly in your browser:
- View all products: http://localhost:3003/dev/products
- Health check: http://localhost:3003/dev/health (if implemented)
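The health endpoint above is marked "if implemented". If you want to add one, a minimal handler could look like the sketch below; the handler name and route are assumptions, and you would also need a matching HTTP GET event in serverless.yml.

```javascript
// Hypothetical /health handler returning the same { success, data }
// envelope the other endpoints use.
module.exports.health = async () => ({
  statusCode: 200,
  headers: {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Origin': '*',
  },
  body: JSON.stringify({
    success: true,
    data: { status: 'ok', timestamp: new Date().toISOString() },
  }),
});
```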
Problem: "Unauthorized" errors
- Make sure you're using the correct JWT token
- Check that the token hasn't expired
- Verify the Authorization header format: Bearer <token>
Problem: "User not found" errors
- Make sure you're using the correct user ID from the creation response
- Check that the user was created successfully
Problem: Connection refused
- Verify serverless offline is running on the correct port
- Check for port conflicts (try port 3003 if 3000 is busy)
Problem: Base64 encoding issues
- For Windows: powershell -command "[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes('Hello World'))"
- For Linux/Mac: echo "Hello World" | base64
This completes the comprehensive step-by-step testing guide for your AWS CRUD microservices with S3 integration!
Open your browser and visit:
- http://localhost:3000/dev/products - Get all products
- http://localhost:3000/dev/orders - Get all orders
Complete production-ready deployment using Terraform with all 8 AWS services.
What Gets Deployed:
- 22 Lambda Functions (Auth, CRUD, File management, Notifications)
- API Gateway REST API with complete endpoint configuration
- 3 DynamoDB Tables with backups and encryption
- Cognito User Pool & Client with security policies
- S3 Bucket with encryption, versioning, and lifecycle policies
- SNS Topic for event publishing
- SQS Queue + Dead Letter Queue for reliable messaging
- CloudWatch logs, dashboard, and alarms
Quick Start:
# Navigate to terraform directory
cd terraform
# Configure deployment variables
cp terraform.tfvars.example terraform.tfvars
# Edit terraform.tfvars with your settings
# Deploy infrastructure
terraform init
terraform plan
terraform apply
# Get API URL and important outputs
terraform output api_gateway_url
terraform output deployment_summary
Production Cost (estimated monthly):
- Lambda: $5-15 (1M requests)
- API Gateway: $3.50 (1M requests)
- DynamoDB: $10-25 (pay-per-request)
- S3: $0.023/GB + requests
- Other services: $5-10
- Total: ~$25-75/month for moderate usage
Key Production Features:
- Automatic scaling and high availability
- CloudWatch monitoring and alerting
- Security best practices implemented
- Backup and disaster recovery
- Cost optimization settings
See /terraform/README.md for the detailed deployment guide.
Prerequisites:
- AWS CLI configured with appropriate permissions
- Serverless Framework installed globally
- All 8 AWS services will be automatically provisioned
Step 1: Configure AWS Credentials
aws configure
# Enter your AWS Access Key ID, Secret Access Key, and Region
Step 2: Deploy Infrastructure
# Deploy to development stage
serverless deploy --stage dev
# Deploy to production stage
serverless deploy --stage production
Step 3: Verify Deployment
# Check CloudFormation stack
aws cloudformation describe-stacks --stack-name crud-microservices-production
# Test API endpoints
curl https://your-api-id.execute-api.us-east-1.amazonaws.com/production/products
| AWS Service | Resource Created | Configuration |
|---|---|---|
| Lambda | 8 Functions | Node.js 18.x runtime, 1GB memory |
| API Gateway | REST API | CORS enabled, Cognito authorizers |
| DynamoDB | 3 Tables | PAY_PER_REQUEST billing mode |
| Cognito | User Pool + Client | Email auth, password policies |
| S3 | Files Bucket | CORS configured, versioning enabled |
| SNS | Events Topic | Event publishing |
| SQS | Notifications Queue | Message processing |
| CloudWatch | Log Groups + Metrics | Auto-created monitoring |
Production Environment Variables:
PRODUCTS_TABLE: crud-microservices-products-production
USERS_TABLE: crud-microservices-users-production
ORDERS_TABLE: crud-microservices-orders-production
SNS_TOPIC: crud-microservices-events-production
SQS_QUEUE: crud-microservices-notifications-production
S3_BUCKET: crud-microservices-files-production
COGNITO_USER_POOL_ID: !Ref CognitoUserPool
COGNITO_CLIENT_ID: !Ref CognitoUserPoolClient
Local Development vs Production:
# Local Development
IS_OFFLINE=true
NODE_ENV=development
JWT_SECRET=local-development-secret
# Production (automatically configured)
IS_OFFLINE=false
NODE_ENV=production
JWT_SECRET=  # Uses Cognito instead
AWS Free Tier Eligible:
- Lambda: 1M free requests/month
- DynamoDB: 25GB storage, 25 RCU/WCU
- S3: 5GB storage, 20K GET requests
- CloudWatch: 10 custom metrics
Estimated Monthly Cost (beyond free tier):
- Lambda: ~$0.20 per 1M requests
- DynamoDB: ~$0.25 per GB/month
- S3: ~$0.023 per GB/month
- API Gateway: ~$3.50 per 1M requests
IAM Permissions: Least privilege principle
# Each Lambda function has minimal required permissions
# No wildcard (*) permissions in production
# Separate IAM roles per function type
Cognito Security:
# Strong password policies enforced
# Email verification required
# JWT tokens expire after 24 hours
# Refresh tokens valid for 30 days
S3 Security:
# User-scoped file access (uploads/{userId}/)
# No public read/write access
# CORS properly configured
# Presigned URLs for secure uploads
CloudWatch Dashboards: Auto-created for each service
Custom Metrics: Request counts, latencies, error rates
Log Aggregation: Structured JSON logging
Alarms: Set up for error rates and latencies
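A sketch of the custom-metric and structured-logging pattern is shown below. The CrudMicroservices namespace and RequestCount metric match the monitoring commands that follow; the helper, dimension, and Latency metric names are assumptions.

```javascript
// Hypothetical monitoring helper; the namespace and RequestCount metric come
// from the commands below, other names are illustrative.
const AWS = require('aws-sdk');
const cloudwatch = new AWS.CloudWatch();

async function recordRequest(functionName, latencyMs) {
  // Structured JSON log line that CloudWatch Logs can filter on.
  console.log(JSON.stringify({
    level: 'info',
    function: functionName,
    latencyMs,
    timestamp: new Date().toISOString(),
  }));

  // Skip real metric calls during local development.
  if (process.env.IS_OFFLINE === 'true') return;

  await cloudwatch.putMetricData({
    Namespace: 'CrudMicroservices',
    MetricData: [
      {
        MetricName: 'RequestCount',
        Value: 1,
        Unit: 'Count',
        Dimensions: [{ Name: 'FunctionName', Value: functionName }],
      },
      {
        MetricName: 'Latency',
        Value: latencyMs,
        Unit: 'Milliseconds',
        Dimensions: [{ Name: 'FunctionName', Value: functionName }],
      },
    ],
  }).promise();
}

module.exports = { recordRequest };
```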
Monitoring Commands:
# View logs
aws logs describe-log-groups --log-group-name-prefix "/aws/lambda/crud-microservices"
# Get metrics
aws cloudwatch get-metric-statistics --namespace CrudMicroservices --metric-name RequestCount
- Fork the repository
- Create a feature branch
- Make your changes
- Test locally with serverless offline
- Submit a pull request
MIT License
npm install --save-dev serverless-offline
Start the local API Gateway and Lambda emulation:
```bash
serverless offline
```
This will start your API on a local port (usually http://localhost:3000).
You can use curl, Postman, or any HTTP client to test your endpoints. Example using curl:
Create a user:
curl -X POST http://localhost:3000/users \
-H 'Content-Type: application/json' \
-d '{"name":"John Doe","email":"john@example.com","age":30,"phone":"+1234567890"}'Get a user:
curl http://localhost:3000/users/{userId}
Update a user:
curl -X PUT http://localhost:3000/users/{userId} \
-H 'Content-Type: application/json' \
-d '{"name":"Jane Doe","email":"jane@example.com","age":28,"phone":"+1234567890"}'Delete a user:
curl -X DELETE http://localhost:3000/users/{userId}
You can also run the included Jest tests:
npm test
Tip:
- Check the Serverless Offline output for the exact URLs and ports.
- Make sure your environment variables are set or your .env file is loaded.
- You can use Postman for more advanced API testing.
- Deploy to AWS: npm run deploy
- Run tests: npm test
- Start local AWS emulation: docker-compose up -d
- DynamoDB Local and LocalStack are used for local development and testing. No real AWS resources are required.
- See serverless.yml for function and resource definitions.
- See docker-compose.yml for local service configuration.
- Runtime: Node.js 18.x
- Functions: 8 serverless functions
# Authentication
- signup, signin, confirmSignup, getProfile
# CRUD Operations
- createUser, getUser, updateUser, deleteUser
- createProduct, getProducts, getProduct, updateProduct, deleteProduct
- createOrder, getOrders
# File Management
- uploadFile, getFile, listFiles, deleteFile, generateUploadUrl
# Event Processing
- processNotification
- Type: REST API
- Features:
- HTTP events for all endpoints
- CORS enabled for cross-origin requests
- Cognito User Pool authorizers
- Path parameters and query strings
- Base URL: https://api-id.execute-api.region.amazonaws.com/stage/
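The { success, data } envelope and CORS headers shown in the expected responses earlier are presumably produced by the response helper (utils/response.js). A minimal sketch, with assumed helper names, might look like this:

```javascript
// Hypothetical response helpers mirroring the { success, data } envelope
// and CORS headers used throughout this README; see utils/response.js.
const success = (data, statusCode = 200) => ({
  statusCode,
  headers: {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Headers': 'Content-Type,Authorization',
  },
  body: JSON.stringify({ success: true, data }),
});

const failure = (message, statusCode = 400) => ({
  statusCode,
  headers: {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Origin': '*',
  },
  body: JSON.stringify({ success: false, error: message }),
});

// Example handler reading an API Gateway path parameter.
module.exports.getProduct = async (event) => {
  const { id } = event.pathParameters || {};
  if (!id) return failure('Missing product id');
  // ...load the product from DynamoDB, then:
  return success({ id });
};
```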
Tables:
UsersTable:
- Primary Key: id (String)
- Billing: PAY_PER_REQUEST
ProductsTable:
- Primary Key: id (String)
- Billing: PAY_PER_REQUEST
OrdersTable:
- Primary Key: id (String)
- Billing: PAY_PER_REQUEST
User Pool Configuration:
- Username: email
- Auto-verified: email
- Password Policy: 8+ chars, upper/lower/numbers
- Token Validity: 24h access, 30d refresh
User Pool Client:
- Auth Flows: USER_PASSWORD_AUTH, ADMIN_NO_SRP_AUTH
- No client secret (for web/mobile apps)
FilesBucket:
- CORS: Enabled for all origins/methods
- Public Access: Configured for file sharing
- Organization: uploads/{userId}/{timestamp}-{filename}
- Operations: Upload, Download, List, Delete, Presigned URLs
SNS Topic: EventsTopic
- Purpose: Event publishing (UserCreated, ProductUpdated, etc.)
SQS Queue: NotificationsQueue
- Purpose: Async message processing
- Integration: SNS topic subscription
Monitoring:
- Custom Metrics: Request counts, latencies, errors
- Structured Logging: JSON format with timestamps
- Namespace: CrudMicroservices
- Log Groups: Auto-created per Lambda function
Lambda Execution Role:
DynamoDB: Query, Scan, GetItem, PutItem, UpdateItem, DeleteItem
S3: GetObject, PutObject, DeleteObject, ListBucket
SNS: Publish
SQS: SendMessage, ReceiveMessage, DeleteMessage
CloudWatch: PutMetricData, CreateLogGroup, CreateLogStream, PutLogEvents
This project successfully implements ALL 8 major AWS services in a production-ready serverless microservices architecture:
| Service | Implementation | Purpose |
|---|---|---|
| AWS Lambda | 8 Functions | Serverless compute for all business logic |
| AWS API Gateway | REST API | HTTP endpoints with routing and CORS |
| AWS DynamoDB | 3 Tables | NoSQL database for users, products, orders |
| AWS Cognito | User Pool + Client | Authentication and authorization |
| AWS S3 | Files Bucket | Object storage for file uploads |
| AWS SNS | Events Topic | Event publishing and notifications |
| AWS SQS | Notifications Queue | Async message processing |
| AWS CloudWatch | Monitoring + Logging | Metrics, logs, and observability |
- 100% Serverless: No server management required
- Local Development: Complete LocalStack + in-memory mocking
- Production Ready: Full AWS deployment with CloudFormation
- Security First: JWT authentication, user-scoped access, IAM least privilege
- Comprehensive Testing: Step-by-step examples and automated test scripts
- Cost Optimized: Pay-per-request billing, AWS Free Tier eligible
- Monitoring: Built-in CloudWatch metrics and structured logging
- User Management: Complete user lifecycle with authentication
- Product Catalog: Full CRUD operations for product management
- Order Processing: Order creation and management system
- File Storage: Secure file upload/download with user isolation
- Event Processing: Async notifications and event-driven architecture
- API Gateway: RESTful APIs with proper HTTP status codes
- Instant Setup: One command to start local development
- Hot Reload: Automatic code reloading during development
- Mock Services: No AWS account needed for development
- Environment Parity: Same code runs locally and in production
- Comprehensive Docs: Step-by-step guides and troubleshooting
- Auto Scaling: Lambda functions scale automatically
- High Availability: Managed services with built-in multi-AZ redundancy
- Cost Efficient: Pay only for actual usage
- Secure: Industry-standard security practices
- Monitored: Full observability with CloudWatch
This project demonstrates a complete, enterprise-grade serverless microservices architecture using the full spectrum of AWS cloud services.
Perfect for learning AWS serverless patterns, building production applications, or serving as a foundation for larger microservices ecosystems!




