A coding assessment platform that allows candidates to solve programming questions and submit their code for automated evaluation. The system uses AWS services for code execution and result processing.
The platform consists of three main servers running on AWS EC2 instances, as shown in the architecture diagram below:
    [Frontend Server] ---> [Backend Server] ---> [Worker Server]
            |                     |                     |
       (Candidate            (Questions               (Code
       Interface)            Controllers)          Execution)
                                  |                     |
                          [RDS PostgreSQL]    [Docker Containers]
                                  |                     |
                          [AWS S3 Bucket] <--->  [SQS Queues]
Frontend Server (EC2 Instance)
- React application running in Docker
- Candidate interface for coding questions
- Submit button for code submissions
Backend Server (EC2 Instance)
- Spring Boot application running in Docker
- Questions Controller - serves coding questions from the database
- Submission Controller - handles code submissions to SQS
- Results Controller - retrieves execution results from SQS
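As a rough illustration of how the Submission Controller might hand a submission off to S3 and SQS, here is a minimal sketch using the AWS SDK for Java v2. The endpoint path, the SubmissionRequest payload, the S3 key layout, and the pipe-delimited queue message are illustrative assumptions, not taken from the actual code:

```java
// Hypothetical sketch of the Submission Controller: stores the submitted code in S3
// and enqueues an execution request on SQS. All names and formats are illustrative.
import java.util.UUID;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;

@RestController
public class SubmissionController {

    private final S3Client s3;
    private final SqsClient sqs;

    @Value("${BUCKET_NAME}")
    private String bucketName;

    @Value("${QUEUE_URL}")
    private String requestQueueUrl;

    public SubmissionController(S3Client s3, SqsClient sqs) {
        this.s3 = s3;
        this.sqs = sqs;
    }

    /** Accepts a submission, stores the code file in S3, and enqueues an execution request. */
    @PostMapping("/api/submissions")
    public String submit(@RequestBody SubmissionRequest request) {
        String submissionId = UUID.randomUUID().toString();
        String key = "submissions/" + submissionId;

        // 1. Store the submitted code file in the S3 bucket.
        //    (AWS RequestBody is fully qualified to avoid clashing with Spring's @RequestBody.)
        s3.putObject(
                PutObjectRequest.builder().bucket(bucketName).key(key).build(),
                software.amazon.awssdk.core.sync.RequestBody.fromString(request.code()));

        // 2. Send an execution request to the SQS request queue. A simple delimited payload
        //    is used here; a real implementation would likely serialize JSON.
        sqs.sendMessage(SendMessageRequest.builder()
                .queueUrl(requestQueueUrl)
                .messageBody(submissionId + "|" + request.language() + "|" + key)
                .build());

        return submissionId;
    }

    /** Illustrative request payload: the candidate's code and chosen language. */
    public record SubmissionRequest(String code, String language) {}
}
```

Returning the generated submission ID lets the frontend poll for the matching result later.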
Worker Server (EC2 Instance)
- Spring Boot application running in Docker
- Processes code execution requests from SQS
- Runs code in isolated Docker containers (Python, Java)
- Sends execution results back via SQS
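The worker's processing loop could look roughly like the sketch below, again assuming the AWS SDK for Java v2, Spring's @Scheduled support (which needs @EnableScheduling on the application class), and Docker images available on the worker host. The queue message format, file names, poll interval, and image names are assumptions for illustration:

```java
// Hypothetical sketch of the worker: polls the SQS request queue, downloads the code
// from S3, runs it in an isolated Docker container, and publishes the output.
import java.nio.file.Files;
import java.nio.file.Path;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;

@Service
public class ExecutionWorker {

    private final SqsClient sqs;
    private final S3Client s3;

    @Value("${QUEUE_URL}")
    private String requestQueueUrl;

    @Value("${RESULT_URL}")
    private String responseQueueUrl;

    @Value("${BUCKET_NAME}")
    private String bucketName;

    public ExecutionWorker(SqsClient sqs, S3Client s3) {
        this.sqs = sqs;
        this.s3 = s3;
    }

    /** Polls the request queue, executes each submission in Docker, and publishes the result. */
    @Scheduled(fixedDelay = 5000)
    public void poll() throws Exception {
        var received = sqs.receiveMessage(ReceiveMessageRequest.builder()
                .queueUrl(requestQueueUrl)
                .maxNumberOfMessages(5)
                .waitTimeSeconds(10)
                .build());

        for (Message message : received.messages()) {
            // Illustrative message format produced by the backend: submissionId|language|s3Key
            String[] parts = message.body().split("\\|");
            String submissionId = parts[0];
            String language = parts[1];
            String key = parts[2];

            // Download the submitted code from S3 into a temp directory mounted into the container.
            Path workDir = Files.createTempDirectory("submission-");
            Path codeFile = workDir.resolve("python".equals(language) ? "main.py" : "Main.java");
            s3.getObject(GetObjectRequest.builder().bucket(bucketName).key(key).build(), codeFile);

            // Run the code in an isolated, network-less Docker container (image names are illustrative).
            String image = "python".equals(language) ? "python:3.11-slim" : "eclipse-temurin:17";
            String command = "python".equals(language)
                    ? "python /work/main.py"
                    : "cd /work && javac Main.java && java Main";
            Process process = new ProcessBuilder(
                    "docker", "run", "--rm", "--network", "none",
                    "-v", workDir + ":/work", image, "sh", "-c", command)
                    .redirectErrorStream(true)
                    .start();
            // Read the container's combined output; a real worker would also enforce a time limit.
            String output = new String(process.getInputStream().readAllBytes());
            process.waitFor();

            // Publish the result on the response queue and remove the processed request.
            sqs.sendMessage(SendMessageRequest.builder()
                    .queueUrl(responseQueueUrl)
                    .messageBody(submissionId + "|" + output)
                    .build());
            sqs.deleteMessage(DeleteMessageRequest.builder()
                    .queueUrl(requestQueueUrl)
                    .receiptHandle(message.receiptHandle())
                    .build());
        }
    }
}
```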
AWS Services
- RDS PostgreSQL - stores coding questions and results
- S3 Bucket - stores submitted code files
- SQS Request Queue - sends code execution requests to worker
- SQS Response Queue - receives execution results from worker
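The backend and worker both need SQS and S3 clients. One simple way to wire them up in Spring is a configuration class like the following sketch, which assumes the AWS SDK for Java v2 and lets the SDK resolve credentials and region from the AWS_* environment variables described in the configuration steps below:

```java
// Hypothetical Spring configuration that builds the SQS and S3 clients shared by the
// controllers and the worker.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.sqs.SqsClient;

@Configuration
public class AwsClientConfig {

    @Bean
    public SqsClient sqsClient() {
        // Region and credentials come from the environment (AWS_REGION, AWS_ACCESS_KEY_ID,
        // AWS_SECRET_ACCESS_KEY) via the SDK's default provider chains.
        return SqsClient.create();
    }

    @Bean
    public S3Client s3Client() {
        return S3Client.create();
    }
}
```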
- Candidate opens the frontend application
- Frontend requests coding questions from Backend (Questions Controller)
- Backend retrieves questions from RDS PostgreSQL database
- Candidate writes code and clicks Submit button
- Frontend sends code to Backend (Submission Controller)
- Backend stores code file in S3 bucket
- Backend sends execution request to SQS Request Queue
- Worker server picks up request from SQS Request Queue
- Worker downloads code file from S3 bucket
- Worker executes code in appropriate Docker container (Python/Java)
- Worker sends execution results to SQS Response Queue
- Backend (Results Controller) retrieves results from SQS Response Queue
- Backend stores results in RDS PostgreSQL database
- Frontend displays execution results to the candidate
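The last steps of this flow (retrieving results from the response queue, persisting them to RDS, and serving them to the frontend) might look roughly like this in the backend, assuming Spring Boot 3 with Spring Data JPA. The entity, repository, endpoint path, and pipe-delimited message format are illustrative assumptions:

```java
// Hypothetical sketch of the Results Controller: drains the SQS response queue into an
// RDS-backed results table and lets the frontend poll for a submission's result.
import jakarta.persistence.Entity;
import jakarta.persistence.Id;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

@Entity
class ExecutionResult {
    @Id
    private String submissionId;
    private String output;

    protected ExecutionResult() {}  // required by JPA

    ExecutionResult(String submissionId, String output) {
        this.submissionId = submissionId;
        this.output = output;
    }

    public String getSubmissionId() { return submissionId; }
    public String getOutput() { return output; }
}

interface ExecutionResultRepository extends JpaRepository<ExecutionResult, String> {}

@RestController
public class ResultsController {

    private final SqsClient sqs;
    private final ExecutionResultRepository results;

    @Value("${RESULT_URL}")
    private String responseQueueUrl;

    public ResultsController(SqsClient sqs, ExecutionResultRepository results) {
        this.sqs = sqs;
        this.results = results;
    }

    /** The frontend polls this endpoint until the worker's result has arrived. */
    @GetMapping("/api/results/{submissionId}")
    public ExecutionResult getResult(@PathVariable String submissionId) {
        drainResponseQueue();
        return results.findById(submissionId).orElse(null);  // null until the result is ready
    }

    /** Moves any pending results from the SQS response queue into PostgreSQL. */
    private void drainResponseQueue() {
        var response = sqs.receiveMessage(ReceiveMessageRequest.builder()
                .queueUrl(responseQueueUrl)
                .maxNumberOfMessages(10)
                .build());
        for (Message message : response.messages()) {
            String[] parts = message.body().split("\\|", 2);  // submissionId|output
            results.save(new ExecutionResult(parts[0], parts[1]));
            sqs.deleteMessage(DeleteMessageRequest.builder()
                    .queueUrl(responseQueueUrl)
                    .receiptHandle(message.receiptHandle())
                    .build());
        }
    }
}
```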
## Project Structure

coding-platform/
├── frontend/ # React application
│ ├── src/ # React source code
│ ├── public/ # Static files
│ ├── package.json # Node.js dependencies
│ └── Dockerfile # Frontend container
│
├── backend/ # Spring Boot backend
│ ├── src/main/java/ # Java source code
│ ├── src/main/resources/ # Configuration files
│ ├── pom.xml # Maven dependencies
│ └── Dockerfile # Backend container
│
├── worker/ # Spring Boot worker
│ ├── src/main/java/ # Java source code
│ ├── src/main/resources/ # Configuration files
│ ├── pom.xml # Maven dependencies
│ └── Dockerfile # Worker container
│
├── infrastructure/ # Infrastructure setup
│ ├── terraform/ # AWS infrastructure
│ └── ansible/ # Server configuration
│
├── docker-compose.yml # Local development
└── README.md # This file
Frontend
- React with TypeScript
- Next.js framework
- Running on port 3000
Backend
- Java 17
- Spring Boot
- PostgreSQL database
- AWS SQS and S3 integration
- Running on port 8080
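For example, the Questions Controller could be a thin Spring Data JPA slice over the questions table in RDS PostgreSQL. The sketch below is illustrative only; the entity fields and endpoint path are assumptions, not taken from the actual schema:

```java
// Hypothetical sketch of the Questions Controller backed by Spring Data JPA.
import java.util.List;

import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@Entity
class Question {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    private String title;
    private String description;

    public Long getId() { return id; }
    public String getTitle() { return title; }
    public String getDescription() { return description; }
}

interface QuestionRepository extends JpaRepository<Question, Long> {}

@RestController
class QuestionsController {

    private final QuestionRepository questions;

    QuestionsController(QuestionRepository questions) {
        this.questions = questions;
    }

    /** Serves all coding questions to the frontend. */
    @GetMapping("/api/questions")
    public List<Question> getQuestions() {
        return questions.findAll();
    }
}
```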
Worker
- Java 17
- Spring Boot
- Docker containers for code execution
- AWS SQS and S3 integration
- Running on port 8081
Infrastructure
- AWS EC2 instances
- AWS RDS PostgreSQL
- AWS S3 bucket
- AWS SQS queues
- Docker containers
- Java 17 or higher
- Node.js 18 or higher
- Maven 3.8 or higher
- Docker and Docker Compose
- AWS account with access to EC2, RDS, S3, and SQS
Before running the application, you need to configure environment variables for AWS services and database connections.
Set the following environment variables for the backend service; they are referenced in backend/src/main/resources/application.properties:
# These variables need to be set in your environment
DB_URL=jdbc:postgresql://your-rds-endpoint:5432/your-database-name
DB_USERNAME=your-database-username
DB_PASSWORD=your-database-password
QUEUE_URL=https://sqs.your-region.amazonaws.com/your-account-id/your-request-queue
RESULT_URL=https://sqs.your-region.amazonaws.com/your-account-id/your-response-queue
BUCKET_NAME=your-s3-bucket-name

Set the same environment variables for the worker service; they are referenced in worker/src/main/resources/application.properties:
# These variables need to be set in your environment
QUEUE_URL=https://sqs.your-region.amazonaws.com/your-account-id/your-request-queue
RESULT_URL=https://sqs.your-region.amazonaws.com/your-account-id/your-response-queue
BUCKET_NAME=your-s3-bucket-name

Set AWS credentials in your environment:
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key
export AWS_REGION=your-aws-region
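Spring resolves ${...} placeholders from environment variables as well as from application.properties, so exporting the values above is enough for the services to pick them up. As one illustration, the backend could build its DataSource from DB_URL, DB_USERNAME, and DB_PASSWORD like this (a sketch; the actual project may instead reference the variables through spring.datasource.* properties):

```java
// Hypothetical DataSource configuration bound to the environment variables above.
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class DataSourceConfig {

    @Bean
    public DataSource dataSource(@Value("${DB_URL}") String url,
                                 @Value("${DB_USERNAME}") String username,
                                 @Value("${DB_PASSWORD}") String password) {
        // The JDBC driver is inferred from the PostgreSQL URL.
        return DataSourceBuilder.create()
                .url(url)
                .username(username)
                .password(password)
                .build();
    }
}
```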
- Replace "YOUR_ACCOUNT_ID" with your AWS account ID
- Replace "YOUR_RDS_ENDPOINT" with your RDS endpoint
- Replace "YOUR_BUCKET_NAME" with your S3 bucket name
- Replace "YOUR_QUEUE_NAMES" with your SQS queue names
Update the Ansible configuration in infrastructure/ansible/secrets.auto.yml with your actual values:
- AWS credentials and region
- Database connection details
- SQS queue URLs
- S3 bucket name
You can use the provided docker-compose.yml for local development:
# Start only PostgreSQL for local development
docker-compose up -d postgres

Or set up your own PostgreSQL database and update the connection details.
cd backend
mvn clean install
mvn spring-boot:run

The backend will start on http://localhost:8080
cd worker
mvn clean install
mvn spring-boot:run

The worker will start on http://localhost:8081
cd frontend
npm install
npm start

The frontend will start on http://localhost:3000
Deploy AWS infrastructure using Terraform:
cd infrastructure/terraform
terraform init
terraform plan
terraform apply

This will create:
- EC2 instances for frontend, backend, and worker
- RDS PostgreSQL database
- S3 bucket for code storage
- SQS queues for request/response handling
Configure the servers using Ansible:
cd infrastructure/ansible
ansible-playbook -i hosts.ini playbook.yml

This will:
- Install Docker on all servers
- Deploy application containers
- Configure nginx and networking
- Set up monitoring and logging
Deploy using Docker on each server:
# On each server, pull and run the appropriate container
docker run -d -p 3000:3000 your-frontend-image
docker run -d -p 8080:8080 your-backend-image
docker run -d -p 8081:8081 your-worker-image

After deployment:
- Frontend: http://your-frontend-server-ip:3000
- Backend API: http://your-backend-server-ip:8080
- Worker API: http://your-worker-server-ip:8081
- Security: All placeholder values in configuration files must be replaced with actual values
- AWS Setup: Ensure your AWS account has sufficient permissions for EC2, RDS, S3, and SQS
- Database: Make sure PostgreSQL database is accessible from backend server
- Networking: Ensure security groups allow communication between services
- Monitoring: Check application logs for any configuration errors
This is a personal project, but contributions are welcome if you're interested in helping improve it.
- Fork the repository
- Create a feature branch:
  git checkout -b feature/your-feature
- Make your changes and test them
- Commit your changes:
  git commit -m 'Add your feature'
- Push to the branch:
  git push origin feature/your-feature
- Open a Pull Request
- Follow existing code style
- Test your changes before submitting
- Update documentation if needed
Developer: Teja Naidu Koppineni
- GitHub: @tkoppine
- Email: tkoppine@asu.edu
For questions or issues, please create an issue on GitHub.