A production-ready data aggregation platform for competitive programming profiles across multiple platforms (LeetCode, Codeforces, CodeChef, AtCoder, CSES, HackerRank).
This system follows a microservices architecture with:
- FastAPI Backend: REST API with async workers
- PostgreSQL: Primary database for user data, submissions, problems
- Redis: Caching layer and task queue (RQ)
- React Frontend: Dashboard for viewing aggregated profiles
- Playwright + Requests: Web scraping with fallbacks
- Kubernetes: Container orchestration with Helm charts
Prerequisites:
- Docker & Docker Compose
- Python 3.11+
- Node.js 18+
- PostgreSQL 15+
```bash
# Clone and set up
git clone <repo-url>
cd cp-portfolio-platform

# Copy environment file
cp .env.example .env

# Start services with Docker Compose
docker-compose up -d

# Install backend dependencies
cd backend
pip install -r requirements.txt

# Run database migrations
alembic upgrade head

# Seed sample data
python scripts/seed_data.py

# Install frontend dependencies
cd ../frontend
npm install

# Start development servers
npm run dev                                     # Frontend on :3000
cd ../backend && uvicorn app.main:app --reload  # Backend on :8000

# Test Codeforces scraper
curl http://localhost:8000/api/v1/users/codeforces/tourist

# View dashboard
open http://localhost:3000
```

API endpoints:
- `GET /api/v1/users/{platform}/{handle}` - Get user profile
- `POST /api/v1/users/{platform}/{handle}/sync` - Trigger data sync
- `GET /api/v1/users/{platform}/{handle}/submissions` - Get submissions
- `GET /api/v1/platforms` - List supported platforms
- `GET /api/v1/health` - Health check

Auth endpoints:
- `POST /api/v1/auth/login` - Log in with credentials
- `POST /api/v1/auth/register` - Register a new account
- `DELETE /api/v1/auth/logout` - Log out
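The user endpoints can be exercised from Python as well as curl. The sketch below assumes the development server from the quick start is running on `localhost:8000`; the helper names (`endpoint`, `get_profile`, `trigger_sync`) are illustrative, not part of the project.

```python
"""Minimal client sketch for the REST API above.

Assumes the dev server on localhost:8000; helper names are
illustrative and not defined by the project itself.
"""
import json
import urllib.request

BASE_URL = "http://localhost:8000/api/v1"


def endpoint(platform: str, handle: str, suffix: str = "") -> str:
    """Build a user-endpoint URL such as .../users/codeforces/tourist."""
    url = f"{BASE_URL}/users/{platform}/{handle}"
    return f"{url}/{suffix}" if suffix else url


def get_profile(platform: str, handle: str) -> dict:
    """GET the aggregated profile for one handle."""
    with urllib.request.urlopen(endpoint(platform, handle)) as resp:
        return json.load(resp)


def trigger_sync(platform: str, handle: str) -> dict:
    """POST to the sync endpoint to enqueue a data refresh."""
    req = urllib.request.Request(
        endpoint(platform, handle, "sync"), method="POST"
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```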
```bash
# Deploy to K8s cluster
kubectl apply -f k8s/namespace.yaml
kubectl apply -f k8s/

# Or use Helm
helm install cp-platform ./helm/cp-platform
```

Production checklist:
- Set up managed PostgreSQL
- Configure Redis cluster
- Set up monitoring (Prometheus/Grafana)
- Configure ingress with TLS
- Set up CI/CD pipeline
- Configure secrets management
Adding a new platform:
- Create scraper in `backend/app/scrapers/new_platform.py`
- Implement the `BaseScraper` interface
- Add platform configuration to `app/config/platforms.py`
- Add tests in `tests/scrapers/test_new_platform.py`
- Update documentation
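The steps above can be sketched as follows. The real `BaseScraper` interface lives in `backend/app/scrapers/`; the minimal stand-in ABC below is an assumption about its shape so the sketch runs on its own, and the platform name, URL, and field names are all hypothetical.

```python
"""Sketch of a new-platform scraper.

The stand-in BaseScraper below is an assumed shape, not the
project's actual interface; the endpoint and fields are placeholders.
"""
from abc import ABC, abstractmethod


class BaseScraper(ABC):  # stand-in for the project's interface (assumption)
    @abstractmethod
    def fetch_profile(self, handle: str) -> dict:
        ...


class NewPlatformScraper(BaseScraper):
    """Hypothetical scraper for backend/app/scrapers/new_platform.py."""

    BASE_URL = "https://example.com/api/users"  # placeholder endpoint

    def fetch_profile(self, handle: str) -> dict:
        # A real implementation would fetch via Requests, falling back
        # to Playwright for JavaScript-rendered pages. Here we fake the
        # response to show the normalization step the pipeline expects.
        raw = {"user": handle, "points": 1200}  # stand-in HTTP response
        return self.normalize(raw)

    @staticmethod
    def normalize(raw: dict) -> dict:
        """Map platform-specific fields onto the shared profile schema."""
        return {"handle": raw["user"], "rating": raw["points"]}
```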
Observability:
- Metrics: Prometheus metrics at `/metrics`
- Logs: structured JSON logs
- Health: health checks at `/health`
- Tracing: OpenTelemetry support
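For the structured JSON logs, a minimal standard-library sketch looks like this; the logger name and field names are illustrative, not the project's actual log schema.

```python
"""Illustrative structured JSON logging using only the stdlib.

Field names here are assumptions, not the platform's real schema.
"""
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "time": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("cp-platform")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("sync finished for %s", "codeforces/tourist")
```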
- Rate limiting per IP and user
- Input validation and sanitization
- HTTPS enforcement
- Secure session management
- GDPR compliance with data deletion
- Respects robots.txt
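The per-IP rate limiting above is commonly implemented as a token bucket; the project's actual limiter is not shown in this README, so the following is a self-contained sketch with illustrative parameters.

```python
"""Illustrative per-IP token-bucket rate limiter (stdlib only).

A sketch, not the platform's real limiter; rates are placeholders.
"""
import time
from collections import defaultdict


class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = defaultdict(lambda: float(capacity))
        self.last = defaultdict(time.monotonic)

    def allow(self, ip: str) -> bool:
        """Return True if this request fits the IP's budget."""
        now = time.monotonic()
        elapsed = now - self.last[ip]
        self.last[ip] = now
        # Refill proportionally to elapsed time, capped at capacity,
        # then spend one token if one is available.
        self.tokens[ip] = min(self.capacity,
                              self.tokens[ip] + elapsed * self.rate)
        if self.tokens[ip] >= 1:
            self.tokens[ip] -= 1
            return True
        return False


limiter = TokenBucket(rate=5, capacity=10)  # ~5 req/s, bursts of 10
```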
```bash
# Run all tests
pytest

# Run specific test suite
pytest tests/scrapers/
pytest tests/api/

# Run with coverage
pytest --cov=app tests/
```

MIT License - see LICENSE file.