A personal project I built to stop doomscrolling through 15 different news tabs every morning. It pulls top headlines from NewsAPI on a schedule, runs them through Google Gemini to generate a short digest, and serves everything through a simple Express app with user accounts.
The interesting part was less the app itself and more the infrastructure. I wanted to do this properly: Terraform throughout, a real VPC setup, and a CI/CD pipeline that doesn't store long-lived AWS credentials anywhere.
```mermaid
graph TD
    User([User]) --> ALB[Load Balancer]
    ALB --> EB["Elastic Beanstalk · Node.js 22\nPublic Subnet"]
    EB -->|port 3306 · SG restricted| DB[("Aurora Serverless v2\nMySQL · Private Subnet")]
    EB --> NAT[NAT Gateway]
    NAT --> NewsAPI([NewsAPI])
    NAT --> Gemini([Google Gemini])
    GHA["GitHub Actions · OIDC"] --> IAM[IAM Role]
    IAM --> S3[(S3 Artifacts)]
    S3 --> EB
```
Aurora lives in a private subnet, there is no public endpoint, and the security group only allows port 3306 traffic from the Beanstalk instance. The NAT gateway gives the private subnet outbound access to NewsAPI and Gemini without any inbound exposure. GitHub Actions authenticates to AWS via OIDC so there are no long-lived credentials sitting in repo secrets.
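The port-3306 restriction is done by referencing the Beanstalk instances' security group rather than a CIDR block, so only traffic from the app tier can reach the database. A sketch of what that rule looks like (resource names here are illustrative; the real definitions live in `infra/`):

```hcl
# Illustrative shape only — actual resource names are in infra/.
resource "aws_security_group_rule" "aurora_from_beanstalk" {
  type                     = "ingress"
  from_port                = 3306
  to_port                  = 3306
  protocol                 = "tcp"
  security_group_id        = aws_security_group.aurora.id     # attached to the DB
  source_security_group_id = aws_security_group.beanstalk.id  # attached to the app tier
}
```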
Full infrastructure breakdown in infra/README.md.
| Layer | Choice |
| --- | --- |
| Backend | Node.js · Express.js |
| Views | EJS (server-side rendered) |
| Database | AWS Aurora Serverless v2 (MySQL) |
| AI | Google Gemini gemini-2.5-flash-lite |
| News | NewsAPI |
| Infra | Terraform |
| Hosting | AWS Elastic Beanstalk |
| CI/CD | GitHub Actions (OIDC) |
| Tests | Jest · Supertest |
| Auth | bcrypt · express-session |
- Users register and log in — passwords are hashed with bcrypt, sessions are server-side
- A cron job fetches the top 30 headlines from NewsAPI every hour and stores them
- Every 6 hours another cron job sends those headlines to Gemini and stores the returned summary
- Logged-in users see the latest digest and can browse individual articles
- Admins get a separate panel behind a role-based guard
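The two scheduled jobs above can be sketched with a plain timer; the real implementations live in `server/services/scheduler/` (`newsFetcher.js`, `aiDigest.js`), and the function names below are illustrative, not the actual ones:

```javascript
// Simplified scheduling sketch; the real jobs live in
// server/services/scheduler/ (newsFetcher.js, aiDigest.js).
const HOURLY = 60 * 60 * 1000;

// The digest job piggybacks on the hourly tick: it fires only on
// hours divisible by 6 (00:00, 06:00, 12:00, 18:00).
function isDigestHour(hour) {
  return hour % 6 === 0;
}

async function tick(now = new Date()) {
  await fetchTopHeadlines();          // hypothetical: NewsAPI -> headlines table
  if (isDigestHour(now.getHours())) {
    await generateDigest();           // hypothetical: headlines -> Gemini -> digest
  }
}

// setInterval(tick, HOURLY);  // started once at app boot
```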
You'll need Node.js 22+ and a MySQL-compatible database (local MySQL is fine).
```bash
cd server
cp .env.example .env   # fill in DB creds, API keys, SESSION_SECRET
npm install
npm run migrate        # creates tables
npm run dev            # http://localhost:3000
```

Tests don't need a real database — the DB and APIs are fully mocked:

```bash
npm test
npm run test:coverage
```

```
├── .github/workflows/deploy.yml   # CI/CD — packages and deploys to Beanstalk on push to main
├── infra/                         # Terraform — VPC, Aurora, Beanstalk, IAM, S3
└── server/
    ├── app.js
    ├── controllers/
    ├── routes/
    ├── middleware/            # requireAuth, adminOnly
    ├── services/scheduler/    # newsFetcher.js, aiDigest.js
    ├── db/                    # connection pool, migrations
    ├── views/                 # EJS templates
    ├── public/                # CSS, client JS
    └── tests/                 # Jest + Supertest
```
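The `middleware/` guards (`requireAuth`, `adminOnly`) might look roughly like this; this is a sketch assuming `express-session` keeps `userId` and `role` on the session, and the repo's actual implementation may differ:

```javascript
// Sketch of the route guards listed under middleware/; the real
// implementations in this repo may differ.
function requireAuth(req, res, next) {
  // express-session exposes the session as req.session; a logged-in
  // user is assumed to have userId stored there at login time.
  if (req.session && req.session.userId) return next();
  return res.redirect('/login');
}

function adminOnly(req, res, next) {
  // Assumes the user's role is kept on the session after login.
  if (req.session && req.session.role === 'admin') return next();
  return res.status(403).send('Forbidden');
}
```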
Push to main → GitHub Actions packages the app → uploads to S3 → deploys to Elastic Beanstalk. Infrastructure is managed separately with Terraform from infra/.
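The OIDC handoff is the part worth copying. Roughly, the workflow requests a short-lived token and exchanges it for an IAM role session; the role ARN and region below are placeholders, not the real values:

```yaml
# Sketch of the auth step in .github/workflows/deploy.yml; values are placeholders.
permissions:
  id-token: write   # lets the job request an OIDC token
  contents: read

steps:
  - uses: actions/checkout@v4
  - uses: aws-actions/configure-aws-credentials@v4
    with:
      role-to-assume: arn:aws:iam::123456789012:role/example-deploy-role  # placeholder
      aws-region: us-east-1                                               # placeholder
  # later steps zip the app, upload to S3, and deploy to Beanstalk
  # using the short-lived credentials from the step above
```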
- Docker + docker-compose for local dev (currently needs a local MySQL install)
- Swap EJS for a React frontend
- Multi-model support — let users pick between Gemini, GPT-4o, and Claude for their digest
- Personalised digests based on reading history and saved topics
- Email delivery via SES — daily digest to your inbox without logging in
- Switch from session auth to JWT