A full-stack todo application with Express backend and PostgreSQL database.
📚 Teaching this course? See TEACHING.md for a complete 2-day curriculum guide.
☁️ Deploying to AWS? See DEPLOYMENT.md for browser-based AWS deployment guides.
Prerequisites:

- Node.js (v14 or higher)
- npm or yarn
- Docker and Docker Compose (for the Docker setup)
```bash
git clone <your-repo-url>
cd webdev-express
```

Perfect for learning Express basics without database complexity:

```bash
cd backend
npm install
npm run dev:simple
```

This runs the server with JSON file storage. Great for beginners!
From the project root directory:

```bash
docker-compose up -d
```

This will:
- Start a PostgreSQL container
- Build and start the Express backend container
- Create the database and tables automatically
- Insert sample data
- Backend runs on http://localhost:3000
- PostgreSQL runs on port 5432
To check if everything is running:

```bash
docker-compose ps
```

To view logs:

```bash
# All services
docker-compose logs -f

# Just backend
docker-compose logs -f backend

# Just database
docker-compose logs -f postgres
```

The frontend still runs locally (not containerized):
```bash
cd frontend
npm install
npm run dev
```

Run everything in containers:

```bash
docker-compose up -d
```

- Pros: Isolated, consistent, easy deployment
- Cons: Slower to rebuild after code changes
Run only the database in Docker, backend locally:

```bash
# Start just the database
docker-compose up -d postgres

# In another terminal, run backend locally
cd backend
npm install
npm run dev
```

- Pros: Hot reload with nodemon, faster iteration
- Cons: Need Node.js installed locally
- Note: Update `DB_HOST` to `localhost` in your local `.env` file
- `GET /todos` - Get all todos
- `GET /todos/count` - Get count of todos
- `GET /todos/:id` - Get a single todo
- `POST /todos` - Create a new todo
- `PUT /todos/:id` - Update a todo
- `DELETE /todos/:id` - Delete a todo
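As a minimal sketch of the CRUD semantics behind these routes (an in-memory illustration only; the real backend persists todos in PostgreSQL, and the function names here are assumptions):

```javascript
// In-memory sketch of the CRUD operations the routes above map onto.
// Illustrative only - the actual backend stores rows in PostgreSQL.
let nextId = 1;
const todos = [];

// POST /todos
function createTodo(text) {
  const todo = { id: nextId++, text, completed: false };
  todos.push(todo);
  return todo;
}

// GET /todos/:id
function getTodo(id) {
  return todos.find((t) => t.id === id) || null;
}

// PUT /todos/:id
function updateTodo(id, changes) {
  const todo = getTodo(id);
  if (!todo) return null;
  Object.assign(todo, changes);
  return todo;
}

// DELETE /todos/:id
function deleteTodo(id) {
  const index = todos.findIndex((t) => t.id === id);
  if (index === -1) return false;
  todos.splice(index, 1);
  return true;
}

const a = createTodo("buy milk");
updateTodo(a.id, { completed: true });
console.log(todos.length); // 1
```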
```bash
# Stop all containers
docker-compose down

# Stop containers and remove volumes (deletes database data)
docker-compose down -v

# Restart all services
docker-compose restart

# Restart just the backend
docker-compose restart backend

# Rebuild and restart the backend
docker-compose up -d --build backend

# Check container status
docker-compose ps

# Open a psql shell in the database container
docker exec -it todo-postgres psql -U todouser -d tododb

# Open a shell in the backend container
docker exec -it todo-backend sh
```

Useful SQL commands:
```sql
-- View all todos
SELECT * FROM todos;

-- Count todos
SELECT COUNT(*) FROM todos;

-- Drop and recreate table (careful!)
DROP TABLE todos;

CREATE TABLE todos (
  id SERIAL PRIMARY KEY,
  text VARCHAR(500) NOT NULL,
  completed BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```

The old JSON file (`database.json`) is no longer used. The initial data has been migrated to PostgreSQL via the `init.sql` script.
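One detail worth knowing about this schema: `DEFAULT CURRENT_TIMESTAMP` fires only on insert, so PostgreSQL will not refresh `updated_at` by itself on an `UPDATE`. A sketch of how an update statement might set it explicitly (the exact query the backend issues is an assumption here):

```sql
-- Hypothetical update as the backend might issue it; Postgres does not
-- auto-refresh updated_at, so it is set explicitly on every UPDATE.
UPDATE todos
SET text = $1,
    completed = $2,
    updated_at = CURRENT_TIMESTAMP
WHERE id = $3
RETURNING *;
```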
- Make sure Docker is running
- Check if containers are up: `docker-compose ps`
- Check logs: `docker-compose logs backend`
- Verify the backend waits for the database health check (configured with `depends_on`)
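The health-check wiring mentioned above typically looks like the fragment below (a sketch; the service names and test command are assumptions if they differ from this project's `docker-compose.yml`):

```yaml
# Sketch of depends_on with a health check (names assumed):
services:
  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U todouser -d tododb"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    depends_on:
      postgres:
        condition: service_healthy
```

With `condition: service_healthy`, Compose delays starting the backend until `pg_isready` succeeds, which avoids connection errors during database startup.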
Backend port (3000):

```yaml
ports:
  - "3001:3000" # Use 3001 on host instead
```

Database port (5432):

```yaml
ports:
  - "5433:5432" # Use 5433 on host instead
```

If the backend exits immediately:
```bash
# Check logs
docker-compose logs backend

# Rebuild the backend
docker-compose up -d --build backend
```

When running in Docker, you need to rebuild after code changes:

```bash
docker-compose up -d --build backend
```

For faster development, use the hybrid approach (see Development Modes above).
Run the database in Docker and the backend locally:

```bash
# Terminal 1: Start database only
docker-compose up -d postgres

# Terminal 2: Run backend locally with hot reload
cd backend
npm install
npm run dev  # Uses nodemon for auto-reload
```

Important: When running the backend locally, make sure your `backend/.env` has:

```
DB_HOST=localhost
```
```bash
docker-compose up -d
```

Code changes require a rebuild:

```bash
docker-compose up -d --build backend
```

You can mount your code as a volume for live reload in Docker:
```yaml
# Add to backend service in docker-compose.yml
volumes:
  - ./backend:/app
  - /app/node_modules
command: npm run dev
```

For production deployment:
- Use strong passwords (change in both `docker-compose.yml` and `.env`)
- Use environment-specific `.env` files
- Consider a managed PostgreSQL service (AWS RDS, DigitalOcean, etc.)
- Add proper logging and monitoring
- Implement rate limiting and security headers
- Use connection pooling (already implemented with `pg.Pool`)
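To make the rate-limiting point concrete, here is a minimal fixed-window counter in plain Node (an illustrative sketch only; a production Express app would more likely use a library such as `express-rate-limit`):

```javascript
// Minimal fixed-window rate limiter sketch (illustrative, not production-ready).
// Allows `limit` requests per `windowMs` per client key (e.g. an IP address).
function createRateLimiter(limit, windowMs) {
  const hits = new Map(); // key -> { count, windowStart }
  return function allow(key, now = Date.now()) {
    const entry = hits.get(key);
    // Start a fresh window if this key is new or its window has expired.
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= limit;
  };
}

const allow = createRateLimiter(2, 60_000);
console.log(allow("1.2.3.4", 0)); // true
console.log(allow("1.2.3.4", 1)); // true
console.log(allow("1.2.3.4", 2)); // false (3rd request in the window)
console.log(allow("1.2.3.4", 60_000)); // true (new window)
```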
ISC