A headless SEO content generation system with multi-phase workflow using NestJS, BullMQ, Supabase (PostgreSQL), and Redis.
- API Layer (NestJS): Handles user requests and job status polling
- Worker Layer (BullMQ): Executes external API calls and LLM prompts
- Redis/BullMQ: Orchestrates the multi-step "Content Flow"
- Supabase (PostgreSQL): Persists the state of "Article Drafts", accessed through a connection pool
- `POST /api/keywords/suggest` - Get keyword suggestions from DataForSEO, with automatic difficulty mapping (Low/Medium/High)
- Creates draft with status "RESEARCHING"
- SERP analysis using Serper.dev
- Format identification (Listicles, How-to Guides, Deep-Dive Essays)
- Competitor heading scraping
- Information gain angle identification
- LLM-generated structured JSON outline
- User-editable before approval
- Keyword-optimized sections with intent mapping
- Multi-step sequential writing process
- Introduction generation
- Section-by-section content creation
- Conclusion with CTA
- Keyword presence validation
- Entity density calculation
- `GET /api/drafts/:id/export` - Export final content
Before you begin, ensure you have the following installed:
- Node.js (v18 or higher)
- npm or yarn
- Docker (for Redis, optional if you have Redis installed locally)
- PostgreSQL (optional, if not using Supabase)
```bash
# Install Node.js dependencies
npm install
```

Create a `.env` file in the root directory with the following variables:

```bash
# Copy the example file (if it exists) or create a new .env file
touch .env
```

Add the following to your `.env` file:

```bash
# Application Configuration
NODE_ENV=development
PORT=3002

# DataForSEO API Configuration
# Get your credentials from https://dataforseo.com
# You need either login/password OR auth_token (Base64 encoded login:password)
DATAFORSEO_LOGIN=your_dataforseo_login
DATAFORSEO_PASSWORD=your_dataforseo_password
# OR use an auth token (Base64 encoded login:password)
DATAFORSEO_AUTH_TOKEN=your_base64_encoded_auth_token

# Google Gemini API Configuration
# Get your API key from https://makersuite.google.com/app/apikey
GEMINI_API_KEY=your_gemini_api_key
# Optional: Specify the Gemini model (default: gemini-3-pro-preview)
GEMINI_MODEL=gemini-3-pro-preview

# Database Configuration (choose one option below)

# Option 1: Supabase (recommended for production)
# Get connection strings from Supabase Dashboard → Settings → Database
SUPABASE_DB_DIRECT_URL=postgresql://postgres.[project-ref]:[password]@aws-0-[region].pooler.supabase.com:5432/postgres
SUPABASE_DB_URL=postgresql://postgres.[project-ref]:[password]@aws-0-[region].pooler.supabase.com:6543/postgres
SUPABASE_SSL=true

# Optional: Supabase API credentials (for future features)
SUPABASE_URL=https://[project-ref].supabase.co
SUPABASE_ANON_KEY=your_supabase_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_supabase_service_role_key

# Option 2: Local PostgreSQL (alternative)
# Uncomment and configure if using local PostgreSQL instead of Supabase
# DATABASE_HOST=localhost
# DATABASE_PORT=5432
# DATABASE_USER=postgres
# DATABASE_PASSWORD=your_postgres_password
# DATABASE_NAME=seo_content
# SUPABASE_SSL=false

# Redis Configuration (for BullMQ job queues)
REDIS_HOST=localhost
REDIS_PORT=6379
# Optional: If your Redis instance requires a password
# REDIS_PASSWORD=your_redis_password

# Optional: Database Connection Pool Settings
DB_POOL_MAX=20
DB_CONNECTION_TIMEOUT=5000
DB_IDLE_TIMEOUT=30000
```
1. DataForSEO API:
   - Sign up at https://dataforseo.com
   - Navigate to your dashboard
   - Get your login credentials or generate an auth token
   - For an auth token, Base64 encode `login:password` (e.g., `echo -n "login:password" | base64`)
2. Google Gemini API:
   - Visit https://makersuite.google.com/app/apikey
   - Sign in with your Google account
   - Create a new API key
   - Copy the API key to `GEMINI_API_KEY`
3. Supabase Database:
   - Create a project at https://supabase.com
   - Go to Settings → Database
   - Copy the connection strings:
     - Direct URL (port 5432) → `SUPABASE_DB_DIRECT_URL`
     - Pooler URL (port 6543) → `SUPABASE_DB_URL`
   - See SUPABASE_SETUP.md for detailed instructions
- Create a project at https://supabase.com
- Get your connection strings from Settings → Database
- Add the connection strings to your `.env` file (see above)
- The database schema will be created automatically on first run (if `NODE_ENV` is not `production`)
For detailed Supabase setup, see SUPABASE_SETUP.md.
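As a quick connectivity check before starting the app, `psql` accepts a full connection URL. This is a sketch, assuming the `psql` client is installed and `SUPABASE_DB_DIRECT_URL` has been exported into your shell:

```shell
# Sanity-check the URL scheme before connecting
case "$SUPABASE_DB_DIRECT_URL" in
  postgresql://*) echo "URL scheme looks right" ;;
  *) echo "SUPABASE_DB_DIRECT_URL should start with postgresql://" ;;
esac

# Connect and run a trivial query
if command -v psql >/dev/null 2>&1; then
  psql "$SUPABASE_DB_DIRECT_URL" -c "SELECT 1;" || echo "Connection failed - check the URL and SUPABASE_SSL"
else
  echo "psql is not installed; skipping the connection test"
fi
```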
```bash
# Create the database
createdb seo_content

# Or using psql
psql -U postgres -c "CREATE DATABASE seo_content;"
```

Then configure your `.env` file with local PostgreSQL credentials (see Option 2 in the environment variables section above).
Redis is required for the BullMQ job queue system. Choose one of the following options:
```bash
# Run Redis in a Docker container
docker run -d --name redis -p 6379:6379 redis:alpine

# To stop Redis
docker stop redis

# To start Redis again
docker start redis
```

macOS (using Homebrew):

```bash
brew install redis
brew services start redis
```

Ubuntu/Debian:

```bash
sudo apt-get update
sudo apt-get install redis-server
sudo systemctl start redis-server
sudo systemctl enable redis-server
```

Windows:

- Download Redis from https://github.com/microsoftarchive/redis/releases
- Or use WSL2 with the Ubuntu instructions above
Before running the application, verify your setup:
```bash
# Check if Redis is running
redis-cli ping
# Should return: PONG

# Check if the PostgreSQL/Supabase connection is configured
# (The app will show connection errors on startup if not configured)
```

```bash
# Development mode (with hot reload)
npm run start:dev

# Production mode
npm run build
npm run start:prod
```

The application will start on http://localhost:3002 (or the port specified in the `PORT` environment variable).
Note: On first run, the database schema will be automatically created if NODE_ENV is not set to production.
- `POST /api/keywords/suggest` - Get keyword suggestions

  ```json
  { "seedKeyword": "your seed keyword" }
  ```

- `GET /api/keywords/:id` - Get keyword details
- `POST /api/drafts` - Create a new draft

  ```json
  { "keywordId": "keyword-uuid" }
  ```

- `GET /api/drafts` - List all drafts
- `GET /api/drafts/:id` - Get draft details
- `PUT /api/drafts/:id/outline` - Update outline
- `PUT /api/drafts/:id/approve-outline` - Approve outline and trigger content generation
- `GET /api/drafts/:id/export` - Export final content
- Keyword Research: User submits seed keyword → Get suggestions → Select keyword → Create draft
- Strategy Analysis: Worker fetches SERP → Scrapes competitors → Analyzes gaps → Saves strategy
- Outline Generation: Worker generates outline → User reviews/edits → User approves
- Content Generation: Worker writes intro → Writes sections sequentially → Writes conclusion → Calculates SEO score
- Export: User exports final markdown content
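The flow above can be exercised from the command line with `curl`. This is a sketch only: the endpoints and JSON bodies come from the API reference, but `keyword-uuid` and `draft-uuid` are placeholders; substitute the real IDs returned by your instance.

```shell
BASE=http://localhost:3002

# 1. Keyword research: submit a seed keyword to get suggestions
curl -s -X POST "$BASE/api/keywords/suggest" \
  -H "Content-Type: application/json" \
  -d '{"seedKeyword": "standing desks"}'

# 2. Create a draft from a selected keyword (kicks off strategy analysis)
curl -s -X POST "$BASE/api/drafts" \
  -H "Content-Type: application/json" \
  -d '{"keywordId": "keyword-uuid"}'

# 3. Poll the draft, edit the outline if needed, then approve it
curl -s "$BASE/api/drafts/draft-uuid"
curl -s -X PUT "$BASE/api/drafts/draft-uuid/approve-outline"

# 4. Once content generation finishes, export the final markdown
curl -s "$BASE/api/drafts/draft-uuid/export"
```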
| Variable | Description | Example |
|---|---|---|
| `DATAFORSEO_LOGIN` | DataForSEO account login | `your_login@example.com` |
| `DATAFORSEO_PASSWORD` | DataForSEO account password | `your_password` |
| `DATAFORSEO_AUTH_TOKEN` | Base64-encoded auth token (alternative to login/password) | `base64(login:password)` |
| `GEMINI_API_KEY` | Google Gemini API key | `AIza...` |
| `SUPABASE_DB_DIRECT_URL` | Supabase direct connection URL (recommended) | `postgresql://postgres...` |
| `REDIS_HOST` | Redis server hostname | `localhost` |
| `REDIS_PORT` | Redis server port | `6379` |
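A minimal preflight check against this table can save a failed startup. This is a sketch, assuming the variables have already been loaded into the current shell (e.g. `set -a; . ./.env; set +a`); note that `DATAFORSEO_AUTH_TOKEN` can stand in for the login/password pair.

```shell
# Report any required variable that is empty or unset
for var in DATAFORSEO_LOGIN DATAFORSEO_PASSWORD GEMINI_API_KEY \
           SUPABASE_DB_DIRECT_URL REDIS_HOST REDIS_PORT; do
  if [ -z "$(eval echo "\$$var")" ]; then
    echo "Missing required variable: $var"
  fi
done
```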
| Variable | Description | Default |
|---|---|---|
| `NODE_ENV` | Environment mode (`development` or `production`) | `development` |
| `PORT` | Application port | `3002` |
| `GEMINI_MODEL` | Gemini model to use | `gemini-3-pro-preview` |
| `SUPABASE_DB_URL` | Supabase pooler connection URL | - |
| `SUPABASE_SSL` | Enable SSL for database connection | `true` |
| `REDIS_PASSWORD` | Redis password (if required) | - |
| `DATABASE_HOST` | PostgreSQL host (if not using Supabase) | `localhost` |
| `DATABASE_PORT` | PostgreSQL port | `5432` |
| `DATABASE_USER` | PostgreSQL username | `postgres` |
| `DATABASE_PASSWORD` | PostgreSQL password | - |
| `DATABASE_NAME` | PostgreSQL database name | `postgres` |
| `DB_POOL_MAX` | Maximum database connections | `20` |
| `DB_CONNECTION_TIMEOUT` | Connection timeout in milliseconds | `5000` |
| `DB_IDLE_TIMEOUT` | Idle timeout in milliseconds | `30000` |
The application loads environment variables from files in this order (later files override earlier ones):

1. `.env`
2. `.env.local` (if it exists)

Note: Never commit `.env` or `.env.local` files to version control. They contain sensitive credentials.
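One way to enforce this is to make sure both files are listed in `.gitignore`. A sketch that appends each entry only if it is not already present:

```shell
# Ensure env files are listed in .gitignore exactly once
for f in .env .env.local; do
  grep -qxF "$f" .gitignore 2>/dev/null || echo "$f" >> .gitignore
done
```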
Error: `Connection refused` or `ECONNREFUSED`

Solutions:

- Verify your Supabase connection string is correct
- Check that `SUPABASE_SSL=true` is set for Supabase connections
- For local PostgreSQL, ensure the database is running: `pg_isready`
- Verify database credentials in `.env`
Error: `Redis connection failed` or `ECONNREFUSED 127.0.0.1:6379`

Solutions:

- Ensure Redis is running: `redis-cli ping` (should return `PONG`)
- Check `REDIS_HOST` and `REDIS_PORT` in `.env`
- If using Docker, verify the container is running: `docker ps | grep redis`
- For password-protected Redis, set `REDIS_PASSWORD` in `.env`
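To rule out configuration drift, you can ping Redis with the same host, port, and password the app will use. This is a sketch, assuming `redis-cli` is installed and the variables are loaded in your shell:

```shell
# Fall back to the same defaults the app uses
REDIS_HOST=${REDIS_HOST:-localhost}
REDIS_PORT=${REDIS_PORT:-6379}

if ! command -v redis-cli >/dev/null 2>&1; then
  echo "redis-cli is not installed"
elif [ -n "$REDIS_PASSWORD" ]; then
  redis-cli -h "$REDIS_HOST" -p "$REDIS_PORT" -a "$REDIS_PASSWORD" ping || echo "Redis did not respond"
else
  redis-cli -h "$REDIS_HOST" -p "$REDIS_PORT" ping || echo "Redis did not respond"
fi
```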
Error: `401 Unauthorized` or `Failed to fetch keyword suggestions`

Solutions:

- Verify `DATAFORSEO_LOGIN` and `DATAFORSEO_PASSWORD` are correct
- Or use `DATAFORSEO_AUTH_TOKEN` (Base64-encoded `login:password`)
- Check that your DataForSEO account has sufficient credits
- Verify API access is enabled in your DataForSEO dashboard
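If you use the token option, a common cause of `401` errors is a trailing newline sneaking into the encoded value. You can recompute the token and compare it with the value in `.env` (here `login:password` is a placeholder for your real credentials):

```shell
# printf (unlike echo) adds no trailing newline, so the encoding is exact
TOKEN=$(printf '%s' "login:password" | base64)
echo "$TOKEN"  # compare with DATAFORSEO_AUTH_TOKEN in .env
```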
Error: `API key not valid` or `403 Forbidden`

Solutions:

- Verify `GEMINI_API_KEY` is correct and not expired
- Check API quotas in Google Cloud Console
- Ensure the API key has access to Gemini models
- Try regenerating the API key if issues persist
Error: `EADDRINUSE: address already in use :::3002`

Solutions:

- Change `PORT` in `.env` to a different port (e.g., `3003`)
- Or stop the process using the port:

```bash
# Find the process using port 3002
lsof -i :3002                 # macOS/Linux
netstat -ano | findstr :3002  # Windows

# Kill the process
kill -9 <PID>           # macOS/Linux
taskkill /PID <PID> /F  # Windows
```
Error: `Tables don't exist` or migration errors

Solutions:

- Ensure `NODE_ENV` is not set to `production` (schema auto-creation is disabled in production)
- Check that the database connection is working
- Verify TypeORM has write permissions to the database
- Manually run migrations if needed (see the database migration docs)
If you encounter issues not covered here:
- Check the application logs for detailed error messages
- Verify all environment variables are set correctly
- Ensure all prerequisites are installed and running
- Review the SUPABASE_SETUP.md for database-specific issues
The system uses three BullMQ queues:
- `strategy` - Handles SERP analysis and gap identification
- `outline` - Generates SEO-optimized outlines
- `content` - Generates long-form content section by section

The database schema has three tables:

- `keywords` - Keyword data with difficulty scores
- `drafts` - Article drafts with status, strategy, outline, and content
- `sections` - Individual sections of articles
MIT