A production-ready Task API server built with Node.js, Express, and SQLite. Features OAuth authentication, API key management, and comprehensive CRUD operations.
This implementation follows the OpenAPI 3.1.0 specification with comprehensive features including hierarchical tasks, tags, priorities, dates, metadata, advanced filtering, and dual authentication methods (OAuth + API keys).
server.js is a self-contained Express.js application that provides:
- RESTful Task API: Full CRUD operations with advanced filtering, sorting, and pagination
- Dual Authentication: OAuth 2.0 (Google/GitHub) for web users + API keys for programmatic access
- User Management: Multi-user support with per-user database isolation
- Per-User Databases: Each user gets their own SQLite database for complete data isolation
- Database Management: Automatic schema creation, WAL mode, and idle connection cleanup
- Import/Export: Bulk operations with validation, conflict resolution, and format support
- Idempotency: Built-in idempotency key support for safe retries
- Security: JWT tokens, bcrypt-hashed API keys, httpOnly cookies, rate limiting, ETag support
- Full CRUD Operations: Create, Read, Update, and Delete tasks
- Hierarchical Tasks: Support for parent-child relationships (subtasks)
- Advanced Filtering: Filter by completion status, priority, tags, dates, and more
- Text Search: Full-text search across title and description
- Sorting: Multiple sort options with customizable order
- Pagination: Efficient pagination with limit and offset
- Optimistic Locking: Support for If-Unmodified-Since and ETag headers
- Tag Management: Multi-tag support with JSON storage
- Automatic Timestamps: Server-managed created_at, updated_at, and completion_date
- Cascading Deletes: Automatically delete subtasks when parent is deleted
- OAuth 2.0 Authentication: Google and GitHub OAuth login with JWT session tokens
- API Keys: Long-lived bearer tokens for programmatic access
- Dual Auth Support: Session cookies for web, bearer tokens for API clients
- Runtime: Node.js
- Framework: Express.js
- Database: SQLite with better-sqlite3 (synchronous API)
- Authentication: Passport.js (OAuth 2.0) + JWT
- Session Management: JSON Web Tokens with database-backed revocation
Authentication Layer
- Dual authentication: Session cookies (web) and Bearer tokens (API)
- JWT verification with database token tracking
- bcrypt-hashed API keys with secure comparison
- Automatic token expiration and revocation checking
- Last-used-at tracking for all tokens
- Rate limiting: 5 auth attempts per 15 minutes
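
As an illustration of how this dual check can be wired up, here is a minimal Express middleware sketch (not the exact code in server.js; `findApiKeyByPrefix` and `isRevoked` are hypothetical database helpers):

```js
// Sketch: accept either a Bearer API key or a session-cookie JWT.
const jwt = require('jsonwebtoken');
const bcrypt = require('bcrypt');

async function authenticate(req, res, next) {
  try {
    const header = req.headers.authorization || '';
    if (header.startsWith('Bearer ')) {
      const token = header.slice(7);
      // Narrow candidates by the unhashed 12-char prefix, then bcrypt-compare the full key.
      const row = await findApiKeyByPrefix(token.slice(0, 12)); // hypothetical helper
      if (row && !row.revoked && (await bcrypt.compare(token, row.access_token))) {
        req.user = { id: row.user_id, auth: 'bearer' };
        return next();
      }
    } else if (req.cookies.session_token) {
      // Session path: verify the JWT signature, then check revocation in auth_tokens.
      const payload = jwt.verify(req.cookies.session_token, process.env.JWT_SECRET);
      if (!(await isRevoked(payload.jti))) { // hypothetical helper
        req.user = { id: payload.sub, auth: 'session' };
        return next();
      }
    }
    return res.status(401).json({ code: 'UNAUTHORIZED', message: 'Authentication required' });
  } catch {
    return res.status(401).json({ code: 'UNAUTHORIZED', message: 'Authentication required' });
  }
}
```
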
Database Layer
- Shared database (`data/shared.db`): User accounts, OAuth data, auth tokens
- Per-user databases (`data/{userId}.db`): Task data isolated per user
- Auto-initialization with schema creation on startup
- WAL (Write-Ahead Logging) mode for better concurrency
- Idle connection cleanup (closes after 30 minutes of inactivity)
- Foreign key constraints with cascading deletes
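
A rough sketch of what lazy per-user database handling with WAL mode and idle cleanup can look like (illustrative only; server.js may differ in the details):

```js
// Sketch: open one better-sqlite3 database per user on demand,
// enable WAL, and close connections idle for more than 30 minutes.
const Database = require('better-sqlite3');
const path = require('path');

const DB_DIR = process.env.DB_DIR || './data';
const IDLE_MS = 30 * 60 * 1000;
const open = new Map(); // userId -> { db, lastUsed }

function getUserDb(userId) {
  let entry = open.get(userId);
  if (!entry) {
    const db = new Database(path.join(DB_DIR, `${userId}.db`));
    db.pragma('journal_mode = WAL'); // better read/write concurrency
    db.pragma('foreign_keys = ON');  // needed for cascading deletes
    entry = { db };
    open.set(userId, entry);
  }
  entry.lastUsed = Date.now();
  return entry.db;
}

// Close databases that have not been used for 30 minutes (checked every 10 minutes).
setInterval(() => {
  for (const [userId, entry] of open) {
    if (Date.now() - entry.lastUsed > IDLE_MS) {
      entry.db.close();
      open.delete(userId);
    }
  }
}, 10 * 60 * 1000).unref();
```
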
API Layer
- RESTful endpoints under `/task/v1/*` and `/auth/v1/*`
- Comprehensive input validation (priority, tags, metadata size/depth)
- Rate limiting: 100 task API requests per minute
- Idempotency support for safe retries
- Export/import with multiple format support
OAuth Integration
- Google OAuth 2.0 Strategy
- GitHub OAuth 2.0 Strategy
- Automatic user creation on first login
- OAuth account linking to user profiles
The system uses two types of databases:
Shared Database (data/shared.db): Contains user accounts and authentication data
- `users` - User profiles
- `oauth_accounts` - OAuth provider linkages
- `auth_tokens` - JWT and API key tracking
- `idempotency_keys` - Request deduplication cache
Per-User Databases (data/{userId}.db): Each user gets their own database containing:
- `tasks` - All tasks for that user only
This architecture provides complete data isolation between users and allows for efficient per-user backups.
The `tasks` table stores all task items with hierarchical support for a single user.
| Column | Type | Description |
|---|---|---|
| id | TEXT (UUID) | Primary key |
| title | TEXT | Task title (required) |
| description | TEXT | Optional description |
| completed | INTEGER | 0 or 1 (boolean) |
| archived | INTEGER | 0 or 1 (boolean) |
| priority | INTEGER | 0-99, default 0 |
| due_date | TEXT | ISO 8601 timestamp |
| tags | TEXT | JSON array of strings |
| metadata | TEXT | JSON object for custom data |
| parent_id | TEXT | UUID of parent task (for subtasks) |
| completion_date | TEXT | Auto-set when completed=1 |
| created_at | TEXT | ISO 8601 timestamp |
| updated_at | TEXT | ISO 8601 timestamp |
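
Expressed as SQL, a schema matching the columns above could look roughly like the following (a sketch; the exact constraints in server.js may differ):

```js
// Sketch: per-user tasks schema derived from the table above (better-sqlite3).
const Database = require('better-sqlite3');
const db = new Database('./data/example-user.db'); // illustrative path

db.exec(`
  CREATE TABLE IF NOT EXISTS tasks (
    id              TEXT PRIMARY KEY,
    title           TEXT NOT NULL,
    description     TEXT,
    completed       INTEGER NOT NULL DEFAULT 0,
    archived        INTEGER NOT NULL DEFAULT 0,
    priority        INTEGER NOT NULL DEFAULT 0,
    due_date        TEXT,
    tags            TEXT,   -- JSON array of strings
    metadata        TEXT,   -- JSON object for custom data
    parent_id       TEXT REFERENCES tasks(id) ON DELETE CASCADE,
    completion_date TEXT,
    created_at      TEXT NOT NULL,
    updated_at      TEXT NOT NULL
  );
`);
```
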
The `users` table stores user accounts.
| Column | Type | Description |
|---|---|---|
| id | TEXT (UUID) | Primary key |
| email | TEXT | Unique email address |
| name | TEXT | Display name |
| avatar_url | TEXT | Profile picture URL |
| created_at | TEXT | ISO 8601 timestamp |
| updated_at | TEXT | ISO 8601 timestamp |
The `oauth_accounts` table links OAuth providers to users.
| Column | Type | Description |
|---|---|---|
| id | TEXT (UUID) | Primary key |
| user_id | TEXT | Foreign key to users.id |
| provider | TEXT | "google" or "github" |
| provider_user_id | TEXT | ID from OAuth provider |
| access_token | TEXT | OAuth access token (from provider) |
| refresh_token | TEXT | OAuth refresh token (from provider) |
| expires_at | TEXT | Token expiration |
| created_at | TEXT | ISO 8601 timestamp |
| updated_at | TEXT | ISO 8601 timestamp |
Constraints: UNIQUE(provider, provider_user_id), CASCADE DELETE on user_id
The `auth_tokens` table tracks all authentication tokens (JWT and API keys).
| Column | Type | Description |
|---|---|---|
| jti | TEXT | Primary key (JWT ID or unique ID for API keys) |
| user_id | TEXT | Foreign key to users.id |
| token_type | TEXT | "access", "refresh", or "api_key" |
| auth_method | TEXT | "session" or "bearer" |
| revoked | INTEGER | 0 or 1 (boolean) |
| device_name | TEXT | Optional name for API keys |
| access_token | TEXT | bcrypt-hashed API key (for api_key type only) |
| token_prefix | TEXT | First 12 chars of token (unhashed, for indexed lookup) |
| expires_at | TEXT | ISO 8601 timestamp (null for API keys) |
| created_at | TEXT | ISO 8601 timestamp |
| last_used_at | TEXT | Auto-updated on each use |
Constraints: CASCADE DELETE on user_id
Indexes: `token_prefix` is indexed for fast API key lookups
Security Note: API keys are hashed with bcrypt before storage. The plaintext key is only shown once at creation time. The token_prefix is not security-sensitive as the full token is still required and hashed.
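
The overall scheme could be sketched like this (illustrative only; `insertToken` and `findByPrefix` are hypothetical database helpers, not server.js exports):

```js
// Sketch: create an API key (store only its bcrypt hash + 12-char prefix)
// and verify a presented key via prefix lookup + bcrypt comparison.
const crypto = require('crypto');
const bcrypt = require('bcrypt');

async function createApiKey(userId, name) {
  const token = crypto.randomBytes(32).toString('base64url'); // shown to the user exactly once
  await insertToken({
    jti: crypto.randomUUID(),
    user_id: userId,
    token_type: 'api_key',
    device_name: name,
    access_token: await bcrypt.hash(token, 10), // plaintext never stored
    token_prefix: token.slice(0, 12),           // indexed, not secret on its own
  });
  return token;
}

async function verifyApiKey(presented) {
  const row = await findByPrefix(presented.slice(0, 12)); // indexed lookup
  if (!row || row.revoked) return null;
  return (await bcrypt.compare(presented, row.access_token)) ? row : null;
}
```
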
The `idempotency_keys` table caches responses for idempotent requests.
| Column | Type | Description |
|---|---|---|
| key | TEXT | Primary key (client-provided) |
| user_id | TEXT | Foreign key to users.id |
| endpoint | TEXT | Request path |
| request_hash | TEXT | SHA-256 of request body |
| response_status | INTEGER | HTTP status code |
| response_body | TEXT | JSON response |
| created_at | TEXT | ISO 8601 timestamp |
| expires_at | TEXT | 24-hour TTL |
Constraints: CASCADE DELETE on user_id
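
A simplified sketch of how a middleware might use this table (illustrative; `getCachedResponse` and `cacheResponse` are hypothetical helpers):

```js
// Sketch: replay the cached response when the same Idempotency-Key and body reappear.
const crypto = require('crypto');

async function idempotency(req, res, next) {
  const key = req.headers['idempotency-key'];
  if (!key) return next();

  const requestHash = crypto.createHash('sha256').update(JSON.stringify(req.body)).digest('hex');
  const cached = await getCachedResponse(req.user.id, key); // hypothetical helper
  if (cached) {
    if (cached.request_hash !== requestHash) {
      return res.status(422).json({ code: 'VALIDATION_ERROR', message: 'Idempotency-Key reused with a different body' });
    }
    return res.status(cached.response_status).json(JSON.parse(cached.response_body));
  }

  // Capture the outgoing response so it can be cached (with a 24-hour TTL).
  const originalJson = res.json.bind(res);
  res.json = (body) => {
    cacheResponse(req.user.id, key, req.path, requestHash, res.statusCode, JSON.stringify(body)); // hypothetical helper
    return originalJson(body);
  };
  next();
}
```
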
- Node.js 16.x or higher
- npm or yarn
npm install

Required packages (from package.json):
- `express` - Web framework
- `better-sqlite3` - SQLite database driver
- `passport` - Authentication middleware
- `passport-google-oauth20` - Google OAuth strategy
- `passport-github2` - GitHub OAuth strategy
- `jsonwebtoken` - JWT token generation/verification
- `bcrypt` - Password hashing for API keys
- `cookie-parser` - Cookie parsing middleware
- `cors` - Cross-Origin Resource Sharing
- `helmet` - Security headers middleware
- `morgan` - HTTP request logger
- `express-rate-limit` - Rate limiting middleware
- `dotenv` - Environment variable management
The server uses environment variables for configuration. Create a .env file in the project root or set these variables in your environment:
| Variable | Default | Description |
|---|---|---|
| PORT | 3003 | HTTP port the server listens on |
| NODE_ENV | development | Environment mode: development or production |
| JWT_SECRET | Required | Secret for signing JWT tokens. MUST be 32+ characters! Generate with: `node -e "console.log(require('crypto').randomBytes(32).toString('base64'))"` |
| DB_DIR | ./data | Directory for database files |
| Variable | Default | Description |
|---|---|---|
| CORS_ORIGIN | http://localhost:3003 | Comma-separated list of allowed CORS origins (add your client app origins) |
| LOG_LEVEL | info | Logging level: error, warn, info, or debug |
To enable OAuth login, configure one or both providers:
| Variable | Description | How to Get |
|---|---|---|
| GOOGLE_CLIENT_ID | Google OAuth client ID | Google Cloud Console > APIs & Services > Credentials |
| GOOGLE_CLIENT_SECRET | Google OAuth client secret | Same location as client ID |
| Variable | Description | How to Get |
|---|---|---|
| GITHUB_CLIENT_ID | GitHub OAuth app ID | GitHub Settings > OAuth Apps > New OAuth App |
| GITHUB_CLIENT_SECRET | GitHub OAuth app secret | Same location as app ID |
| Variable | Default | Description |
|---|---|---|
| BASE_URL | http://localhost:{PORT} | Server base URL for OAuth callbacks. Set to your domain in production (e.g., https://api.example.com) |
Callback URLs to configure in OAuth providers:
- Google: `{BASE_URL}/auth/v1/oauth/google/callback`
- GitHub: `{BASE_URL}/auth/v1/oauth/github/callback`
- Go to Google Cloud Console
- Create a new project (or select existing)
- Enable the Google People API (the legacy Google+ API has been shut down)
- Go to Credentials → Create Credentials → OAuth 2.0 Client ID
- Application type: Web application
- Add authorized redirect URIs:
  - Development: `http://localhost:3003/auth/v1/oauth/google/callback`
  - Production: `https://yourdomain.com/auth/v1/oauth/google/callback`
- Copy the Client ID and Client Secret to your `.env` file
- Go to GitHub Developer Settings
- Click New OAuth App
- Fill in the details:
- Application name: Your app name
- Homepage URL: `http://localhost:3003` (or your production URL)
- Authorization callback URL: `http://localhost:3003/auth/v1/oauth/github/callback`
- Click Register application
- Copy the Client ID
- Generate a new Client Secret
- Copy both to your `.env` file
The server uses a two-tier database architecture:
| File(s) | Description |
|---|---|
| ./data/shared.db | Shared database for users, OAuth, and auth tokens (auto-created) |
| ./data/shared.db-shm | Shared memory file for shared database |
| ./data/shared.db-wal | Write-ahead log for shared database |
| ./data/{userId}.db | Per-user database containing tasks (created on first user login) |
| ./data/{userId}.db-shm | Shared memory file for user database |
| ./data/{userId}.db-wal | Write-ahead log for user database |
Notes:
- All databases are automatically created when needed
- Idle database connections are closed after 30 minutes of inactivity
- Each user's task data is completely isolated in their own database file
- Backup strategy: Back up both `shared.db` and all user databases
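
For backups taken while the server is running, better-sqlite3's online backup API is one option; the sketch below is illustrative and assumes the default ./data layout and a ./backups output directory:

```js
// Sketch: copy shared.db and every per-user database into ./backups
// using better-sqlite3's online backup API (safe alongside WAL mode).
const Database = require('better-sqlite3');
const fs = require('fs');
const path = require('path');

async function backupAll(dataDir = './data', outDir = './backups') {
  fs.mkdirSync(outDir, { recursive: true });
  const files = fs.readdirSync(dataDir).filter((f) => f.endsWith('.db'));
  for (const file of files) {
    const db = new Database(path.join(dataDir, file));
    await db.backup(path.join(outDir, file)); // returns a Promise
    db.close();
  }
}

backupAll().catch(console.error);
```
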
# Server Configuration
PORT=3003
NODE_ENV=development
JWT_SECRET=your-super-secret-key-min-32-characters-long
# Base URL for OAuth callbacks (update for production)
BASE_URL=http://localhost:3003
# Database directory (per-user databases stored here)
DB_DIR=./data
# CORS Configuration (comma-separated origins)
CORS_ORIGIN=http://localhost:3003
# Logging level (error, warn, info, debug)
LOG_LEVEL=info
# Google OAuth (optional)
GOOGLE_CLIENT_ID=your-google-client-id
GOOGLE_CLIENT_SECRET=your-google-client-secret
# GitHub OAuth (optional)
GITHUB_CLIENT_ID=your-github-client-id
GITHUB_CLIENT_SECRET=your-github-client-secret

The API supports two authentication methods:
Login with Google or GitHub to get session-based authentication:
# 1. Configure OAuth credentials
cp .env.example .env
# Edit .env with your Google/GitHub OAuth credentials (see Setting Up OAuth Providers above)
# 2. Start the server
node server.js
# 3. Visit in browser or redirect users to:
http://localhost:3003/auth/v1/oauth/google/login
# or
http://localhost:3003/auth/v1/oauth/github/login

After login, you'll be redirected with session cookies set automatically. The cookies are httpOnly and secure (in production), containing:
- Access Token: Short-lived (15 minutes), used for API requests
- Refresh Token: Long-lived (30 days), used to obtain new access tokens
Generate API keys for programmatic access:
# 1. First, login via OAuth (web browser) to create a session
# 2. Use the web UI or API to create an API key (requires authentication):
curl -X POST http://localhost:3003/auth/v1/api-keys \
-H "Cookie: session_token=YOUR_SESSION_TOKEN" \
-H "Content-Type: application/json" \
-d '{"name": "My Script"}'
# 3. Save the returned token (shown only once!)
# 4. Use the API key in requests
curl -H "Authorization: Bearer YOUR_API_KEY" \
http://localhost:3003/task/v1/tasks

API Key Features:
- No expiration (revoke manually when needed)
- bcrypt-hashed with token prefix indexing for fast lookup
- Can be named for easy identification
- Manage via `/auth/v1/api-keys` endpoints
node server.js

When you run `node server.js`, the server:
- Validates JWT_SECRET - Ensures it's set and at least 32 characters (exits if invalid)
- Loads environment variables from the `.env` file (if present)
- Creates the data directory if it doesn't exist (`./data/` by default)
- Initializes the shared database at `./data/shared.db` (creates it if it doesn't exist)
- Creates shared tables if they don't exist (users, oauth_accounts, auth_tokens, idempotency_keys)
- Enables WAL mode on shared database for better concurrent read/write performance
- Configures Passport strategies for OAuth (Google/GitHub) if credentials are provided
- Starts Express server on the configured port with security middleware (helmet, CORS, rate limiting)
- Starts idle connection cleanup - Checks every 10 minutes and closes inactive databases
- Logs startup message with configuration details
Note: Per-user databases are created lazily when each user first logs in.
Task API server running on port 3003
Base URL: http://localhost:3003/task/v1
- Task API Base: `http://localhost:3003/task/v1`
- Auth API Base: `http://localhost:3003/auth/v1`
- Health Check: `GET http://localhost:3003/health` (includes a database connectivity check)
# Set NODE_ENV for production optimizations
NODE_ENV=production node server.js
# Or use a process manager like PM2
pm2 start server.js --name task-api

Production Checklist:
- ✅ Set a strong `JWT_SECRET` (32+ random characters) - Server enforces this on startup
- ✅ Set `NODE_ENV=production`
- ✅ Configure `BASE_URL` to your domain
- ✅ Configure `CORS_ORIGIN` for your frontend domains only
- ✅ Enable HTTPS (use nginx/Apache as reverse proxy)
- ✅ Back up the database directory regularly (both `shared.db` and all `{userId}.db` files)
- ✅ Monitor server logs for errors and security events
- ✅ Consider adjusting rate limits based on your traffic patterns
- ✅ Review and adjust `DB_IDLE_TIMEOUT` if needed (default: 30 minutes)
- ✅ Set an appropriate `LOG_LEVEL` for production (recommend `info` or `warn`)
All authentication endpoints are under /auth/v1:
| Endpoint | Method | Auth Required | Description |
|---|---|---|---|
| /auth/v1/oauth/google/login | GET | No | Initiates Google OAuth flow |
| /auth/v1/oauth/google/callback | GET | No | Google OAuth callback (auto-handled) |
| /auth/v1/oauth/github/login | GET | No | Initiates GitHub OAuth flow |
| /auth/v1/oauth/github/callback | GET | No | GitHub OAuth callback (auto-handled) |
Flow: Visit login URL → Redirected to provider → Login → Redirected back to app with session cookie set
| Endpoint | Method | Auth Required | Description |
|---|---|---|---|
| /auth/v1/me | GET | Yes | Get current user profile |
| /auth/v1/logout | POST | Yes | Revoke tokens and clear cookies |
| /auth/v1/refresh | POST | No | Refresh access token using refresh token |
| Endpoint | Method | Auth Required | Description |
|---|---|---|---|
| /auth/v1/api-keys | GET | Yes | List all API keys for current user |
| /auth/v1/api-keys | POST | Yes | Create new API key (returns token once) |
| /auth/v1/api-keys/:jti | DELETE | Yes | Revoke an API key |
Create API Key Request Body:
{
"name": "My CLI Tool"
}

Response:
{
"token": "abc123...",
"name": "My CLI Tool",
"created_at": "2025-01-15T10:30:00Z",
"message": "API key created successfully. Save it securely - it cannot be retrieved later!"
}

All task endpoints are under /task/v1 and require authentication (session cookie or bearer token):
Authentication Header (for API keys and mobile apps):
Authorization: Bearer your-api-key-here
Session Cookie (automatically sent by browsers after OAuth login):
Cookie: session_token=jwt-token-here
Create a new task item.
Request Body:
{
"title": "Buy groceries",
"description": "Milk, eggs, bread",
"priority": 50,
"due_date": "2025-12-31T23:59:59Z",
"tags": ["shopping", "urgent"],
"metadata": { "store": "Whole Foods" },
"completed": false,
"archived": false,
"parent_id": null
}

Response: 201 Created with Location header
Get a single task by ID.
Response: 200 OK with ETag header
List tasks with filtering, sorting, and pagination.
Query Parameters:
- `completed`: Filter by completion status (true/false)
- `archived`: Filter by archived status (true/false)
- `priority`: Filter by priority level (0-99)
- `tag`: Filter by tags (can be repeated for AND logic)
- `parent_id`: Filter by parent ("null" for root-level, UUID for subtasks)
- `search`: Full-text search in title and description
- `due_before`: Filter tasks due before date (ISO 8601)
- `due_after`: Filter tasks due after date (ISO 8601)
- `overdue`: Filter overdue incomplete tasks (true)
- `sort_by`: Sort field (created_at, updated_at, title, priority, due_date, completed)
- `order`: Sort order (asc, desc)
- `limit`: Max items to return (1-1000, default 100)
- `offset`: Items to skip (default 0)
Response: 200 OK with pagination metadata
Partially update a task.
Headers (optional):
If-Unmodified-Since: Timestamp for optimistic locking
Request Body: (all fields optional)
{
"title": "Updated title",
"completed": true,
"due_date": null
}

Response: 200 OK
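
Putting the optimistic-locking headers together, a client-side sketch (assumes Node 18+ global fetch and an API key; field and header names follow this README):

```js
// Sketch: complete a task only if it has not changed since it was read.
const API_BASE = 'http://localhost:3003';

async function completeIfUnchanged(apiKey, taskId) {
  const getRes = await fetch(`${API_BASE}/task/v1/tasks/${taskId}`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  const task = await getRes.json();

  const patchRes = await fetch(`${API_BASE}/task/v1/tasks/${taskId}`, {
    method: 'PATCH',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
      'If-Unmodified-Since': task.updated_at, // reject the update if the task changed after this
    },
    body: JSON.stringify({ completed: true }),
  });

  if (patchRes.status === 412) {
    console.log('Task was modified concurrently; re-fetch and retry.');
    return null;
  }
  return patchRes.json();
}
```
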
Delete a task and all its subtasks (cascading).
Response: 204 No Content
Export all tasks for the authenticated user.
Headers (optional):
- `Accept: application/x-ndjson` - Return newline-delimited JSON instead of an array
Response (default): 200 OK with JSON array
[
{ "id": "...", "title": "...", ... },
{ "id": "...", "title": "...", ... }
]

Response (with Accept: application/x-ndjson):
{"id":"...","title":"...",...}
{"id":"...","title":"...",...}
Behavior:
- Returns tasks in hierarchical order (parents before children)
- Includes all fields for each task
- Useful for backup, migration, or bulk editing
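
A small client-side sketch for consuming the NDJSON form (assumes Node 18+ global fetch; for very large exports a streaming parser may be preferable):

```js
// Sketch: fetch the NDJSON export and handle one task per line.
async function exportTasks(apiKey) {
  const res = await fetch('http://localhost:3003/task/v1/tasks/export', {
    headers: {
      Authorization: `Bearer ${apiKey}`,
      Accept: 'application/x-ndjson',
    },
  });
  const text = await res.text();
  for (const line of text.split('\n')) {
    if (!line.trim()) continue;      // skip blank lines
    const task = JSON.parse(line);   // one task object per line
    console.log(task.id, task.title);
  }
}
```
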
Import tasks in bulk.
Headers (optional):
- `Idempotency-Key: unique-key-123` - Prevent duplicate imports
Query Parameters:
- `validate_only=true` - Validate without importing (dry run)
- `on_conflict=fail|skip|upsert` - How to handle existing IDs (default: fail)
  - `fail`: Return 409 if any ID exists
  - `skip`: Skip tasks with existing IDs
  - `upsert`: Update existing tasks, insert new ones
Request Body:
[
{
"id": "uuid-1",
"title": "Task 1",
"completed": false,
"priority": 50,
...
},
{
"id": "uuid-2",
"title": "Task 2",
"parent_id": "uuid-1",
...
}
]

Response (success): 201 Created
{
"imported_count": 10,
"inserted": 8,
"updated": 2,
"skipped": 0
}

Response (validation errors): 422 Unprocessable Entity
{
"code": "VALIDATION_ERROR",
"message": "Validation errors found in task items",
"validation_errors": [
{
"index": 0,
"id": "uuid-1",
"errors": ["title is required"]
}
]
}

Response (conflict): 409 Conflict (only when on_conflict=fail)
{
"code": "CONFLICT",
"message": "One or more task IDs already exist",
"conflicting_ids": ["uuid-1", "uuid-2"]
}Features:
- Validates all tasks before importing (atomic operation)
- Detects circular parent-child dependencies
- Maintains hierarchical order (parents imported before children)
- Supports idempotency for safe retries
- Preserves all fields including timestamps
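
A two-step import sketch — dry-run validation first, then an idempotent upsert (assumes Node 18+ global fetch; the backup file name is illustrative):

```js
// Sketch: validate a backup file, then import it with on_conflict=upsert.
const fs = require('fs');
const crypto = require('crypto');

async function importBackup(apiKey, file = 'tasks-backup.json') {
  const body = fs.readFileSync(file, 'utf8');
  const headers = { Authorization: `Bearer ${apiKey}`, 'Content-Type': 'application/json' };

  // 1. Dry run: validate without writing anything.
  const check = await fetch('http://localhost:3003/task/v1/tasks/import?validate_only=true', {
    method: 'POST', headers, body,
  });
  if (check.status === 422) {
    console.error('Validation errors:', (await check.json()).validation_errors);
    return;
  }

  // 2. Real import: upsert existing IDs; the idempotency key makes retries safe.
  const res = await fetch('http://localhost:3003/task/v1/tasks/import?on_conflict=upsert', {
    method: 'POST',
    headers: { ...headers, 'Idempotency-Key': crypto.randomUUID() },
    body,
  });
  console.log(await res.json()); // { imported_count, inserted, updated, skipped }
}
```
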
- When `completed` is set to `true`, `completion_date` is automatically set to the current UTC time
- When `completed` is set to `false`, `completion_date` is cleared to `null`
- `completion_date` is server-managed and cannot be set directly by clients
- Tags are stored as JSON arrays in the database
- Tag filtering is case-insensitive
If the `parent_id` query parameter is omitted on GET /tasks, only root-level tasks are returned (same as `parent_id=null`).
To clear a nullable field in PATCH, send it with null value:
{
"due_date": null,
"description": null
}

The server supports idempotency for POST endpoints to prevent duplicate operations during retries:
Usage:
curl -X POST http://localhost:3003/task/v1/tasks \
-H "Authorization: Bearer your-api-key" \
-H "Idempotency-Key: unique-request-id-123" \
-H "Content-Type: application/json" \
-d '{"title": "Buy milk"}'

Behavior:
- First request with a key: Processed normally, response cached for 24 hours
- Subsequent requests with same key and body: Returns cached response (same status code and body)
- Same key but different body: Returns 422 error
- After 24 hours: Key expires, request processed as new
Supported Endpoints:
- `POST /task/v1/tasks` - Create task
- `POST /task/v1/tasks/import` - Import tasks
Key Requirements:
- Must be 1-200 characters long
- Should be unique per request (e.g., UUID, timestamp-based)
- Stored per user (different users can use same key)
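
A retry sketch that leans on these semantics (assumes Node 18+ global fetch):

```js
// Sketch: retry task creation on network failure; reusing the same
// idempotency key guarantees at most one task is created.
const crypto = require('crypto');

async function createTaskWithRetry(apiKey, task, attempts = 3) {
  const idempotencyKey = crypto.randomUUID(); // same key on every retry
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetch('http://localhost:3003/task/v1/tasks', {
        method: 'POST',
        headers: {
          Authorization: `Bearer ${apiKey}`,
          'Content-Type': 'application/json',
          'Idempotency-Key': idempotencyKey,
        },
        body: JSON.stringify(task),
      });
      return await res.json();
    } catch (err) {
      if (i === attempts - 1) throw err; // give up after the last attempt
    }
  }
}
```
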
All task data is isolated per authenticated user:
- Each user can only see/modify their own tasks
- User ID is extracted from authentication token (session or API key)
- Every database operation runs against the authenticated user's own database file
- Impossible to access another user's data without their token
| Token Type | Lifetime | Renewable |
|---|---|---|
| Access Token (JWT) | 15 minutes | Yes (via refresh token) |
| Refresh Token (JWT) | 30 days | No (must re-login) |
| API Key | No expiration | No (revoke manually) |
Automatic behaviors:
- Access tokens expire 15 minutes after they are issued
- Refresh tokens can be used to get new access tokens
- `last_used_at` updated on every authenticated request
- Expired/revoked tokens return 401 Unauthorized
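
For browser clients, a common pattern is to refresh transparently on a 401; a sketch (assumes the refresh token travels in an httpOnly cookie, as described above):

```js
// Sketch (browser): retry a request once after refreshing the access token.
async function apiFetch(path, options = {}) {
  const doFetch = () =>
    fetch(`http://localhost:3003${path}`, { ...options, credentials: 'include' });

  let res = await doFetch();
  if (res.status === 401) {
    // Access token likely expired; the refresh-token cookie is sent automatically.
    const refreshed = await fetch('http://localhost:3003/auth/v1/refresh', {
      method: 'POST',
      credentials: 'include',
    });
    if (refreshed.ok) res = await doFetch();
  }
  return res;
}
```
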
The server returns consistent error responses with appropriate HTTP status codes:
{
"code": "ERROR_CODE",
"message": "Human-readable error message",
"field_errors": [...] // Optional, for validation errors
}

| Code | Name | When Used |
|---|---|---|
| 200 | OK | Successful GET/PATCH request |
| 201 | Created | Successful POST request |
| 204 | No Content | Successful DELETE request |
| 400 | Bad Request | Invalid request format or parameters |
| 401 | Unauthorized | Missing, invalid, or expired authentication |
| 404 | Not Found | Resource doesn't exist |
| 409 | Conflict | Resource conflict (e.g., duplicate ID on import) |
| 412 | Precondition Failed | If-Unmodified-Since check failed |
| 422 | Unprocessable Entity | Validation errors |
| 500 | Server Error | Internal server error |
| Code | Description |
|---|---|
| UNAUTHORIZED | Authentication required or token invalid |
| NOT_FOUND | Task or user not found |
| VALIDATION_ERROR | Input validation failed |
| CONFLICT | Resource already exists |
| PRECONDITION_FAILED | Conditional update failed |
| SERVER_ERROR | Internal error (check logs) |
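
Client code can branch on these codes; a small illustrative wrapper (assumes Node 18+ global fetch):

```js
// Sketch: turn error responses into exceptions carrying the server's error code.
class ApiError extends Error {
  constructor(status, body) {
    super(body.message || `HTTP ${status}`);
    this.status = status;
    this.code = body.code;               // e.g. VALIDATION_ERROR, NOT_FOUND
    this.fieldErrors = body.field_errors;
  }
}

async function request(path, apiKey, options = {}) {
  const res = await fetch(`http://localhost:3003${path}`, {
    ...options,
    headers: { Authorization: `Bearer ${apiKey}`, 'Content-Type': 'application/json', ...options.headers },
  });
  if (res.status === 204) return null;               // successful DELETE
  const body = await res.json().catch(() => ({}));
  if (!res.ok) throw new ApiError(res.status, body); // 4xx/5xx -> typed error
  return body;
}
```
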
401 Unauthorized:
{
"code": "UNAUTHORIZED",
"message": "Authentication required"
}

422 Validation Error:
{
"code": "VALIDATION_ERROR",
"message": "Title is required",
"field_errors": [
{
"field": "title",
"message": "Title is required and must not be empty"
}
]
}

# 1. Get an API key (after OAuth login via browser)
curl http://localhost:3003/auth/v1/oauth/google/login
# (Complete login in browser)
# 2. Create an API key via the web UI or using session cookie
# For this example, assume you have an API key: abc123...
# 3. Create a task
curl -X POST http://localhost:3003/task/v1/tasks \
-H "Authorization: Bearer abc123..." \
-H "Content-Type: application/json" \
-d '{
"title": "Buy groceries",
"description": "Milk, eggs, bread",
"priority": 50,
"due_date": "2025-12-31T23:59:59Z",
"tags": ["shopping", "urgent"]
}'
# 4. List all tasks
curl -H "Authorization: Bearer abc123..." \
http://localhost:3003/task/v1/tasks
# 5. Filter incomplete high-priority tasks
curl -H "Authorization: Bearer abc123..." \
"http://localhost:3003/task/v1/tasks?completed=false&priority=50&sort_by=due_date&order=asc"
# 6. Update a task
curl -X PATCH http://localhost:3003/task/v1/tasks/task-id-here \
-H "Authorization: Bearer abc123..." \
-H "Content-Type: application/json" \
-d '{"completed": true}'
# 7. Export all tasks
curl -H "Authorization: Bearer abc123..." \
http://localhost:3003/task/v1/tasks/export > tasks-backup.json
# 8. Import tasks
curl -X POST http://localhost:3003/task/v1/tasks/import \
-H "Authorization: Bearer abc123..." \
-H "Content-Type: application/json" \
-d @tasks-backup.json

const fetch = require('node-fetch');
const API_BASE = 'http://localhost:3003';
const API_KEY = 'your-api-key-here';
async function createTask(title, description) {
const response = await fetch(`${API_BASE}/task/v1/tasks`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${API_KEY}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({ title, description }),
});
return await response.json();
}
async function listTasks(filters = {}) {
const params = new URLSearchParams(filters);
const response = await fetch(`${API_BASE}/task/v1/tasks?${params}`, {
headers: {
'Authorization': `Bearer ${API_KEY}`,
},
});
return await response.json();
}
// Usage
(async () => {
const task = await createTask('Learn Express.js', 'Build a REST API');
console.log('Created:', task);
const tasks = await listTasks({ completed: false });
console.log('Incomplete tasks:', tasks.data);
})();

import requests
API_BASE = 'http://localhost:3003'
API_KEY = 'your-api-key-here'
HEADERS = {
'Authorization': f'Bearer {API_KEY}',
'Content-Type': 'application/json'
}
# Create a task
response = requests.post(
f'{API_BASE}/task/v1/tasks',
headers=HEADERS,
json={
'title': 'Complete project',
'priority': 90,
'tags': ['work', 'important']
}
)
task = response.json()
print(f"Created task: {task['id']}")
# List tasks with filters
response = requests.get(
f'{API_BASE}/task/v1/tasks',
headers=HEADERS,
params={'completed': 'false', 'sort_by': 'priority', 'order': 'desc'}
)
tasks = response.json()
print(f"Found {tasks['total']} tasks")
# Update a task
response = requests.patch(
f'{API_BASE}/task/v1/tasks/{task["id"]}',
headers=HEADERS,
json={'completed': True}
)
updated_task = response.json()
print(f"Completed: {updated_task['completion_date']}")

You can test the API using curl, Postman, or any HTTP client:
# Example: Test health endpoint
curl http://localhost:3003/health
# Example: Create a task (requires authentication)
curl -X POST http://localhost:3003/task/v1/tasks \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"title": "Test task"}'

For comprehensive testing, consider using tools like:
- Postman collections
- REST Client VS Code extension
- Custom test scripts with your preferred testing framework
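
For example, a small smoke-test script with Node's built-in test runner (a sketch assuming Node 18+ and an API key in a TASK_API_KEY environment variable):

```js
// Sketch: smoke tests against a running server. Run with:
//   TASK_API_KEY=... node --test smoke.test.js
const test = require('node:test');
const assert = require('node:assert');

const BASE = process.env.TASK_API_BASE || 'http://localhost:3003';
const headers = {
  Authorization: `Bearer ${process.env.TASK_API_KEY}`,
  'Content-Type': 'application/json',
};

test('health endpoint responds', async () => {
  const res = await fetch(`${BASE}/health`);
  assert.ok(res.ok);
});

test('task can be created and deleted', async () => {
  const created = await fetch(`${BASE}/task/v1/tasks`, {
    method: 'POST', headers, body: JSON.stringify({ title: 'smoke test task' }),
  });
  assert.strictEqual(created.status, 201);
  const task = await created.json();

  const deleted = await fetch(`${BASE}/task/v1/tasks/${task.id}`, { method: 'DELETE', headers });
  assert.strictEqual(deleted.status, 204);
});
```
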
TaskProvider/
├── server.js # Main Express.js server (~1700 lines)
│ ├── Environment validation and setup
│ ├── Security middleware (helmet, CORS, rate limiting)
│ ├── Per-user database management with idle cleanup
│ ├── Shared database initialization
│ ├── Passport OAuth strategies (Google, GitHub)
│ ├── Authentication middleware (dual auth: sessions + API keys)
│ ├── Token prefix indexing for fast API key lookups
│ ├── ETag generation and validation
│ ├── Input validation helpers
│ ├── Helper functions (UUID generation, task conversion, filtering)
│ ├── Idempotency middleware
│ ├── OAuth routes with configurable redirect
│ ├── Auth API endpoints (login, logout, API key management)
│ ├── Task API endpoints (CRUD, import/export)
│ ├── Health check with database connectivity test
│ └── Graceful shutdown handler
│
├── data/ # Database directory (auto-created)
│ ├── shared.db # Shared database (users, auth, OAuth)
│ ├── shared.db-wal # Write-ahead log for shared DB
│ ├── shared.db-shm # Shared memory for shared DB
│ ├── {userId}.db # Per-user task database (created per user)
│ ├── {userId}.db-wal # Write-ahead log for user DB
│ └── {userId}.db-shm # Shared memory for user DB
│
├── public/ # Static files directory (optional)
│ └── index.html # Landing page with API info
│
├── package.json # Node.js dependencies
├── package-lock.json # Locked dependency versions
├── .env.example # Example environment config
├── .env # Your config (create this, gitignored)
├── .gitignore # Git ignore rules
│
├── README.md # This file
└── openapi.yaml # OpenAPI 3.1.0 specification
server.js - The heart of the application. Single-file Express server that:
- Validates environment configuration on startup (enforces JWT_SECRET requirements)
- Sets up per-user SQLite databases with automatic schema creation
- Implements idle database connection cleanup (30-minute timeout)
- Configures Passport.js for Google and GitHub OAuth
- Implements dual authentication: JWT sessions + bcrypt-hashed API keys with prefix indexing
- Provides comprehensive input validation for all task fields
- Provides RESTful Task API with full CRUD operations
- Implements ETag support for caching and optimistic concurrency
- Implements rate limiting (5/15min for auth, 100/min for tasks)
- Handles user management and API key generation
- Implements idempotency for safe request retries
- Health check with database connectivity testing
- Graceful shutdown with proper database cleanup
openapi.yaml - OpenAPI 3.1.0 specification that:
- Documents all task API endpoints under
/task/v1/* - Defines request/response schemas
- Describes authentication requirements
- Provides examples for all operations
- Note: Auth endpoints (`/auth/v1/*`) are implemented but not in the spec
public/index.html - Simple landing page that:
- Welcomes users to the API
- Provides links to documentation
- Shows OAuth login options
- Displays server status and API information
Tokens are stored in the auth_tokens table in the shared SQLite database with the following features:
- Token Generation: Cryptographically secure random tokens (32 bytes, base64url encoded)
- Named Tokens: Associate API keys with descriptive names for easy management
- Expiration Support: JWT tokens expire (15min access, 30d refresh); API keys never expire
- Usage Tracking: Automatically tracks last_used_at timestamp on each request
- Token Validation: Server validates tokens against the database and checks expiration/revocation
- Performance Optimization: Token prefix indexing narrows a lookup to a single candidate row instead of bcrypt-comparing every stored key
- API Keys: Hashed with bcrypt (10 rounds) before storage - plaintext never stored
- Token Prefix: First 12 characters stored unhashed for indexed lookup (not security-sensitive)
- JWT Tokens: Signed with JWT_SECRET and tracked in database for revocation
- Database Cleanup: Idle database connections closed after 30 minutes
- Expired tokens are automatically rejected by the server
- API keys cannot be retrieved after creation - only listed by name and metadata
- Delete tokens immediately when they're no longer needed
Error: Cannot find module 'express'
# Solution: Install dependencies
npm install

Error: EADDRINUSE: address already in use
# Solution: Kill process on port 3003
lsof -ti:3003 | xargs kill -9
# Or change port
PORT=3004 node server.js

Error: Redirect URI mismatch
- Check that `BASE_URL` in .env matches your OAuth app settings
- Ensure callback URLs are configured correctly:
  - Google: `{BASE_URL}/auth/v1/oauth/google/callback`
  - GitHub: `{BASE_URL}/auth/v1/oauth/github/callback`
OAuth providers not available
- Check that `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` are set (for Google)
- Check that `GITHUB_CLIENT_ID` and `GITHUB_CLIENT_SECRET` are set (for GitHub)
- Restart the server after updating the .env file
Error: 401 Unauthorized
- Verify your API key is correct
- Check that token hasn't been revoked
- For session cookies, ensure you're logged in via OAuth
- Access tokens expire after 15 minutes - use refresh token
Error: Invalid or expired token
- Access tokens expire after 15 minutes
- Use refresh token endpoint to get new access token
- Or login again via OAuth
Error: database is locked
- SQLite is in WAL mode which should prevent this
- Check file permissions on database files in `./data/`
- Ensure no other process is accessing the databases
- Idle connections are automatically closed after 30 minutes
Database corruption
# Backup and rebuild (do this for both shared and user databases)
cp data/shared.db data/shared.db.backup
sqlite3 data/shared.db "VACUUM;"
# For user databases
cp data/USER_ID.db data/USER_ID.db.backup
sqlite3 data/USER_ID.db "VACUUM;"

High memory usage
- The server automatically closes idle database connections after 30 minutes
- Check the number of active users and open connections
- Consider adjusting DB_IDLE_TIMEOUT if needed
Enable verbose logging:
// Add to server.js after line 24 (app.use(cookieParser());)
app.use((req, res, next) => {
console.log(`${req.method} ${req.path}`, req.query, req.user?.email);
next();
});

- Database Indexes - For large datasets, add indexes (the per-user tasks table has no user_id column, so index the filter columns directly):
CREATE INDEX idx_tasks_completed ON tasks(completed);
CREATE INDEX idx_tasks_priority ON tasks(priority);
CREATE INDEX idx_tasks_due_date ON tasks(due_date);
- Connection Pooling - For high traffic, consider using better-sqlite3 in read-only mode for GET requests
- Caching - Add Redis for token caching and rate limiting
- Monitoring - Use tools like PM2 for process management and monitoring
- JWT Secret - Server enforces 32+ character requirement on startup ✅
- API Key Hashing - API keys are bcrypt-hashed before storage ✅
- Rate Limiting - Enabled for both auth (5/15min) and task (100/min) endpoints ✅
- Input Validation - Comprehensive validation for all task fields ✅
- Use HTTPS in production - Set up reverse proxy (nginx/Apache)
- Regular backups - Back up the entire `./data/` directory (includes all databases)
- Monitor logs - Watch for failed auth attempts and rate limit violations
- Rotate API keys - Revoke unused keys regularly via `/auth/v1/api-keys/:jti`
- CORS Configuration - Set `CORS_ORIGIN` to your specific domains only
- Environment Variables - Never commit the `.env` file to version control
- Review the Configuration and Authentication Setup sections above
- Check server logs for error messages
- Test the health endpoint: `curl http://localhost:3003/health`
- Use the `/auth/v1/me` endpoint to verify authentication is working
Q: Can I use this in production? A: Yes! The server includes production-ready features:
- Enforced JWT_SECRET validation (32+ chars required)
- bcrypt-hashed API keys with indexed lookups
- Rate limiting on all endpoints
- Comprehensive input validation
- ETag support for caching
- Idle connection cleanup
- Security headers (helmet)
- Graceful shutdown
- Database health monitoring
Just ensure you also set NODE_ENV=production, use HTTPS, back up regularly, and monitor logs.
Q: How do I reset the database?
A: Delete the entire ./data/ directory and restart the server. New databases will be created automatically.
Q: How do I backup the database?
A: Backup the entire ./data/ directory, which contains:
- `shared.db` (user accounts and auth)
- All `{userId}.db` files (per-user task data)
Q: Can I use MySQL/PostgreSQL instead of SQLite? A: The code uses better-sqlite3 which is SQLite-specific. You'd need to rewrite database code to use a different driver (e.g., pg for PostgreSQL).
Q: How many users can this handle? A: The per-user database architecture scales well. Each user has their own database file, eliminating contention. SQLite can handle thousands of concurrent readers. The idle connection cleanup prevents memory issues.
Q: Can I deploy this to Vercel/Netlify? A: Serverless platforms don't support SQLite well (no persistent filesystem). Use platforms with persistent storage (Heroku, DigitalOcean, AWS EC2, Render) or switch to PostgreSQL.
Q: Is the data encrypted? A: API keys are bcrypt-hashed. Database files are not encrypted at rest. For sensitive data, use SQLite encryption extensions (SQLCipher) or full-disk encryption at the OS level.
Q: What happens when a user is deleted?
A: The ON DELETE CASCADE constraint removes OAuth accounts and auth tokens from the shared database. However, the user's database file ({userId}.db) remains on disk and must be manually deleted if desired.
MIT