GWC-Data/node-server-engine

Node Server Engine

Framework used to develop Node backend services. This package ships with many features that standardize the creation of services, letting you focus on the business logic.

npm version License: ISC semantic-release

Features

  • 🚀 Express-based - Built on the popular Express.js framework (v4.22)
  • 🔒 Multiple Auth Methods - JWT, mTLS, HMAC, and Static token authentication
  • 🔌 WebSocket Support - Built-in WebSocket server with message handling (ws v8.18)
  • 📊 Database Integration - Sequelize ORM with migrations support (v6.37)
  • 📡 Pub/Sub - Google Cloud Pub/Sub integration (v4.11)
  • 🔔 Push Notifications - Built-in push notification support
  • 🌐 i18n - Internationalization with translation management
  • 🔍 ElasticSearch - Full-text search integration (v9.2) with auto-migrations
  • 💾 Redis - Advanced Redis client with retry logic, TLS support (ioredis v5.8)
  • 🔑 Secret Manager - GCP Secret Manager integration for secure credential management
  • 📝 API Documentation - Swagger/OpenAPI documentation support
  • 📤 File Uploads - Single and chunked file upload middleware with validation
  • 🧪 TypeScript - Written in TypeScript with full type definitions
  • 🛠️ Modern Tooling - ESLint, Prettier, and automated versioning
  • 🛡️ Permission System - Role-based access control with case-insensitive matching
  • 🔐 Security - HMAC authentication, TLS/mTLS support, input validation

Requirements

Install

To start a new service, it is highly recommended that you clone our service template, which already includes all the necessary tooling and boilerplate.

If you need to install it manually:

npm install node-server-engine

For development dependencies:

npm install --save-dev backend-test-tools

Logging

The server provides structured logging with automatic format detection. In local development, logs are colorized and human-readable. In production (GCP, Kubernetes), logs are JSON formatted for log aggregation systems.

Log Format

The log format is detected automatically from the environment:

  • Local Development: Colorized, concise format with time (HH:MM:SS)
  • Production: JSON structured logs for cloud log aggregation

Environment Variables

| Variable | Values | Description | Default |
| --- | --- | --- | --- |
| LOG_FORMAT | local, json | Force a specific log format | Auto-detect |
| DETAILED_LOGS | true, false | Show stack traces and verbose details | false |
| DEBUG | namespace:* | Enable debug logs for specific namespaces | Off |
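The DEBUG variable follows the comma-separated, wildcard namespace convention popularized by the debug package. Below is a minimal sketch of that matching convention, for illustration only; the engine's actual matcher may differ (for example, it may also support `-` exclusion patterns):

```javascript
// Conventional DEBUG pattern matching (the semantics of the popular "debug"
// package); illustrative only, not the engine's exact implementation.
const escapeRegExp = (s) => s.replace(/[.+?^${}()|[\]\\]/g, '\\$&');

function debugEnabled(namespace, debugEnv) {
  return debugEnv
    .split(',')
    .map((p) => p.trim())
    .filter(Boolean)
    .some((pattern) => {
      // "*" matches any run of characters inside the namespace.
      const re = new RegExp(
        '^' + pattern.split('*').map(escapeRegExp).join('.*') + '$'
      );
      return re.test(namespace);
    });
}

console.log(debugEnabled('app:auth', 'app:*'));           // true
console.log(debugEnabled('engine:ws', 'app:*'));          // false
console.log(debugEnabled('engine:ws', 'app:*,engine:*')); // true
```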

Examples

Default Local Format (clean, concise):

[21:16:15] INFO     SERVER_RUNNING
[21:16:15] INFO     Connected to database successfully
[21:16:15] DEBUG    POST /auth/login [200] 154ms
[21:20:15] DEBUG    GET /users [304] 10ms
[21:20:15] WARNING  No bearer token found [unauthorized 401] GET /users

Detailed Logs (DETAILED_LOGS=true):

[21:16:15] DEBUG    POST /auth/login [200] 154ms
Data:
{
  "responseTime": "154ms",
  "contentLength": "1012",
  "httpVersion": "1.1"
}

[21:20:15] WARNING  No bearer token found [unauthorized 401] GET /users
Stack Trace:
Error: No bearer token found
    at middleware (/path/to/authJwt.ts:31:13)
    ...
  src/middleware/authJwt/authJwt.ts:31

Production Format (LOG_FORMAT=json):

{"severity":"INFO","message":"SERVER_RUNNING","timestamp":"2025-12-09T21:16:15.028Z","serviceContext":{"service":"my-service","version":"1.0.0"}}
{"severity":"DEBUG","message":"POST /auth/login [200] 154ms","timestamp":"2025-12-09T21:16:15.182Z"}

Usage in Code

import { reportInfo, reportError, reportDebug } from 'node-server-engine';

// Info logging
reportInfo('Server started successfully');
reportInfo({ message: 'User login', data: { userId: '123' } });

// Error logging
reportError(error);
reportError(error, request); // Include HTTP request context

// Debug logging (requires DEBUG env var)
reportDebug({ namespace: 'app:auth', message: 'Token validated' });

Entities

Server

The Server class encapsulates all of the Express boilerplate. Instantiate one per service and initialize it to get started.

import { Server } from 'node-server-engine';
const server = new Server(config);
server.init();

Server Configuration

| Property | Type | Behavior | Default |
| --- | --- | --- | --- |
| port | Number | Port to listen on | process.env.PORT |
| endpoints | Array<Endpoint> | List of endpoints that should be served | [] |
| globalMiddleware | Array<Function> or Array<{middleware: Function, path: String}> | Middlewares executed before each endpoint's logic. If given as an object with a path, the middleware is only applied to requests with that base path. | [] |
| errorMiddleware | Array<Function> or Array<{middleware: Function, path: String}> | Middlewares executed after each endpoint's logic. If given as an object with a path, the middleware is only applied to requests with that base path. | [] |
| initCallbacks | Array<Function> | Functions called on server start | [] |
| syncCallbacks | boolean | Forces the init callbacks to run one after the other instead of in parallel | false |
| cron | Array<Object> | Cron jobs started on server start | [] |
| shutdownCallbacks | Array<Function> | Functions called on server shutdown | [] |
| checkEnvironment | Object | Schema against which environment variables are verified; the server terminates if they are not properly set | {} |
| secretManager | SecretManagerOptions | Configuration for GCP Secret Manager integration. Secrets are loaded at startup, before any other initialization. | undefined |
| webSocket.server | Object | Settings used to create a WebSocket server. See the ws package documentation for details. | - |
| webSocket.client | Object | Settings passed down to SocketClient when a new socket connection is established. | - |

Endpoint

Endpoint encapsulates the logic of a single endpoint in a standard way.

The main function of the endpoint is called the handler. This function should only handle pure business logic for a given endpoint.

An endpoint usually has a validator. A validator is a schema that will be compared to the incoming request. The request will be denied if it contains illegal or malformed arguments. For more detail see the documentation of the underlying package express-validator.

import { Endpoint, EndpointMethod } from 'node-server-engine';

// A basic handler that returns an HTTP status code of 200 to the client
function handler(req, res) {
  res.sendStatus(200);
}

// The request must contain `id` as a query string and it must be a UUID V4
const validator = {
  id: {
    in: 'query',
    isUUID: {
      options: 4
    }
  }
};

// This endpoint can be passed to the Server
new Endpoint({
  path: '/demo',
  method: EndpointMethod.GET,
  handler,
  validator
});

Endpoint Configuration

new Endpoint(config)

| Property | Type | Behavior | Default |
| --- | --- | --- | --- |
| path | String | Path on which the endpoint should be served | required |
| method | Method | HTTP method on which the endpoint should be served | required |
| handler | Function | Endpoint handler | required |
| validator | Object | Schema to validate the request against. See the express-validator documentation for more details. | - |
| authType | AuthType | Authentication to use for this endpoint | AuthType.NONE |
| authParams | Object | Options specific to the authentication method | {} |
| files | Array<Object> | Configuration for file uploads. See the dedicated documentation. | [] |
| middleware | Array<Function> | Middlewares to run before the handler | [] |
| errorMiddleware | Array<Function> | Middlewares to run after the handler | [] |

Example:

const addNoteEndpoint = new Endpoint({
  path: '/note',
  method: EndpointMethod.POST,
  handler: (req, res, next) => res.json(addNote(req.body)),
  authType: EndpointAuthType.AUTH_JWT,
  middleware: [checkPermission(['DeleteUser', 'AdminAccess'])],
  errorMiddleware: [addNoteErrorMiddleware],
  validator: {
    id: {
      in: 'body',
      isUUID: true
    },
    content: {
      in: 'body',
      isLength: {
        errorMessage: 'content too long',
        options: {
          max: 150
        }
      }
    }
  }
});
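The checkPermission middleware used above relies on case-insensitive matching of permission names. As a standalone sketch (an illustration only: this helper, and its grant-on-any-match semantics, are assumptions rather than the engine's actual implementation):

```javascript
// Case-insensitive permission check, sketched as a plain helper.
// Assumption: access is granted when the user holds at least one of the
// required permissions; the engine's real semantics may be stricter.
function hasPermission(userPermissions, requiredPermissions) {
  const held = new Set(userPermissions.map((p) => p.toLowerCase()));
  return requiredPermissions.some((p) => held.has(p.toLowerCase()));
}

console.log(hasPermission(['adminaccess'], ['DeleteUser', 'AdminAccess'])); // true
console.log(hasPermission(['viewer'], ['DeleteUser', 'AdminAccess']));      // false
```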

Methods

The following HTTP methods are supported

  • Method.GET
  • Method.POST
  • Method.PUT
  • Method.PATCH
  • Method.DELETE
  • Method.ALL - respond to all requests on a path

Authentication

Endpoints can take authType and authParams in their configuration to determine their authentication behavior. The following table summarizes their usage.

The server engine exposes an enumeration for auth types: import { AuthType } from 'node-server-engine';

| AuthType | Description | AuthParams |
| --- | --- | --- |
| AuthType.NONE | No authentication; all requests are handled | - |
| AuthType.JWT | A valid JSON Web Token is required as a Bearer token. To be valid it must be properly signed by auth0, and its payload must match what is set in the environment variables. The user's ID is added to req.user. | - |
| AuthType.TLS | Authenticate through mutual TLS. The CA and an optional list of whitelisted host names should be set in the environment variables. | whitelist [Array]: list of certificate Common Names or Alt Names that are permitted to make requests to this endpoint |
| AuthType.HMAC | Authenticate with a signature in the payload. This authentication method is deprecated and should be avoided. | secret [String]: overrides the signature secret set in the environment variables. isGithub [Boolean]: the request is a GitHub webhook and therefore uses GitHub's signature system instead of the default one. |
| AuthType.STATIC | A valid shared Bearer token is required. The shared token is stored in the STATIC_TOKEN environment variable. | - |
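For the isGithub case, GitHub webhooks carry an HMAC-SHA256 digest of the raw request body in the X-Hub-Signature-256 header, formatted as sha256=&lt;hex&gt;. A verification sketch using Node's crypto module (illustrative; the engine's default, non-GitHub signature scheme is not documented here and is not shown):

```javascript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Verify a GitHub-style webhook signature: HMAC-SHA256 over the raw body,
// keyed with the shared webhook secret, hex-encoded with a "sha256=" prefix.
function verifyGithubSignature(rawBody, signatureHeader, secret) {
  const expected =
    'sha256=' + createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader);
  // Constant-time comparison avoids leaking the digest through timing.
  return a.length === b.length && timingSafeEqual(a, b);
}

const secret = 'shhh';
const body = '{"action":"opened"}';
const header =
  'sha256=' + createHmac('sha256', secret).update(body).digest('hex');
console.log(verifyGithubSignature(body, header, secret)); // true
```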

File Upload Middleware

This middleware handles multipart file uploads in an Express application. It processes files in memory, validates them based on configuration options, and ensures that required files are uploaded.

Usage

The request must be made using multipart/form-data. The uploaded files will be available in req.files.

The following settings can be used on each object in the Endpoint's files option.

| Property | Type | Description | Default |
| --- | --- | --- | --- |
| key | string | Form key under which the file is expected | required |
| maxSize | string | Maximum file size in a human-readable format (e.g. 5MB) | required |
| mimeTypes | Array<string> | List of accepted MIME types | [] |
| required | boolean | Fail the request and store no files if this file is missing | false |
| noExtension | boolean | Store the file without an extension | false |

Example
import { body } from 'express-validator';
import { Endpoint, middleware, AuthType, Method } from 'node-server-engine';

const filesConfig = [
  { key: 'avatar', mimeTypes: ['image/png', 'image/jpeg'], required: true },
  { key: 'document', mimeTypes: ['application/pdf'], maxSize: '5MB' }
];

new Endpoint({
  path: '/upload',
  method: Method.POST,
  authType: AuthType.JWT,
  files: filesConfig,
  handler: (req, res) => {
    res.json({ message: 'Files uploaded successfully', files: req.files });
  }
});

Middleware Output

The middleware adds a files array to the req object, containing information about each uploaded file.

 [
    {
      "fieldname": "avatar",
      "originalname": "profile.png",
      "mimetype": "image/png",
      "size": 204800,
      "buffer":[]
    },
    {
      "fieldname": "document",
      "originalname": "resume.pdf",
      "mimetype": "application/pdf",
      "size": 512000,
      "buffer":[]
    }
  ]

Features

  • Supports multiple file uploads.
  • Validates file types and sizes.
  • Ensures required files are uploaded.
  • Uses memory storage (files are not saved to disk).

This middleware simplifies file handling in Express, making it easy to manage uploads while enforcing validation rules.
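The maxSize option accepts human-readable sizes such as 5MB. The sketch below shows the conversion such a limit implies (illustrative only; the engine most likely delegates this parsing to a library):

```javascript
// Illustrative parser for human-readable size limits such as "5MB" or "100KB".
// Powers of 1024 are assumed; the engine's actual parsing may differ.
const UNITS = { b: 1, kb: 1024, mb: 1024 ** 2, gb: 1024 ** 3 };

function parseSize(input) {
  const match = /^(\d+(?:\.\d+)?)\s*(b|kb|mb|gb)$/i.exec(input.trim());
  if (!match) throw new Error(`Unparseable size: ${input}`);
  return Math.round(parseFloat(match[1]) * UNITS[match[2].toLowerCase()]);
}

console.log(parseSize('5MB'));   // 5242880
console.log(parseSize('100KB')); // 102400
```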


Multipart File Upload Middleware

This middleware enables chunked file uploads in an Express application. It allows uploading large files by splitting them into smaller chunks, validating them, and merging them once all parts are received.

Usage

The request must be made using multipart/form-data. The uploaded chunks are processed in memory before being stored in temporary directories. Once all chunks are uploaded, they are merged into a single file.

Configuration Options

The following settings can be used when configuring file uploads:

| Property | Type | Description | Default |
| --- | --- | --- | --- |
| maxSize | string | Maximum allowed size for each chunk (e.g. "5MB") | No limit |
| required | boolean | Whether the file is mandatory for the request | false |

Expected Request Format

The client must send the following fields in the multipart/form-data request:

| Field | Type | Description |
| --- | --- | --- |
| file | File | The chunked file data |
| filename | String | Name of the original file |
| uniqueID | String | Unique identifier for the upload session |
| chunkIndex | Number | Current chunk number (0-based index) |
| totalChunks | Number | Total number of chunks for the file |

Middleware Output

The middleware adds a multipartFile object to the req object, which contains information about the uploaded file.

While not all chunks have been received, req.multipartFile contains:

{
  "isPending": true,
  "originalname": "example.pdf",
  "uniqueID": "abc123",
  "chunkIndex": 2,
  "totalChunks": 5
}

When the upload is complete, req.multipartFile contains:

{
  "isPending": false,
  "originalname": "example.pdf",
  "uniqueID": "abc123",
  "filePath": "/uploads/completed_files/abc123_example.pdf"
}

Example

import { body } from 'express-validator';
import { Endpoint, middleware, AuthType, Method } from 'node-server-engine';

const fileConfig = { maxSize: '10MB', required: true };

new Endpoint({
  path: '/upload',
  method: Method.POST,
  authType: AuthType.JWT,
  multipartFile: fileConfig,
  handler: (req, res) => {
    console.log(req.multipartFile);
  }
});
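On the client side, producing the fields expected above comes down to slicing the file into fixed-size chunks and numbering them. A minimal sketch (the 1 MB chunk size is an arbitrary choice; each part would then be sent as a separate multipart/form-data request together with filename and uniqueID):

```javascript
// Split a payload into fixed-size chunks and derive the chunkIndex and
// totalChunks fields the middleware expects. 1 MB is an arbitrary choice.
const CHUNK_SIZE = 1024 * 1024;

function* chunk(buffer, size = CHUNK_SIZE) {
  const totalChunks = Math.ceil(buffer.length / size);
  for (let i = 0; i < totalChunks; i++) {
    yield {
      chunkIndex: i,
      totalChunks,
      data: buffer.subarray(i * size, (i + 1) * size)
    };
  }
}

const file = Buffer.alloc(2.5 * 1024 * 1024); // 2.5 MB dummy file
const parts = [...chunk(file)];
console.log(parts.length);         // 3
console.log(parts[2].data.length); // 524288 (the last, partial chunk)
```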

Socket Client

The WebSocket server starts automatically if the Server's webSocket option is provided.

Each WebSocket connection creates a new SocketClient instance with built-in authentication, message routing, and connection management.

Features

  • JWT Authentication: Secure token-based authentication with automatic renewal
  • Message Routing: Type-based message handlers similar to HTTP endpoints
  • Connection Health: Built-in ping/pong mechanism with configurable intervals
  • Lifecycle Callbacks: Hooks for initialization, authentication, and shutdown
  • Error Handling: Automatic error formatting and client notification

Socket Client Options

Options can be set when configuring the Server's webSocket:

import { Server, MessageHandler } from 'node-server-engine';

const server = new Server({
  webSocket: {
    client: {
      handlers: [messageHandler1, messageHandler2],
      initCallbacks: (client) => {
        console.log(`Client ${client.id} connected`);
      },
      authCallbacks: (client) => {
        const user = client.getUser();
        console.log(`User ${user.userId} authenticated`);
      },
      shutdownCallbacks: (client) => {
        console.log(`Client ${client.id} disconnected`);
      }
    }
  }
});

| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| handlers | Array<MessageHandler> | List of message handlers to use | required |
| authCallbacks | Function or Array<Function> | Callbacks called when a client successfully authenticates | [] |
| initCallbacks | Function or Array<Function> | Callbacks called when the socket client is created | [] |
| shutdownCallbacks | Function or Array<Function> | Callbacks called when the socket client is destroyed | [] |

Client Properties & Methods

| Property/Method | Type | Description |
| --- | --- | --- |
| id | String | Unique identifier for the connection |
| establishedAt | Date | Timestamp when the connection was established |
| isAuthenticated() | () => Boolean | Check whether the client is authenticated |
| getUser() | () => SocketUser or undefined | Get authenticated user data (userId, deviceId, tokenId, audience) |
| sendMessage() | (type, payload, options) | Send a message to the client |

Client Authentication

Clients authenticate by sending a message:

// Client-side
websocket.send(JSON.stringify({
  type: 'authenticate',
  payload: { token: 'your-jwt-token' }
}));

// Server will:
// 1. Verify the JWT token
// 2. Extract user information (userId, deviceId, audience)
// 3. Set authentication status
// 4. Trigger authCallbacks
// 5. Send renewal reminder 1 minute before token expiration

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| WS_PING_INTERVAL | Interval between ping checks (seconds) | 30 |
| WEBSOCKET_CLOSE_TIMEOUT | Timeout for graceful close (milliseconds) | 3000 |

Message Handler

Message handlers are similar to Endpoints, but in a WebSocket context. They define how incoming messages should be handled.

import { MessageHandler, Server } from 'node-server-engine';

function handler(payload, client) {
  // payload is the message payload in a standard message.
  // client is an instance of SocketClient
}

const messageHandler = new MessageHandler(type, handler, options);

new Server({
  webSocket: { client: { handlers: [messageHandler] } }
});

| Argument | Type | Description | Default |
| --- | --- | --- | --- |
| type | String | The message type that should be handled | required |
| handler | Function | Function that runs for every message of this type | required |
| options.authenticated | Boolean | Whether handling this message type requires an authenticated client | true |

Redis

The server engine exposes a Redis client configured for production use with automatic reconnection, retry logic, and optional TLS support.

It is a pre-configured instance of ioredis v5.8.2. See the package documentation for more details.

import { Redis } from 'node-server-engine';

// Initialize (automatically called by Server, or manually)
Redis.init();

// Use Redis commands
await Redis.set('key', 'value');
const value = await Redis.get('key');
await Redis.del('key');
await Redis.expire('key', 3600);

// Get underlying client for advanced use
const client = Redis.getClient();

// Shutdown when done
await Redis.shutdown();

Features

  • Automatic Reconnection: Exponential backoff retry strategy (up to 2s delay)
  • Error Recovery: Reconnects on READONLY, ECONNRESET, and ETIMEDOUT errors
  • Connection Management: Event listeners for connect, ready, error, close, reconnecting
  • TLS Support: Automatic TLS configuration when TLS_CA is provided
  • Test Mode: Lazy connection in test environments to prevent connection attempts
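A typical ioredis retry strategy matching the behavior described above grows the delay with each attempt and caps it at 2 seconds. The multiplier below is an assumption for illustration; only the 2 s cap is documented:

```javascript
// Shape of an ioredis retryStrategy: takes the attempt count, returns the
// delay in milliseconds before the next reconnection attempt. The 50 ms step
// is illustrative; only the 2000 ms cap matches the documented behavior.
function retryStrategy(times) {
  return Math.min(times * 50, 2000);
}

console.log(retryStrategy(1));   // 50
console.log(retryStrategy(10));  // 500
console.log(retryStrategy(100)); // 2000
```

A function of this shape can be supplied through the redis override object accepted by createRedisClient (see Configuration Options below), since retryStrategy is a standard ioredis option.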

Environment Variables

| Variable | Description | Default |
| --- | --- | --- |
| REDIS_HOST | Redis server hostname or IP | - |
| REDIS_PORT | Redis server port | 6379 |
| REDIS_USERNAME | Username for authentication | - |
| REDIS_PASSWORD | Password for authentication | - |
| TLS_CA | TLS certificate authority for SSL/TLS | - |

Configuration Options

You can customize Redis client creation:

import { createRedisClient } from 'node-server-engine';

const client = createRedisClient({
  db: 2,                    // Database index (default: 0)
  lazyConnect: true,        // Don't connect until first command
  enableReadyCheck: false,  // Disable ready check
  redis: {                  // Override any ioredis options
    connectTimeout: 10000,
    commandTimeout: 5000
  }
});

SecretManager

The SecretManager entity provides seamless integration with GCP Secret Manager for secure credential management in production environments. It automatically loads secrets at startup, writes file-based secrets (like certificates) to secure temp locations, and falls back to environment variables in development.

import { SecretManager } from 'node-server-engine';

// Initialize (can be done automatically via Server configuration)
await SecretManager.init({
  enabled: process.env.NODE_ENV === 'production',
  projectId: process.env.GCP_PROJECT_ID,
  cache: true,
  fallbackToEnv: true,
  secrets: [
    'SQL_PASSWORD',          // Simple env variable
    'JWT_SECRET',
    {
      name: 'PRIVATE_KEY',   // File-based secret
      type: 'file',
      targetEnvVar: 'PRIVATE_KEY_PATH',
      filename: 'private-key.pem'
    }
  ]
});

// Get a cached secret value
const password = SecretManager.getSecret('SQL_PASSWORD');

// Fetch a secret on-demand (useful for rotation)
const apiKey = await SecretManager.fetchSecret('API_KEY');

// Reload all secrets
await SecretManager.reload();

// Check initialization status
if (SecretManager.isInitialized()) {
  console.log('Secrets loaded');
}

Features

  • Automatic Loading: Secrets loaded during server initialization
  • Environment Fallback: Uses process.env in development or when secrets are unavailable
  • File Support: Writes certificates and keys to temp files with secure permissions (0o600)
  • Caching: Optional caching of secret values for performance
  • Secret Rotation: On-demand fetching for runtime secret updates
  • Lifecycle Management: Automatic cleanup of temp files on shutdown

Server Integration

SecretManager can be configured directly in Server options:

import { Server, SecretManagerOptions } from 'node-server-engine';

const server = new Server({
  endpoints: [...],
  secretManager: {
    enabled: process.env.NODE_ENV === 'production',
    projectId: process.env.GCP_PROJECT_ID,
    cache: true,
    fallbackToEnv: true,
    secrets: [
      'SQL_PASSWORD',
      'JWT_SECRET',
      {
        name: 'PRIVATE_KEY',
        type: 'file',
        targetEnvVar: 'PRIVATE_KEY_PATH',
        filename: 'private-key.pem'
      }
    ]
  }
});

await server.init(); // Secrets loaded before app starts

Configuration Options

| Property | Type | Description | Default |
| --- | --- | --- | --- |
| enabled | boolean | Enable Secret Manager (typically only in production) | false |
| projectId | string | GCP project ID | required |
| cache | boolean | Cache secret values in memory | true |
| fallbackToEnv | boolean | Fall back to process.env if secret loading fails | true |
| tempDir | string | Directory for file-based secrets | os.tmpdir() |
| secrets | Array<string or SecretConfig> | List of secrets to load | [] |

Secret Configuration

Secrets can be specified as strings (simple env variables) or objects for advanced configuration:

String format (simple env variable):

'SQL_PASSWORD'  // Loads GCP secret "SQL_PASSWORD" → process.env.SQL_PASSWORD

Object format (advanced configuration):

{
  name: 'PRIVATE_KEY',              // Secret name in GCP Secret Manager
  type: 'env' | 'file',             // Type: 'env' for variables, 'file' for certificates
  targetEnvVar: 'PRIVATE_KEY_PATH', // Env var name (optional for 'env' type)
  filename: 'private-key.pem',      // Filename for 'file' type (optional)
  version: 'latest'                 // Secret version (default: 'latest')
}

Secret Types

Environment Variable Secrets (type: 'env'):

  • Loaded directly into process.env
  • Good for passwords, API keys, tokens
  • Example: 'SQL_PASSWORD' → process.env.SQL_PASSWORD

File-based Secrets (type: 'file'):

  • Written to temp files with secure permissions (0o600)
  • Good for certificates, private keys, JSON credentials
  • Environment variable points to file path
  • Example: the PRIVATE_KEY secret is written to /tmp/private-key.pem, and the path is stored in process.env.PRIVATE_KEY_PATH

Environment Variables

| Variable | Description | Required |
| --- | --- | --- |
| GCP_PROJECT_ID | GCP project containing secrets | ✓ |
| NODE_ENV | Environment (production/development) | - |

Load Results

The init() and reload() methods return load statistics:

const result = await SecretManager.init({...});
console.log(result);
// {
//   loaded: 3,      // Successfully loaded from Secret Manager
//   failed: 0,      // Failed to load
//   fallback: 1,    // Used fallback environment variables
//   details: [...]  // Detailed information per secret
// }

Security Features

  • Secure File Permissions: File-based secrets written with 0o600 (owner read/write only)
  • Automatic Cleanup: Temp files deleted on server shutdown
  • No Logging: Secret values never logged (only metadata)
  • Lifecycle Integration: Registered with LifecycleController for proper cleanup

Example: Multiple Secrets

const secretConfig = {
  enabled: process.env.NODE_ENV === 'production',
  projectId: 'my-gcp-project',
  secrets: [
    // Database credentials
    'SQL_PASSWORD',
    'SQL_USER',
    
    // API keys
    'STRIPE_API_KEY',
    'SENDGRID_API_KEY',
    
    // Certificate files
    {
      name: 'TLS_CERT',
      type: 'file',
      targetEnvVar: 'TLS_CERT_PATH',
      filename: 'tls.crt'
    },
    {
      name: 'TLS_KEY',
      type: 'file',
      targetEnvVar: 'TLS_KEY_PATH',
      filename: 'tls.key'
    },
    
    // Service account key
    {
      name: 'GCP_SERVICE_ACCOUNT',
      type: 'file',
      targetEnvVar: 'GOOGLE_APPLICATION_CREDENTIALS',
      filename: 'service-account.json'
    }
  ]
};

await SecretManager.init(secretConfig);

AWSS3

The AWSS3 entity is a generic wrapper around AWS S3 (and S3-compatible services like MinIO, LocalStack). It supports uploads, downloads, streaming, metadata, and deletion with size validation and UUID-based path generation.

Based on @aws-sdk/client-s3 v3.

import { AWSS3 } from 'node-server-engine';
import fs from 'fs';

// Initialize with configuration (optional; otherwise uses environment variables)
AWSS3.init({
  region: 'us-east-1',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY',
  // For S3-compatible services
  // endpoint: 'http://localhost:4566',
  // forcePathStyle: true
});

// Or rely on environment variables and auto-init
// AWSS3.init();

// Upload a file
const fileStream = fs.createReadStream('photo.jpg');
const uploaded = await AWSS3.upload(
  fileStream,
  'my-bucket',
  { directory: 'photos', mime: 'image/jpeg' },
  { maxSize: '5MB' },
  { userId: '123' } // optional metadata
);
console.log(uploaded.Key); // photos/<uuid>.jpeg

// Download entire file into memory
const { data, metadata } = await AWSS3.get('my-bucket', uploaded.Key);
console.log(metadata.ContentLength);

// Stream a large file
const { stream } = await AWSS3.download('my-bucket', uploaded.Key);
stream.pipe(res);

// Get a fast stream without metadata
const s = await AWSS3.getFileStream('my-bucket', uploaded.Key);
s.pipe(res);

// Metadata only
const head = await AWSS3.getMetadata('my-bucket', uploaded.Key);

// Delete
await AWSS3.delete('my-bucket', uploaded.Key);

// Generate unique destination path
const key = AWSS3.generateFileDestination({ directory: 'uploads/images', mime: 'image/png' });

Features

  • Auto-initialization via environment variables
  • Stream-based upload with maxSize validation
  • Full download, streaming download, or stream-only access
  • UUID-based file naming with directory and extension inference from MIME
  • S3-compatible (supports custom endpoint and forcePathStyle)

API Methods

init(config?)

Initialize the S3 client explicitly; otherwise it will lazy-init using environment variables.

AWSS3.init({
  region: 'us-east-1',
  accessKeyId: '...',
  secretAccessKey: '...',
  sessionToken: '...', // optional
  endpoint: 'http://localhost:9000', // optional (MinIO/LocalStack)
  forcePathStyle: true // optional (S3-compatible)
});

upload(stream, bucket, destinationOptions?, uploaderOptions?, metadata?)

Uploads content from a readable stream.

  • destinationOptions: { directory?, fileName?, mime?, noExtension? }
  • uploaderOptions: { maxSize?: string } e.g. 5MB, 100KB
  • metadata: key/value pairs stored as object metadata

Returns Promise<S3UploadedFile> with keys like Bucket, Key, ETag, Location.

get(bucket, key)

Downloads the full file into memory as Buffer and returns { data, metadata }.

download(bucket, key)

Returns { stream, metadata } for streaming large files efficiently.

getFileStream(bucket, key)

Returns a Readable stream without fetching metadata.

getMetadata(bucket, key)

Returns file metadata via a HEAD request.

delete(bucket, key)

Deletes the object.

generateFileDestination(options?)

Generates a unique key using UUID with optional directory and extension (from mime).
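A sketch of what such key generation amounts to (illustrative only: the EXT map below is a stand-in for a full MIME-to-extension table, and the real method's edge cases may differ):

```javascript
import { randomUUID } from 'node:crypto';

// Generate a unique object key: optional directory prefix, a UUID, and an
// extension inferred from the MIME type. EXT is a tiny illustrative map.
const EXT = { 'image/png': 'png', 'image/jpeg': 'jpeg', 'application/pdf': 'pdf' };

function generateFileDestination({ directory, mime } = {}) {
  const ext = mime && EXT[mime] ? `.${EXT[mime]}` : '';
  const name = `${randomUUID()}${ext}`;
  return directory ? `${directory}/${name}` : name;
}

const key = generateFileDestination({ directory: 'uploads/images', mime: 'image/png' });
console.log(key); // uploads/images/<uuid>.png
```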

Environment Variables

| Variable | Description | Required |
| --- | --- | --- |
| AWS_REGION | AWS region (e.g., us-east-1) | ✗* |
| AWS_ACCESS_KEY_ID | Access key ID | ✗* |
| AWS_SECRET_ACCESS_KEY | Secret access key | ✗* |
| AWS_SESSION_TOKEN | Session token (temporary credentials) | ✗ |
| AWS_S3_ENDPOINT | Custom S3-compatible endpoint (MinIO/LocalStack) | ✗ |
| AWS_S3_FORCE_PATH_STYLE | Use path-style addressing (true/false) | ✗ |

* Not required if you call init() with a config object.

Error Handling

  • Throws WebError with status 413 when maxSize is exceeded during upload
  • Other errors bubble up from AWS SDK v3 commands

Example:

try {
  await AWSS3.upload(stream, 'bucket', {}, { maxSize: '1MB' });
} catch (e) {
  if (e.statusCode === 413) {
    console.log('File too large');
  }
}

Usage in Template Projects

import { AWSS3, Endpoint, Method } from 'node-server-engine';
import { Readable } from 'stream';

new Endpoint({
  path: '/upload-s3',
  method: Method.POST,
  files: [{ key: 'file', maxSize: '10MB', required: true }],
  async handler(req, res) {
    const file = req.files[0];
    const stream = Readable.from(file.buffer);

    const result = await AWSS3.upload(
      stream,
      process.env.S3_UPLOAD_BUCKET,
      { directory: 'user-uploads', mime: file.mimetype }
    );

    res.json({ key: result.Key, url: result.Location });
  }
});

Sequelize

The server engine exposes an SQL ORM that is configured to work with the standard environment variables that are used in our services.

It is a pre-configured instance of sequelize. See the package documentation for more details.

import { Sequelize } from 'node-server-engine';

Sequelize.sequelize;
Sequelize.closeSequelize();

It can be configured through environment variables:

| env | description | default |
| --- | --- | --- |
| SQL_HOST | Host to connect to | - |
| SQL_PORT | Port on which SQL is served | 5432 |
| SQL_PASSWORD | Password used to authenticate with the SQL server | - |
| SQL_USER | User used to authenticate with the SQL server | - |
| SQL_DB | Database to connect to | - |
| SQL_TYPE | SQL dialect to connect with | postgres |

Pub/Sub

The engine exposes a PubSub entity that can be used to communicate with Google Cloud Pub/Sub with production-ready configuration including flow control, retry policies, and batch processing.

Based on @google-cloud/pubsub v4.11.0.

import { PubSub } from 'node-server-engine';

/**
 * Declare that the service will be publishing to a topic
 * This must be done before init() is called
 * Any attempt to publish a message without declaring a topic first will fail
 * @param {string|Array<string>} topic - The topic(s) to which we will be publishing
 * @param {Object} [options] - Publisher configuration options
 */
PubSub.addPublisher(topic, {
  enableMessageOrdering: false, // Enable ordering (requires orderingKey when publishing)
  batching: {
    maxMessages: 100,        // Max messages per batch
    maxBytes: 1024 * 1024,   // Max bytes per batch (1MB)
    maxMilliseconds: 100     // Max delay before sending batch
  },
  retry: {
    initialDelayMillis: 100,  // Initial retry delay
    maxDelayMillis: 60000,    // Max retry delay (60s)
    delayMultiplier: 1.3      // Exponential backoff multiplier
  }
});

/**
 * Binds a message handler to a subscription
 * If called multiple times, handlers will be chained
 * This must be done before init() is called
 * The subscription will not be consumed until init() is called
 * @param {string} subscription - The subscription to consume
 * @param {function|Array<function>} handler - The message handling function(s)
 * @param {Object} [options] - Subscriber configuration options
 */
PubSub.addSubscriber(subscription, handler, {
  first: false,              // Put handler at beginning of chain
  isDebezium: false,         // Handle Debezium CDC events
  ackDeadline: 60,           // Acknowledgement deadline (10-600 seconds)
  flowControl: {
    maxMessages: 1000,         // Max concurrent messages
    maxBytes: 100 * 1024 * 1024, // Max concurrent bytes (100MB)
    allowExcessMessages: true  // Allow excess messages if under maxBytes
  }
});

/**
 * Establish connection with all the declared publishers/subscribers
 * Validates topic/subscription existence and permissions
 */
await PubSub.init();

/**
 * Send a message through a previously declared publisher
 * @param {string} topic - The name of the topic to which the message should be pushed
 * @param {Object} message - The actual message (will be JSON stringified)
 * @param {Object} [attributes] - Message attributes for filtering
 * @param {string} [orderingKey] - Enforce ordering for messages with same key
 */
await PubSub.publish(topic, message, attributes, orderingKey);

/**
 * Flush all pending messages and close connections with Pub/Sub
 */
await PubSub.shutdown();
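The retry settings shown in addPublisher produce exponential backoff. A small helper makes the resulting delay sequence concrete (illustrative arithmetic only, not engine code):

```javascript
// Each attempt's delay is the previous delay times delayMultiplier,
// capped at maxDelayMillis. Computes the first `attempts` delays.
function retryDelays({ initialDelayMillis, maxDelayMillis, delayMultiplier }, attempts) {
  const delays = [];
  let delay = initialDelayMillis;
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.round(delay));
    delay = Math.min(delay * delayMultiplier, maxDelayMillis);
  }
  return delays;
}

console.log(
  retryDelays({ initialDelayMillis: 100, maxDelayMillis: 60000, delayMultiplier: 1.3 }, 5)
);
// [ 100, 130, 169, 220, 286 ]
```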

Message Handling

Messages are acknowledged after successful processing. If any handler throws an error, the message is nacked and will be redelivered according to the subscription's retry policy.

// Handler signature
async function messageHandler(payload, attributes, publishedAt) {
  // payload: The JSON message content
  // attributes: Message attributes (key-value pairs)
  // publishedAt: Date when message was published
  
  // Process the message
  await processData(payload);
  
  // Message will be ack'd automatically after successful processing
  // If an error is thrown, message will be nack'd for redelivery
}
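The chaining behavior described above (handlers run in order, `first: true` prepends, any thrown error nacks the message) can be sketched as follows. This is an illustrative model, not the engine's actual internals:

```javascript
// Sketch of chained subscription handlers: run in registration order,
// ack when all succeed, nack (for redelivery) when any handler throws.
function buildHandlerChain() {
  const handlers = [];
  return {
    add(handler, { first = false } = {}) {
      if (first) handlers.unshift(handler);
      else handlers.push(handler);
    },
    async dispatch(payload, attributes, publishedAt) {
      try {
        for (const handler of handlers) {
          await handler(payload, attributes, publishedAt);
        }
        return 'ack'; // all handlers succeeded
      } catch (err) {
        return 'nack'; // any failure triggers redelivery
      }
    }
  };
}
```

A handler registered with `first: true` runs before previously registered ones, which mirrors the `first` option of `addSubscriber`.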

Best Practices

  1. Flow Control: Configure flowControl based on your service's memory and processing capacity
  2. Batch Settings: Tune batching for optimal throughput vs. latency tradeoff
  3. Retry Policy: Use exponential backoff to handle transient failures gracefully
  4. Dead Letter Topics: Configure dead letter topics on your subscriptions in GCP Console for failed messages
  5. Exactly-Once Delivery: Enable exactly-once delivery in GCP Console when creating/updating your subscription (requires ackWithResponse() in handlers)
  6. Message Ordering: Only enable when strict ordering is required (reduces throughput)
  7. ACK Deadline: Set based on your handler's processing time (default: 60s)
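The retry settings shown in the publisher configuration (initialDelayMillis: 100, delayMultiplier: 1.3, maxDelayMillis: 60000) imply the delay progression sketched below. A minimal illustration, not the client library's internals:

```javascript
// Sketch: exponential backoff implied by the retry settings above
// (initial 100ms, multiplier 1.3, capped at 60s).
function retryDelayMillis(attempt, { initial = 100, multiplier = 1.3, max = 60000 } = {}) {
  // attempt 0 -> initial delay; each subsequent attempt multiplies it
  return Math.min(initial * Math.pow(multiplier, attempt), max);
}

console.log(retryDelayMillis(0)); // 100
console.log(retryDelayMillis(5)); // ~371
```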

Environment Variables

Pub/Sub authentication uses Google Cloud Application Default Credentials. Set GOOGLE_APPLICATION_CREDENTIALS to your service account key path, or use Workload Identity in GKE.


PushNotification

Communication interface with the push notification service using Pub/Sub.

import { PushNotification } from 'node-server-engine';

// Initialize with optimized settings for push notifications
// Must be called before sending notifications
await PushNotification.init();

/**
 * Send a push notification through the push service
 * @param {string} userId - ID of the user that should receive the notification
 * @param {Object} notification - Notification that should be sent
 * @throws {EngineError} If userId is missing or publish fails
 */
await PushNotification.sendPush(userId, {
  title: 'Hello!',
  body: 'You have a new message',
  payload: { type: 'message', messageId: '123' },
  priority: true,
  ttl: 3600 // Time to live in seconds
});

Notification Options

{
  title?: string;              // Notification title
  body?: string;               // Notification body text
  payload?: unknown;           // Custom data payload
  voip?: boolean;             // VOIP notification (iOS)
  background?: boolean;       // Background/data-only notification
  token?: string;             // Specific device token (optional)
  mutable?: boolean;          // Content can be mutated by client (iOS)
  contentAvailable?: boolean; // Requires client-side processing (iOS)
  ttl?: number;               // Time to live in seconds
  priority?: boolean;         // High priority notification
  collapseId?: string;        // Group notifications with same ID
}

Environment Variables

Variable Description Required
PUBSUB_PUSH_NOTIFICATION_QUEUE_TOPIC Pub/Sub topic name for push notifications

Configuration

The PushNotification entity is pre-configured with optimal settings:

  • Batching: Up to 100 messages, 1MB max, 50ms delay for fast delivery
  • Retry: Exponential backoff with 100ms initial delay, up to 60s max delay
  • No Ordering: Push notifications don't require strict ordering for better throughput

Integration

Your omg-notification-service (or similar) should subscribe to the configured topic to process and deliver push notifications to devices via FCM, APNs, or other push providers.


Localizator

The localizator exposes localization related utilities.

import { Localizator } from 'node-server-engine';

// init() must be called first to load the localization data
// After that, the localizator regularly synchronizes new data on its own, without further calls
await Localizator.init();

// Get the different ways of displaying a user's name
const { regular, formal, casual } = await Localizator.getDisplayNames(
  firstName,
  lastName,
  locale
);

// This should be called when the program shuts down.
await Localizator.shutdown();
env description default
LOCALES_URL Base URL where the locales data is stored required

ElasticSearch

The ElasticSearch entity provides a managed client for Elasticsearch with automatic migrations, connection management, and TLS support.

Based on @elastic/elasticsearch v9.2.0.

import { ElasticSearch } from 'node-server-engine';

// Initialize (runs migrations automatically)
await ElasticSearch.init();

// Get the client for operations
const client = ElasticSearch.getClient();

// Use Elasticsearch operations
await client.index({
  index: 'products',
  id: '123',
  document: { name: 'Product', price: 99.99 }
});

const result = await client.search({
  index: 'products',
  query: { match: { name: 'Product' } }
});

// Shutdown when done
await ElasticSearch.shutdown();

Features

  • Automatic Migrations: Tracks and runs migrations on startup
  • Connection Verification: Pings cluster on initialization
  • TLS Support: Optional SSL/TLS configuration
  • Retry Logic: Built-in retry mechanism with configurable attempts
  • Test Mode: Auto-flushes indices in test environment
  • Error Handling: Detailed error reporting with context

Migration System

Create migration files in your specified migration directory:

// migrations/001-create-products-index.ts
import { Client } from '@elastic/elasticsearch';

export async function migrate(client: Client): Promise<void> {
  await client.indices.create({
    index: 'products',
    settings: {
      number_of_shards: 1,
      number_of_replicas: 1
    },
    mappings: {
      properties: {
        name: { type: 'text' },
        price: { type: 'double' },
        createdAt: { type: 'date' }
      }
    }
  });
}

Migrations are:

  • Run automatically on init()
  • Tracked in the migrations index
  • Executed once per file
  • Run in alphabetical order
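The selection rules above can be modeled with a small function. This is an illustrative sketch, not the engine's actual migration runner:

```javascript
// Sketch: selecting pending migrations — each file runs once, in
// alphabetical order, skipping files already recorded as executed.
function pendingMigrations(allFiles, executed) {
  const done = new Set(executed);
  return [...allFiles].sort().filter((file) => !done.has(file));
}

console.log(pendingMigrations(
  ['002-add-users.ts', '001-create-products-index.ts', '003-reindex.ts'],
  ['001-create-products-index.ts']
));
// → ['002-add-users.ts', '003-reindex.ts']
```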

Environment Variables

Variable Description Required
ELASTIC_SEARCH_HOST Elasticsearch cluster URL
ELASTIC_SEARCH_USERNAME Username for authentication
ELASTIC_SEARCH_PASSWORD Password for authentication
ELASTIC_SEARCH_MIGRATION_PATH Absolute path to migrations directory
TLS_CA TLS certificate authority for SSL/TLS

Configuration

The client is configured with:

  • Max Retries: 3 attempts
  • Request Timeout: 30 seconds
  • Sniff on Start: Disabled (can be enabled for multi-node clusters)
  • TLS: Enabled when TLS_CA is provided

GoogleCloudStorage

The GoogleCloudStorage entity provides a simple, generic wrapper for Google Cloud Storage operations. It handles file uploads, downloads, streaming, and deletion with built-in size validation and automatic file path generation.

Based on @google-cloud/storage v7.14.0.

import { GoogleCloudStorage } from 'node-server-engine';
import fs from 'fs';

// Initialize with configuration
GoogleCloudStorage.init({
  projectId: 'my-project',
  keyFilename: '/path/to/keyfile.json'
});

// Or use environment variables (GC_PROJECT, GOOGLE_APPLICATION_CREDENTIALS)
GoogleCloudStorage.init();

// Upload a file
const fileStream = fs.createReadStream('photo.jpg');
const result = await GoogleCloudStorage.upload(
  fileStream,
  'my-bucket',
  { directory: 'photos', mime: 'image/jpeg' },
  { metadata: { contentType: 'image/jpeg' } },
  { maxSize: '5MB' }
);
console.log(result.name); // photos/uuid.jpeg

// Download file as Buffer
const { data, metadata } = await GoogleCloudStorage.get('my-bucket', 'photos/image.jpg');
console.log(metadata.size); // File size in bytes
fs.writeFileSync('downloaded.jpg', data);

// Stream a file (for large files)
const { stream, metadata } = await GoogleCloudStorage.download('my-bucket', 'videos/video.mp4');
stream.pipe(res); // Stream to HTTP response

// Get file stream directly (fastest, no metadata)
const stream = GoogleCloudStorage.getFileStream('my-bucket', 'audio/song.mp3');
stream.pipe(res);

// Delete a file
await GoogleCloudStorage.delete('my-bucket', 'temp/old-file.txt');

// Generate unique file paths
const path = GoogleCloudStorage.generateFileDestination({
  directory: 'uploads/images',
  mime: 'image/jpeg'
});
// Result: 'uploads/images/a1b2c3d4-e5f6-7890-abcd-ef1234567890.jpeg'

Features

  • Auto-initialization: Automatically initializes on first use if not manually initialized
  • Flexible Configuration: Supports config object, environment variables, or credentials
  • File Upload: Stream-based uploads with size validation
  • Multiple Download Methods: Full download, streaming, or direct stream access
  • Path Generation: Automatic UUID-based file naming with directory and extension support
  • Size Validation: Built-in file size limits with human-readable formats (5MB, 100KB, etc.)
  • Error Handling: Detailed error reporting with context
  • No Project Dependencies: Generic implementation works with any Google Cloud Storage bucket
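A sketch of how human-readable size limits such as '5MB' or '100KB' might be converted to byte counts. The exact unit semantics (binary, 1024-based) are an assumption for illustration, not taken from the engine's source:

```javascript
// Sketch: parse a human-readable size limit into bytes.
// Assumes binary units (1 KB = 1024 bytes) for illustration.
function parseSize(size) {
  const match = /^(\d+(?:\.\d+)?)\s*(B|KB|MB|GB)$/i.exec(size.trim());
  if (!match) throw new Error(`Unparseable size: ${size}`);
  const units = { B: 1, KB: 1024, MB: 1024 ** 2, GB: 1024 ** 3 };
  return Number(match[1]) * units[match[2].toUpperCase()];
}

console.log(parseSize('5MB'));   // 5242880
console.log(parseSize('100KB')); // 102400
```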

API Methods

init(config?)

Initialize Google Cloud Storage with configuration. Optional - will auto-initialize with environment variables if not called.

GoogleCloudStorage.init({
  projectId: 'my-project-id',
  keyFilename: '/path/to/service-account-key.json',
  // Or use credentials directly
  credentials: {
    client_email: 'service@project.iam.gserviceaccount.com',
    private_key: '-----BEGIN PRIVATE KEY-----\n...'
  },
  // For local emulator
  apiEndpoint: 'http://localhost:9000'
});

upload(stream, bucket, destinationOptions?, storageOptions?, uploaderOptions?)

Upload a file to Google Cloud Storage.

Parameters:

  • stream (Readable): Node.js readable stream of file content
  • bucket (string): Bucket name
  • destinationOptions (object, optional):
    • directory (string): Subdirectory path (e.g., 'uploads/images')
    • fileName (string): Specific filename (if not provided, generates UUID)
    • mime (string): MIME type for extension detection
    • noExtension (boolean): Don't append file extension
  • storageOptions (object, optional): Google Cloud Storage write stream options
  • uploaderOptions (object, optional):
    • maxSize (string): Maximum file size (e.g., '5MB', '100KB', '1GB')

Returns: Promise<StorageUploadedFile> - Metadata of uploaded file

Example:

const fileStream = fs.createReadStream('document.pdf');
const result = await GoogleCloudStorage.upload(
  fileStream,
  'documents-bucket',
  { directory: 'legal/contracts', mime: 'application/pdf' },
  { metadata: { contentType: 'application/pdf' } },
  { maxSize: '10MB' }
);

get(bucket, path)

Download a file and return its content as a Buffer along with metadata.

Parameters:

  • bucket (string): Bucket name
  • path (string): File path in the bucket

Returns: Promise<{data: Buffer, metadata: StorageUploadedFile}>

Example:

const { data, metadata } = await GoogleCloudStorage.get('my-bucket', 'photos/image.jpg');
console.log(metadata.contentType); // 'image/jpeg'
console.log(data.length); // File size in bytes

download(bucket, path)

Get a readable stream for a file along with its metadata. Use this for large files or when you need to stream content.

Parameters:

  • bucket (string): Bucket name
  • path (string): File path in the bucket

Returns: Promise<{stream: Readable, metadata: StorageUploadedFile}>

Example:

const { stream, metadata } = await GoogleCloudStorage.download('my-bucket', 'videos/large-video.mp4');
console.log(metadata.size); // File size
stream.pipe(response); // Stream to HTTP response

getFileStream(bucket, path)

Get a readable stream for a file without fetching metadata. Fastest option when metadata is not needed.

Parameters:

  • bucket (string): Bucket name
  • path (string): File path in the bucket

Returns: Readable - Node.js readable stream

Example:

const stream = GoogleCloudStorage.getFileStream('my-bucket', 'audio/song.mp3');
stream.pipe(response); // Direct streaming

delete(bucket, path)

Delete a file from Google Cloud Storage.

Parameters:

  • bucket (string): Bucket name
  • path (string): File path in the bucket

Returns: Promise<void>

Example:

await GoogleCloudStorage.delete('my-bucket', 'temp/old-file.txt');

generateFileDestination(options?)

Generate a unique file path with optional directory and extension.

Parameters:

  • options (object, optional):
    • directory (string): Subdirectory path
    • mime (string): MIME type for extension detection
    • noExtension (boolean): Don't append extension

Returns: string - Generated file path

Examples:

// UUID only
GoogleCloudStorage.generateFileDestination();
// → 'a1b2c3d4-e5f6-7890-abcd-ef1234567890'

// With directory and MIME type
GoogleCloudStorage.generateFileDestination({
  directory: 'uploads/images',
  mime: 'image/jpeg'
});
// → 'uploads/images/a1b2c3d4-e5f6-7890-abcd-ef1234567890.jpeg'

// Without extension
GoogleCloudStorage.generateFileDestination({
  directory: 'data',
  noExtension: true
});
// → 'data/a1b2c3d4-e5f6-7890-abcd-ef1234567890'

Environment Variables

Variable Description Required
GC_PROJECT Google Cloud Project ID ✗*
GOOGLE_APPLICATION_CREDENTIALS Path to service account key file ✗*

* Not required if you call init() with a config object

Error Handling

The entity throws WebError with appropriate status codes:

  • 413 (Payload Too Large): File exceeds maxSize limit
  • Other errors are passed through from Google Cloud Storage SDK

Example:

try {
  await GoogleCloudStorage.upload(stream, 'bucket', {}, {}, { maxSize: '1MB' });
} catch (error) {
  if (error.statusCode === 413) {
    console.log('File too large!');
  }
}

Usage in Template Projects

In your node-server-template endpoints:

import { GoogleCloudStorage, Endpoint, Method } from 'node-server-engine';
import { Readable } from 'stream';

new Endpoint({
  path: '/upload',
  method: Method.POST,
  files: [{ key: 'file', maxSize: '5MB', required: true }],
  async handler(req, res) {
    const file = req.files[0];
    const stream = Readable.from(file.buffer);
    
    const result = await GoogleCloudStorage.upload(
      stream,
      process.env.UPLOAD_BUCKET,
      { directory: 'user-uploads', mime: file.mimetype }
    );
    
    res.json({ path: result.name, url: result.mediaLink });
  }
});

Translation Manager

The translation manager exposes translation related utilities.

import { TranslationManager } from 'node-server-engine';

// init() must be called first to load the translation data
// After that, the translation manager regularly synchronizes new data on its own, without further calls
await TranslationManager.init();

// Translate a string for a given locale
const translatedString = await TranslationManager.translate(
  lang,
  key,
  variables,
  tags
);

// Example
const translatedString = await TranslationManager.translate(
  'zh-TW',
  'email.invitation.body',
  { name: 'John' },
  { link: ['a', 'href="https://www.test.com"'] }
);

// This should be called when the program shuts down.
await TranslationManager.shutdown();
  • lang [String]: Locale for which the translation should be fetched (if no data is found, translation will be returned in en-US).
  • key [String]: Translation key
  • variables [Object]: A key=>value mapping for variable interpolation in strings. (Optional)
  • tags [Object]: A key=>value mapping describing markup tags used to wrap portions of the string, as in the example above. (Optional)
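To make the variables/tags distinction concrete, here is an illustrative sketch of how such a translation might be rendered. The {variable} and <tag> placeholder syntax is an assumption for illustration only, not the actual template format:

```javascript
// Illustrative sketch (placeholder syntax is an assumption): interpolate
// {variable} tokens, then wrap <tag>...</tag> spans using the tags mapping,
// where each tag maps to [element, attributes] as in the example above.
function applyTranslation(template, variables = {}, tags = {}) {
  let out = template.replace(/\{(\w+)\}/g, (match, key) =>
    key in variables ? String(variables[key]) : match
  );
  for (const [name, [element, attrs]] of Object.entries(tags)) {
    out = out
      .replace(`<${name}>`, attrs ? `<${element} ${attrs}>` : `<${element}>`)
      .replace(`</${name}>`, `</${element}>`);
  }
  return out;
}

console.log(applyTranslation(
  'Hello {name}, <link>click here</link>',
  { name: 'John' },
  { link: ['a', 'href="https://www.test.com"'] }
));
// → 'Hello John, <a href="https://www.test.com">click here</a>'
```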
env description default
LOCALES_URL Base URL where the locales data is stored required

Error Reporting

The server engine standardizes the way errors are handled and reported. The error classes provided by the Server Engine should always be used when throwing an exception.

Errors are a crucial part of the application: they are what helps us properly debug the program and offer support when needed, as well as what exposes issues to the client.

Log Output Formats

The engine automatically adapts log output based on the environment:

Local Development (Readable Format)

  • Colorized output with severity levels
  • Formatted timestamps and file locations
  • Pretty-printed data objects
  • Stack traces with proper formatting
  • HTTP request context when available

Production/GCP (JSON Format)

  • Structured JSON for log aggregation
  • Google Cloud Error Reporting integration
  • Kubernetes pod information
  • Service context metadata

Control Log Format

You can override the automatic detection:

# Force readable local format (useful for local Docker)
LOG_FORMAT=local npm start

# Force JSON format (useful for local testing)
LOG_FORMAT=json npm start

Example Local Output:

[2025-12-10T10:30:45.123Z] ERROR    src/endpoints/users.ts:42 User not found
  Error Code: user-not-found
  Status: 404
Data:
  {
    "userId": "abc-123",
    "requestId": "req-456"
  }

As a standard, the client receives the following body when an error happens.

// HTTP Status code (400 | 500)
{
  errorCode: "some-error", // Machine readable error code
  hint: "The selected user does not exist" // (optional) Hint for developers
}

Status codes should be limited to 400 for client errors and 500 for server errors. Other 4XX status codes should be avoided unless very specific cases (ex: authentication error).

All our custom error classes take a data parameter. This will be logged on the backend and should contain any data that can help to understand the runtime context or the error (ex: user's ID).

Common options

These options are common to each of the error classes described below.

option definition example default
message A message logged on the backend only "Could not find user in the DB" required
severity The level at which this error should be logged Severity.WARNING Severity.CRITICAL
data Some data related to the error that will be logged on the backend. This should help to understand the runtime context. {userId: 'xf563ugh0'}
error An error object, when this is a wrapper around an existing error object

Severity

Severity allows us to order errors by their impact on the program. It is important to set severity correctly: backend logs can include hundreds of entries per second, and severity allows us to filter out the most important errors. An enumeration is exposed that includes all the severity levels, as described in the following table.

Log Levels
severity definition
DEBUG Detailed information of the runtime execution. Used for debugging.
INFO Base information level of runtime information.
WARNING Errors that are expected to happen and do not cause any particular issue. (ex: a client made an unauthenticated request)
CRITICAL Errors that are unexpected and cause an improper behavior. (ex: failed to store some data in the DB)
EMERGENCY Errors that prevent the program from running. (ex: some environment variables are not correctly set)

EngineError

This error class represents errors that happen within the Server Engine. They should be used for server configuration errors or unexpected behaviors. They will always return 500 - {errorCode: 'server-error'}.

import { EngineError, Severity } from 'node-server-engine';

if (!process.env.IMPORTANT_VARIABLE) {
  throw new EngineError(options);
}

Engine Errors strictly follow the common options.

WebError

This error class represents errors that happen at runtime and that need specific reporting to the client. Their definition is more complex, but it includes additional data specific to the client.

import { WebError, Severity } from 'node-server-engine';

const user = await User.findOne({ where: { id: userId } });
if (!user) {
  throw new WebError(options);
}

In addition to the common options, WebError defines some other options specific for error reporting to clients.

option definition example default
errorCode A machine readable error code that will be parsed by the client. unknown-user required
statusCode HTTP status code of the response. 400 500
hint A human readable error message that is intended for developers
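Putting the options together, the mapping from a WebError's options to the standardized client response shown earlier can be sketched as follows (illustrative, not the engine's source):

```javascript
// Sketch: how a WebError's client-facing options map to the standardized
// response body ({ errorCode, hint }) described earlier.
function toClientResponse({ errorCode, statusCode = 500, hint }) {
  const body = { errorCode };
  if (hint !== undefined) body.hint = hint; // hint is optional
  return { statusCode, body };
}

console.log(toClientResponse({
  errorCode: 'unknown-user',
  statusCode: 400,
  hint: 'The selected user does not exist'
}));
```

Note that the backend-only options (message, severity, data) never reach the client; only errorCode, statusCode, and hint do.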

Middlewares

Server engine exposes a bunch of middlewares. These can be imported to your project and used globally or per endpoint.

Swagger Docs

This middleware allows the service to connect with the API Documentation Service. It exposes the necessary endpoint for the documentation of this API to be visible by the Documentation Service.

import { Server, middleware } from 'node-server-engine';

new Server({
  globalMiddleware: [middleware.swaggerDocs()]
});

Structuring Documentation

The underlying system used is the Open API spec. Most of the data is already generated by the documentation service. The only real need is to document endpoints.

Endpoints should be documented in YAML files that respect the **/*.docs.yaml convention. It is recommended to place them in the same directory as the endpoint definition. Endpoint documentation has to follow the path object spec from Open API.

/hello:
  get:
    description: Request the API to say Hello
    responses:
      '200':
        description: Replies hello
        content:
          application/json:
            schema:
              type: object
              properties:
                says:
                  type: string
                  example: Hello

Schemas and Responses

To avoid repeating the same structure across the documentation manually, one can use schemas and responses components.

Some common components are already defined directly in the API Documentation Service, please check its documentation to avoid repeats.

If you ever need to declare custom components, they simply must follow the architecture below.

# Repository root
/src
  /docs
    /responses
      - coolResponse.yaml
    /schemas
      - bestSchema.yaml

Here is an example definition:

Dog:
  type: object
  properties:
    name:
      type: string
      example: Rex
    owner:
      type: string
      example: f1982af0-1579-4c56-a138-de1ab4ff39b3
    isAGoodBoy:
      type: boolean
      example: true
  required:
    - name
    - owner

User Resolver

⚠️ Must be used in combination with AuthType.JWT

Resolves the user making the request with the user resolver. The user's complete data is added to req.user.

import { Endpoint, middleware, AuthType, Method } from 'node-server-engine';

new Endpoint({
  path: '/hello',
  method: Method.GET,
  authType: AuthType.JWT,
  middleware: [middleware.userResolver],
  handler: (req, res) => {
    res.json({ say: `Hello ${req.user.firstName}` });
  }
});

Gemini File Upload

An endpoint can upload a file to Google Gemini AI.

The request must be made as multipart/form-data.

File should be uploaded under the key file.

The file's data will be available at req.body.fileUri, req.body.mimeType, and req.body.originalname.


Check Permission Middleware

⚠️ Must be used in combination with AuthType.JWT

Role-based access control middleware that checks if the authenticated user has at least one of the required permissions. All permission checks are case-insensitive for maximum flexibility.

Features

  • ✅ Single or multiple permission checking
  • ✅ Case-insensitive permission matching
  • ✅ Integration with JWT authentication
  • ✅ Clear error messages for debugging
  • ✅ TypeScript support

Basic Usage

import { Endpoint, middleware, AuthType, Method } from 'node-server-engine';

// Single permission check
new Endpoint({
  path: '/users',
  method: Method.GET,
  authType: AuthType.JWT,
  middleware: [middleware.checkPermission('users:read')],
  handler: (req, res) => {
    res.json({ message: 'User list' });
  }
});

// Multiple permissions (user needs at least ONE)
new Endpoint({
  path: '/admin',
  method: Method.GET,
  authType: AuthType.JWT,
  middleware: [middleware.checkPermission(['admin', 'superuser', 'moderator'])],
  handler: (req, res) => {
    res.json({ message: 'Admin access granted' });
  }
});

User Object Structure

The middleware expects req.user to contain a permissions array:

interface User {
  id: string;
  permissions: string[]; // e.g., ['users:read', 'users:write', 'admin']
}

Examples

// Case-insensitive matching
checkPermission('ADMIN')  // Matches: 'admin', 'Admin', 'ADMIN'
checkPermission(['READ', 'write'])  // Matches any case variation

// Namespace-style permissions
checkPermission('users:read')
checkPermission(['users:write', 'users:delete'])

// Role-based permissions
checkPermission(['admin', 'moderator'])
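The matching rule (case-insensitive, user needs at least one of the required permissions) can be sketched as follows. This is an illustrative model, not the middleware's actual source:

```javascript
// Sketch of the check described above: the user passes if at least one
// required permission matches one of theirs, case-insensitively.
function hasPermission(userPermissions, required) {
  if (!Array.isArray(userPermissions)) return false; // no permissions array -> denied
  const owned = new Set(userPermissions.map((p) => p.toLowerCase()));
  const needed = Array.isArray(required) ? required : [required];
  return needed.some((p) => owned.has(p.toLowerCase()));
}

console.log(hasPermission(['users:read', 'Admin'], 'ADMIN'));       // true
console.log(hasPermission(['users:read'], ['admin', 'moderator'])); // false
```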

Response Codes

  • 200: Permission granted, request proceeds
  • 403: Permission denied
    • No user authenticated
    • User has no permissions array
    • User lacks required permission(s)

Error Responses

{
  "message": "User does not have permissions"
}
{
  "message": "Permission denied"
}



Verification Token Middleware

⚠️ Must be used in combination with AuthType.JWT

Generates a short-lived verification flow for sensitive operations (for example: delete account, change email, payout). The middleware verifies a signed verification token and the OTP provided by the user, and binds it to a specific action.

Flow Summary

  1. Generate a verification token + OTP using the utility (server-side).
  2. Send the OTP to the user (SMS/WhatsApp/email).
  3. Call the sensitive endpoint with both x-verification-token and x-verification-otp.

Middleware Usage

import { Endpoint, middleware, AuthType, Method } from 'node-server-engine';

new Endpoint({
  path: '/users/:id',
  method: Method.DELETE,
  authType: AuthType.JWT,
  middleware: [middleware.verificationToken('DELETE ACCOUNT', { requireSubject: true })],
  handler: (req, res) => {
    res.json({ ok: true });
  }
});

Defaults

  • Token header: x-verification-token
  • OTP header: x-verification-otp
  • Body/query fields: verificationToken, verificationOtp

Utilities

The server engine ships with a handful of utility functions that are commonly used by servers

Request

This function is a wrapper around axios. It adds proper error handling for reporting in the logs when a network call fails. It should be used for any requests made by a service. Refer to the axios documentation for the request configuration.

import { request } from 'node-server-engine';
const { data } = await request({
  method: 'get',
  url: 'https://www.google.com'
});

TLS Request

This function is a wrapper around request. It adds the necessary configuration to easily make requests with TLS. The settings used are based on environment variables. It will use a request-specific certificate/key if defined, or fall back to the ones used by the server.

It is necessary to use this function when calling other services in the cluster. Requests could fail otherwise as the common CA is not set and the client certificate not exposed.

import { tlsRequest } from 'node-server-engine';
const { data } = await tlsRequest({
  method: 'get',
  url: 'https://www.google.com'
});

TLS Config

The server's TLS configuration can be fetched as an object. Alternatively, the server engine also exposes an HTTPS Agent. They will not be available at startup, so it is important that the function calling them loads them first if they are not present.

import {tlsConfig, httpsAgent, loadTlsConfig} from 'node-server-engine';
import https from 'https';

// TLS Config
if(!tlsConfig) loadTlsConfig();
const customAgent = new https.Agent({
  key: tlsConfig.key,
  cert: tlsConfig.cert,
  ca: tlsConfig.ca,
  passphrase: tlsConfig.passphrase,
});

// HTTPS Agent
if(!httpsAgent) loadTlsConfig();
https.request('https://www.google.com', { agent: httpsAgent });

Send Push Notification

Send a push notification through the push service.

import { sendPush } from 'node-server-engine';
await sendPush(userId, notification);
parameter description
userId ID of the user that should receive the notification
notification Content of the notification as defined by the push service documentation

Send Email

Send an email using the SMTP configuration.

import { sendEmail } from 'node-server-engine';

const emailOptions = {
  to: 'recipient@example.com',
  subject: 'Welcome to our service!',
  text: 'Hello, welcome to our platform!',
  html: '<h1>Welcome</h1><p>Glad to have you onboard!</p>',
  attachments: [
    {
      filename: 'welcome.txt',
      content: 'Welcome to our service!'
    }
  ]
};

const result = await sendEmail(emailOptions);
console.log(result.status); // 'sent', 'delivered', 'queued', or 'failed'

Parameters

Parameter Description
from (Optional) Sender's email address. Defaults to the authenticated email.
to Email recipient(s) as a string or array of strings.
cc (Optional) Carbon Copy recipients (string or array).
bcc (Optional) Blind Carbon Copy recipients (string or array).
subject Subject of the email.
text (Optional) Plain text version of the email body.
html (Optional) HTML version of the email body.
attachments (Optional) Array of attachments. Each attachment can include filename, content, path, and other properties.
replyTo (Optional) Email address for replies.
headers (Optional) Custom email headers.
priority (Optional) Email priority (high, normal, or low).

Return Status

The function returns an object with the following status options:

Status Description
sent Email was successfully sent but delivery confirmation is not available.
delivered Email was successfully delivered.
queued Email is queued for delivery but not yet sent.
failed Email could not be sent due to an error.

Verification Token

Generate and verify a short-lived verification token + OTP for sensitive operations.

import { createVerificationToken, verifyVerificationToken } from 'node-server-engine';

// Generate token + OTP
const { token, otp, expiresAt } = createVerificationToken({
  action: 'DELETE ACCOUNT',
  subject: userId,
  otpLength: 6,
  expiresInSeconds: 300
});

// Later, verify token + OTP
const payload = verifyVerificationToken(token, {
  action: 'DELETE ACCOUNT',
  otp,
  subject: userId
});

Environment Variables

  • VERIFICATION_TOKEN_SECRET (required)
  • VERIFICATION_TOKEN_OTP_SECRET (optional, defaults to VERIFICATION_TOKEN_SECRET)
  • VERIFICATION_TOKEN_ISSUER (optional, default: node-server-engine)
  • VERIFICATION_TOKEN_AUDIENCE (optional, used when audience is not provided)

Multi-service setup

If you generate verification tokens in one service and verify them in another, all services must share the same verification secrets for the same environment:

  • Use the same VERIFICATION_TOKEN_SECRET in every service.
  • If you set VERIFICATION_TOKEN_OTP_SECRET, use the same value everywhere.
  • Keep VERIFICATION_TOKEN_ISSUER and VERIFICATION_TOKEN_AUDIENCE consistent across services.

Gemini File Upload

Upload a file to Google Gemini AI.

import fs from 'fs';
import { geminiFileUpload } from 'node-server-engine';

const fileBuffer = fs.readFileSync('example.pdf');
const mimeType = 'application/pdf';
const originalName = 'example.pdf';

const result = await geminiFileUpload(fileBuffer, mimeType, originalName);

if (result.success) {
  console.log('File uploaded successfully:', result.fileUri);
} else {
  console.error('File upload failed:', result.error);
}

Parameters

Parameter Type Description
buffer Buffer The file content in buffer format.
mimeType string The MIME type of the file (e.g., image/png, application/pdf).
originalName string The original filename, including the extension.

Response

The function returns an object with one of the following structures:

Success Response

{
  "success": true,
  "originalname": "example.pdf",
  "fileUri": "https://gemini.googleapis.com/file/xyz123",
  "mimeType": "application/pdf"
}

Failure Response

{
  "success": false,
  "error": "Error message"
}

Return Fields

Field Type Description
success boolean Indicates whether the upload was successful.
originalname string The name of the uploaded file.
fileUri string The URI of the uploaded file on Google Gemini AI.
mimeType string The MIME type of the uploaded file.
error any Present only if success is false. Contains the error details.

Error Handling

  • If the GOOGLE_AI_KEY environment variable is missing, the function throws an error.
  • If the upload fails or the file processing does not complete successfully, an error response is returned.
  • Temporary files are cleaned up after the upload process to prevent storage issues.

Filter

Apply a filter on an object. It returns a copy of the object that only holds the whitelisted keys.

This is particularly useful to sanitize objects before returning them to clients.

import { filter } from 'node-server-engine';

const object = { a: 'kept', b: 'not kept' };
const whitelist = ['a'];

const result = filter(object, whitelist);
// result = { a: 'kept' }
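Conceptually, such a filter is a shallow copy restricted to the whitelisted keys. This is a hypothetical sketch of that idea, not the engine's actual source:

```javascript
// Hypothetical sketch of a whitelist filter: copy only the allowed keys.
function filter(object, whitelist) {
  const result = {};
  for (const key of whitelist) {
    // Skip keys the object does not actually own, so the result never
    // contains `undefined` entries for missing whitelist keys.
    if (Object.prototype.hasOwnProperty.call(object, key)) {
      result[key] = object[key];
    }
  }
  return result;
}

// Typical use: strip sensitive fields before returning a record to a client.
const user = { id: 1, email: 'a@b.c', passwordHash: 'secret' };
const safe = filter(user, ['id', 'email']);
// safe = { id: 1, email: 'a@b.c' } — passwordHash is stripped
```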

Database Migration

Some utilities are exposed to handle database migrations.

runPendingMigrations will execute all the migration scripts that have not yet been executed.

rollbackMigrations will rollback all the migrations that have been executed with the current version of the app and after.

import { runPendingMigrations, rollbackMigrations } from 'node-server-engine';

await runPendingMigrations();

await rollbackMigrations();
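A common pattern is to run pending migrations before the server accepts traffic and abort startup if they fail. The sketch below stubs the engine call with a local placeholder so the control flow is clear in isolation; in a real service `runPendingMigrations` would be imported from node-server-engine:

```javascript
// Migrate-on-boot sketch. The stub below stands in for the engine's
// runPendingMigrations; it is NOT the real implementation.
async function runPendingMigrations() {
  /* apply all migration scripts that have not yet been executed */
}

async function bootstrap(startServer) {
  try {
    await runPendingMigrations(); // bring the schema up to date first
    await startServer();          // only then begin accepting traffic
  } catch (error) {
    // Refuse to serve requests against a half-migrated schema.
    console.error('Migration failed, refusing to start:', error);
    process.exitCode = 1;
  }
}

bootstrap(async () => console.log('server started'));
```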

Environment Variables Verification

Environment variables verification can be done through the Server's checkEnvironment setting. It is an object defining how environment variables should be verified.

import { envAssert } from 'node-server-engine';

export const checkEnvironment = {
  ENV_VAR: envAssert.isString()
};
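Under the hood, a check like this amounts to reading process.env at startup and failing fast when a variable is missing or malformed. The following is a minimal self-contained sketch of that idea, not the engine's implementation; the rule shapes and names here are illustrative only:

```javascript
// Fail-fast environment validation sketch. Each rule maps a variable
// name to a predicate; any failing variable aborts startup with a list
// of everything that was wrong, instead of crashing later mid-request.
function checkEnvironment(rules, env = process.env) {
  const failures = Object.entries(rules)
    .filter(([, isValid]) => false === undefined) // placeholder removed below
    .map(([name]) => name);
  // (re-computed properly:)
  failures.length = 0;
  for (const [name, isValid] of Object.entries(rules)) {
    if (!isValid(env[name])) failures.push(name);
  }
  if (failures.length > 0) {
    throw new Error(`Invalid environment variables: ${failures.join(', ')}`);
  }
}

checkEnvironment(
  {
    PORT: (v) => /^\d+$/.test(v ?? ''),                  // numeric string
    HOST: (v) => typeof v === 'string' && v.length > 0   // non-empty string
  },
  { PORT: '8080', HOST: 'localhost' } // passes; omit HOST and it throws
);
```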

Assertions Available

The server engine exposes a utility for environment variable assertions called envAssert. The following table lists the available assertions.

Validator Description
isAfter([date]) check if the string is a date that's after the specified date (defaults to now).
isAlpha([locale, options]) check if the string contains only letters (a-zA-Z).

Locale is one of ['ar', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-LY', 'ar-MA', 'ar-QA', 'ar-QM', 'ar-SA', 'ar-SD', 'ar-SY', 'ar-TN', 'ar-YE', 'bg-BG', 'cs-CZ', 'da-DK', 'de-DE', 'el-GR', 'en-AU', 'en-GB', 'en-HK', 'en-IN', 'en-NZ', 'en-US', 'en-ZA', 'en-ZM', 'es-ES', 'fr-FR', 'fa-IR', 'he', 'hu-HU', 'it-IT', 'ku-IQ', 'nb-NO', 'nl-NL', 'nn-NO', 'pl-PL', 'pt-BR', 'pt-PT', 'ru-RU', 'sl-SI', 'sk-SK', 'sr-RS', 'sr-RS@latin', 'sv-SE', 'tr-TR', 'uk-UA']) and defaults to en-US. Locale list is validator.isAlphaLocales. options is an optional object that can be supplied with the following key(s): ignore which can either be a String or RegExp of characters to be ignored e.g. " -" will ignore spaces and -'s.
isAlphanumeric([locale]) check if the string contains only letters and numbers.

Locale is one of ['ar', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-LY', 'ar-MA', 'ar-QA', 'ar-QM', 'ar-SA', 'ar-SD', 'ar-SY', 'ar-TN', 'ar-YE', 'bg-BG', 'cs-CZ', 'da-DK', 'de-DE', 'el-GR', 'en-AU', 'en-GB', 'en-HK', 'en-IN', 'en-NZ', 'en-US', 'en-ZA', 'en-ZM', 'es-ES', 'fr-FR', 'fa-IR', 'he', 'hu-HU', 'it-IT', 'ku-IQ', 'nb-NO', 'nl-NL', 'nn-NO', 'pl-PL', 'pt-BR', 'pt-PT', 'ru-RU', 'sl-SI', 'sk-SK', 'sr-RS', 'sr-RS@latin', 'sv-SE', 'tr-TR', 'uk-UA']) and defaults to en-US. Locale list is validator.isAlphanumericLocales.
isAscii() check if the string contains ASCII chars only.
isBase32() check if a string is base32 encoded.
isBase58() check if a string is base58 encoded.
isBase64([options]) check if a string is base64 encoded. options is optional and defaults to {urlSafe: false}; when urlSafe is true it tests that the given base64 encoded string is url safe.
isBefore([date]) check if the string is a date that's before the specified date.
isBIC() check if a string is a BIC (Bank Identification Code) or SWIFT code.
isBoolean() check if a string is a boolean.
isBtcAddress() check if the string is a valid BTC address.
isByteLength([options]) check if the string's length (in UTF-8 bytes) falls in a range.

options is an object which defaults to {min:0, max: undefined}.
isCreditCard() check if the string is a credit card.
isCurrency([options]) check if the string is a valid currency amount.

options is an object which defaults to {symbol: '$', require_symbol: false, allow_space_after_symbol: false, symbol_after_digits: false, allow_negatives: true, parens_for_negatives: false, negative_sign_before_digits: false, negative_sign_after_digits: false, allow_negative_sign_placeholder: false, thousands_separator: ',', decimal_separator: '.', allow_decimal: true, require_decimal: false, digits_after_decimal: [2], allow_space_after_digits: false}.
Note: The array digits_after_decimal is filled with the exact number of digits allowed not a range, for example a range 1 to 3 will be given as [1, 2, 3].
isDataURI() check if the string is a data uri format.
isDate(input [, options]) Check if the input is a valid date. e.g. [2002-07-15, new Date()].

options is an object which can contain the keys format, strictMode and/or delimiters

format is a string and defaults to YYYY/MM/DD.

strictMode is a boolean and defaults to false. If strictMode is set to true, the validator will reject inputs different from format.

delimiters is an array of allowed date delimiters and defaults to ['/', '-'].
isDecimal([options]) check if the string represents a decimal number, such as 0.1, .3, 1.1, 1.00003, 4.0, etc.

options is an object which defaults to {force_decimal: false, decimal_digits: '1,', locale: 'en-US'}

locale determine the decimal separator and is one of ['ar', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-LY', 'ar-MA', 'ar-QA', 'ar-QM', 'ar-SA', 'ar-SD', 'ar-SY', 'ar-TN', 'ar-YE', 'bg-BG', 'cs-CZ', 'da-DK', 'de-DE', 'el-GR', 'en-AU', 'en-GB', 'en-HK', 'en-IN', 'en-NZ', 'en-US', 'en-ZA', 'en-ZM', 'es-ES', 'fa', 'fa-AF', 'fa-IR', 'fr-FR', 'hu-HU', 'id-ID', 'it-IT', 'ku-IQ', 'nb-NO', 'nl-NL', 'nn-NO', 'pl-PL', 'pl-Pl', 'pt-BR', 'pt-PT', 'ru-RU', 'sl-SI', 'sr-RS', 'sr-RS@latin', 'sv-SE', 'tr-TR', 'uk-UA', 'vi-VN'].
Note: decimal_digits is given as a range like '1,3', a specific value like '3' or min like '1,'.
isDivisibleBy(number) check if the string is a number that's divisible by another.
isEAN() check if the string is an EAN (European Article Number).
isEmail([options]) check if the string is an email.

options is an object which defaults to { allow_display_name: false, require_display_name: false, allow_utf8_local_part: true, require_tld: true, allow_ip_domain: false, domain_specific_validation: false, blacklisted_chars: '' }. If allow_display_name is set to true, the validator will also match Display Name <email-address>. If require_display_name is set to true, the validator will reject strings without the format Display Name <email-address>. If allow_utf8_local_part is set to false, the validator will not allow any non-English UTF8 character in email address' local part. If require_tld is set to false, e-mail addresses without having TLD in their domain will also be matched. If ignore_max_length is set to true, the validator will not check for the standard max length of an email. If allow_ip_domain is set to true, the validator will allow IP addresses in the host part. If domain_specific_validation is true, some additional validation will be enabled, e.g. disallowing certain syntactically valid email addresses that are rejected by GMail. If blacklisted_chars receives a string, then the validator will reject emails that include any of the characters in the string, in the name part.
isEmpty([options]) check if the string has a length of zero.

options is an object which defaults to { ignore_whitespace:false }.
isEthereumAddress() check if the string is an Ethereum address using basic regex. Does not validate address checksums.
isFloat([options]) check if the string is a float.

options is an object which can contain the keys min, max, gt, and/or lt to validate the float is within boundaries (e.g. { min: 7.22, max: 9.55 }) it also has locale as an option.

min and max are equivalent to 'greater or equal' and 'less or equal', respectively while gt and lt are their strict counterparts.

locale determine the decimal separator and is one of ['ar', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-LY', 'ar-MA', 'ar-QA', 'ar-QM', 'ar-SA', 'ar-SD', 'ar-SY', 'ar-TN', 'ar-YE', 'bg-BG', 'cs-CZ', 'da-DK', 'de-DE', 'en-AU', 'en-GB', 'en-HK', 'en-IN', 'en-NZ', 'en-US', 'en-ZA', 'en-ZM', 'es-ES', 'fr-FR', 'hu-HU', 'it-IT', 'nb-NO', 'nl-NL', 'nn-NO', 'pl-PL', 'pt-BR', 'pt-PT', 'ru-RU', 'sl-SI', 'sr-RS', 'sr-RS@latin', 'sv-SE', 'tr-TR', 'uk-UA']. Locale list is validator.isFloatLocales.
isFQDN([options]) check if the string is a fully qualified domain name (e.g. domain.com).

options is an object which defaults to { require_tld: true, allow_underscores: false, allow_trailing_dot: false , allow_numeric_tld: false }.
isFullWidth() check if the string contains any full-width chars.
isHalfWidth() check if the string contains any half-width chars.
isHash(algorithm) check if the string is a hash of type algorithm.

Algorithm is one of ['md4', 'md5', 'sha1', 'sha256', 'sha384', 'sha512', 'ripemd128', 'ripemd160', 'tiger128', 'tiger160', 'tiger192', 'crc32', 'crc32b']
isHexadecimal() check if the string is a hexadecimal number.
isHexColor() check if the string is a hexadecimal color.
isHost() check if the string is a server host name.
isHostList() check if the string is a comma-separated list of server host names.
isHSL() check if the string is an HSL (hue, saturation, lightness, optional alpha) color based on CSS Colors Level 4 specification.

Comma-separated format supported. Space-separated format supported with the exception of a few edge cases (ex: hsl(200grad+.1%62%/1)).
isIBAN() check if a string is an IBAN (International Bank Account Number).
isIdentityCard([locale]) check if the string is a valid identity card code.

locale is one of ['ES', 'IN', 'IT', 'NO', 'zh-TW', 'he-IL', 'ar-TN', 'zh-CN'] OR 'any'. If 'any' is used, the function will check if any of the locales match.

Defaults to 'any'.
isIMEI([options]) check if the string is a valid IMEI number. IMEI should be of format ############### or ##-######-######-#.

options is an object which can contain the key allow_hyphens. Defaults to the first format. If allow_hyphens is set to true, the validator will validate the second format.
isIn(values) check if the string is in an array of allowed values.
isInt([options]) check if the string is an integer.

options is an object which can contain the keys min and/or max to check the integer is within boundaries (e.g. { min: 10, max: 99 }). options can also contain the key allow_leading_zeroes, which when set to false will disallow integer values with leading zeroes (e.g. { allow_leading_zeroes: false }). Finally, options can contain the keys gt and/or lt which will enforce integers being greater than or less than, respectively, the value provided (e.g. {gt: 1, lt: 4} for a number between 1 and 4).
isIP([version]) check if the string is an IP (version 4 or 6).
isIPList() check if the string is a comma separated list of IP addresses (version 4 or 6).
isIPRange() check if the string is an IP range (version 4 only).
isISBN([version]) check if the string is an ISBN (version 10 or 13).
isISIN() check if the string is an ISIN (stock/security identifier).
isISO8601() check if the string is a valid ISO 8601 date; for additional checks for valid dates, e.g. invalidates dates like 2009-02-29, pass options object as a second parameter with options.strict = true.
isISO31661Alpha2() check if the string is a valid ISO 3166-1 alpha-2 officially assigned country code.
isISO31661Alpha3() check if the string is a valid ISO 3166-1 alpha-3 officially assigned country code.
isISRC() check if the string is a ISRC.
isISSN([options]) check if the string is an ISSN.

options is an object which defaults to { case_sensitive: false, require_hyphen: false }. If case_sensitive is true, ISSNs with a lowercase 'x' as the check digit are rejected.
isJSON([options]) check if the string is valid JSON (note: uses JSON.parse).

options is an object which defaults to { allow_primitives: false }. If allow_primitives is true, the primitives 'true', 'false' and 'null' are accepted as valid JSON values.
isJWT() check if the string is valid JWT token.
isLatLong([options]) check if the string is a valid latitude-longitude coordinate in the format lat,long or lat, long.

options is an object that defaults to { checkDMS: false }. Pass checkDMS as true to validate DMS(degrees, minutes, and seconds) latitude-longitude format.
isLength([options]) check if the string's length falls in a range.

options is an object which defaults to {min:0, max: undefined}. Note: this function takes into account surrogate pairs.
isLocale() check if the string is a locale.
isLowercase() check if the string is lowercase.
isMACAddress([options]) check if the string is a MAC address.

options is an object which defaults to {no_colons: false}. If no_colons is true, the validator will allow MAC addresses without the colons. Also, it allows the use of hyphens, spaces or dots e.g '01 02 03 04 05 ab', '01-02-03-04-05-ab' or '0102.0304.05ab'.
isMagnetURI() check if the string is a magnet uri format.
isMD5() check if the string is a MD5 hash.

Please note that you can also use the isHash('md5') function. Keep in mind that MD5 has some collision weaknesses compared to other algorithms (e.g., SHA).
isMimeType() check if the string matches a valid MIME type format.
isMobilePhone([locale [, options]]) check if the string is a mobile phone number,

(locale is either an array of locales (e.g ['sk-SK', 'sr-RS']) OR one of ['am-Am', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-MA', 'ar-SA', 'ar-SY', 'ar-TN', 'az-AZ', 'az-LY', 'az-LB', 'bs-BA', 'be-BY', 'bg-BG', 'bn-BD', 'ca-AD', 'cs-CZ', 'da-DK', 'de-DE', 'de-AT', 'de-CH', 'de-LU', 'el-GR', 'en-AU', 'en-CA', 'en-GB', 'en-GG', 'en-GH', 'en-HK', 'en-MO', 'en-IE', 'en-IN', 'en-KE', 'en-MT', 'en-MU', 'en-NG', 'en-NZ', 'en-PK', 'en-PH', 'en-RW', 'en-SG', 'en-SL', 'en-UG', 'en-US', 'en-TZ', 'en-ZA', 'en-ZM', 'en-ZW', 'es-AR', 'es-BO', 'es-CL', 'es-CO', 'es-CR', 'es-DO', 'es-HN', 'es-PE', 'es-EC', 'es-ES', 'es-MX', 'es-PA', 'es-PY', 'es-UY', 'et-EE', 'fa-IR', 'fi-FI', 'fj-FJ', 'fo-FO', 'fr-BE', 'fr-FR', 'fr-GF', 'fr-GP', 'fr-MQ', 'fr-RE', 'ga-IE', 'he-IL', 'hu-HU', 'id-ID', 'it-IT', 'it-SM', 'ja-JP', 'ka-GE', 'kk-KZ', 'kl-GL', 'ko-KR', 'lt-LT', 'ms-MY', 'nb-NO', 'ne-NP', 'nl-BE', 'nl-NL', 'nn-NO', 'pl-PL', 'pt-BR', 'pt-PT', 'ro-RO', 'ru-RU', 'sl-SI', 'sk-SK', 'sq-AL', 'sr-RS', 'sv-SE', 'th-TH', 'tr-TR', 'uk-UA', 'uz-UZ', 'vi-VN', 'zh-CN', 'zh-HK', 'zh-MO', 'zh-TW'] OR defaults to 'any'. If 'any' or a falsey value is used, the function will check if any of the locales match).

options is an optional object that can be supplied with the following keys: strictMode, if this is set to true, the mobile phone number must be supplied with the country code and therefore must start with +. Locale list is validator.isMobilePhoneLocales.
isMongoId() check if the string is a valid hex-encoded representation of a MongoDB ObjectId.
isMultibyte() check if the string contains one or more multi-byte chars.
isNumber() check if the string contains only numbers.
isNumeric([options]) check if the string contains only numbers.

options is an object which defaults to {no_symbols: false} it also has locale as an option. If no_symbols is true, the validator will reject numeric strings that feature a symbol (e.g. +, -, or .).

locale determine the decimal separator and is one of ['ar', 'ar-AE', 'ar-BH', 'ar-DZ', 'ar-EG', 'ar-IQ', 'ar-JO', 'ar-KW', 'ar-LB', 'ar-LY', 'ar-MA', 'ar-QA', 'ar-QM', 'ar-SA', 'ar-SD', 'ar-SY', 'ar-TN', 'ar-YE', 'bg-BG', 'cs-CZ', 'da-DK', 'de-DE', 'en-AU', 'en-GB', 'en-HK', 'en-IN', 'en-NZ', 'en-US', 'en-ZA', 'en-ZM', 'es-ES', 'fr-FR', 'hu-HU', 'it-IT', 'nb-NO', 'nl-NL', 'nn-NO', 'pl-PL', 'pt-BR', 'pt-PT', 'ru-RU', 'sl-SI', 'sr-RS', 'sr-RS@latin', 'sv-SE', 'tr-TR', 'uk-UA'].
isOctal() check if the string is a valid octal number.
isPath() check if the string is a valid path.
isPassportNumber(countryCode) check if the string is a valid passport number.

(countryCode is one of [ 'AM', 'AR', 'AT', 'AU', 'BE', 'BG', 'BY', 'CA', 'CH', 'CN', 'CY', 'CZ', 'DE', 'DK', 'DZ', 'EE', 'ES', 'FI', 'FR', 'GB', 'GR', 'HR', 'HU', 'IE', 'IN', 'IS', 'IT', 'JP', 'KR', 'LT', 'LU', 'LV', 'MT', 'NL', 'PO', 'PT', 'RO', 'RU', 'SE', 'SL', 'SK', 'TR', 'UA', 'US' ]).
isPort() check if the string is a valid port number.
isPostalCode(locale) check if the string is a postal code,

(locale is one of [ 'AD', 'AT', 'AU', 'AZ', 'BE', 'BG', 'BR', 'BY', 'CA', 'CH', 'CZ', 'DE', 'DK', 'DO', 'DZ', 'EE', 'ES', 'FI', 'FR', 'GB', 'GR', 'HR', 'HT', 'HU', 'ID', 'IE', 'IL', 'IN', 'IR', 'IS', 'IT', 'JP', 'KE', 'LI', 'LT', 'LU', 'LV', 'MT', 'MX', 'MY', 'NL', 'NO', 'NP', 'NZ', 'PL', 'PR', 'PT', 'RO', 'RU', 'SA', 'SE', 'SG', 'SI', 'TH', 'TN', 'TW', 'UA', 'US', 'ZA', 'ZM' ] OR 'any'. If 'any' is used, the function will check if any of the locales match. Locale list is validator.isPostalCodeLocales.).
isRFC3339() check if the string is a valid RFC 3339 date.
isRgbColor([includePercentValues]) check if the string is a rgb or rgba color.

includePercentValues defaults to true. If you don't want to allow to set rgb or rgba values with percents, like rgb(5%,5%,5%), or rgba(90%,90%,90%,.3), then set it to false.
isSemVer() check if the string is a Semantic Versioning Specification (SemVer).
isString() check if the string is a string.
isStringList() check if the string is a comma-separated list of strings.
isSurrogatePair() check if the string contains any surrogate pairs chars.
isUppercase() check if the string is uppercase.
isSlug() check if the string is of type slug. Options allow a single hyphen between strings, e.g. [cn-cn, cn-c-c].
isStrongPassword([options]) Check if a password is strong or not. Allows for custom requirements or scoring rules. If returnScore is true, then the function returns an integer score for the password rather than a boolean.
Default options:
{ minLength: 8, minLowercase: 1, minUppercase: 1, minNumbers: 1, minSymbols: 1, returnScore: false, pointsPerUnique: 1, pointsPerRepeat: 0.5, pointsForContainingLower: 10, pointsForContainingUpper: 10, pointsForContainingNumber: 10, pointsForContainingSymbol: 10 }
isTaxID(locale) Check if the given value is a valid Tax Identification Number. Default locale is en-US
isURL([options]) check if the string is a URL.

options is an object which defaults to { protocols: ['http','https','ftp'], require_tld: true, require_protocol: false, require_host: true, require_valid_protocol: true, allow_underscores: false, host_whitelist: false, host_blacklist: false, allow_trailing_dot: false, allow_protocol_relative_urls: false, disallow_auth: false }.

require_protocol - if set as true isURL will return false if protocol is not present in the URL.
require_valid_protocol - isURL will check if the URL's protocol is present in the protocols option.
protocols - valid protocols can be modified with this option.
require_host - if set as false isURL will not check if host is present in the URL.
require_port - if set as true isURL will check if port is present in the URL.
allow_protocol_relative_urls - if set as true protocol relative URLs will be allowed.
validate_length - if set as false isURL will skip string length validation (2083 characters is IE max URL length).
isUUID([version]) check if the string is a UUID (version 3, 4 or 5).
isVariableWidth() check if the string contains a mixture of full and half-width chars.
isWhitelisted(chars) checks characters if they appear in the whitelist.

Wait

Wait is used to make the process wait for the given number of seconds.

import { wait } from 'node-server-engine';

// This will make the process wait for 10 mins
await wait(600);
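Such a helper is typically a thin Promise wrapper around setTimeout. This is a hypothetical minimal sketch, not the engine's actual source:

```javascript
// Hypothetical sketch of wait(): resolve a Promise after `seconds` seconds.
function wait(seconds) {
  return new Promise((resolve) => setTimeout(resolve, seconds * 1000));
}

// Usage: pause briefly, e.g. between retries of a flaky operation.
async function demo() {
  const start = Date.now();
  await wait(1); // one second
  console.log(`resumed after ~${Date.now() - start} ms`);
}

demo();
```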

Development

Prerequisites

  • Node.js 18.x or higher
  • npm 9.x or higher
  • Git

Setup

# Clone the repository
git clone https://github.com/prakashmahendran/node-server-engine.git
cd node-server-engine

# Install dependencies
npm install

# Build the project
npm run build

# Run tests
npm test

# Run tests with coverage
npm run coverage

Scripts

Script Description
npm run build Compile TypeScript and generate distribution files
npm run lint Check code for linting errors
npm run lint:fix Automatically fix linting errors
npm run format Format code with Prettier
npm run format:check Check code formatting without modifying files
npm test Run all tests with coverage
npm run test:file Run a specific test file
npm run coverage Generate HTML coverage report
npm run coverage:ci Generate coverage summary for CI

Commit Convention

This project uses Conventional Commits for automated versioning and changelog generation.

Format: type(scope): subject

Types:

  • feat: New feature (triggers minor version bump)
  • fix: Bug fix (triggers patch version bump)
  • docs: Documentation changes
  • style: Code style changes (formatting, semicolons, etc.)
  • refactor: Code refactoring without feature changes
  • perf: Performance improvements
  • test: Adding or updating tests
  • chore: Maintenance tasks
  • ci: CI/CD changes

Examples:

git commit -m "feat(endpoints): add support for GraphQL endpoints"
git commit -m "fix(auth): resolve JWT token validation issue"
git commit -m "docs(readme): update installation instructions"

Breaking Changes: Add BREAKING CHANGE: in the commit body to trigger a major version bump:

git commit -m "feat(api): redesign authentication flow" -m "BREAKING CHANGE: AuthType enum values changed"

Automated Versioning

This project uses semantic-release for automated version management and package publishing.

How it works:

  1. Commits are analyzed based on conventional commit format
  2. Version number is automatically determined (major.minor.patch)
  3. Changelog is generated from commit messages
  4. Git tag is created
  5. Package is published to npm
  6. GitHub release is created

When releases happen:

  • Automatically on every push to main branch
  • Only if there are release-worthy commits (feat, fix, perf, BREAKING CHANGE)
  • Releases are skipped for commits with [skip ci] in the message

Contributing

We welcome contributions! Please follow these guidelines:

  1. Fork the repository and create your branch from main
  2. Follow the commit convention described above
  3. Write or update tests for your changes
  4. Ensure all tests pass before submitting
  5. Update documentation if needed
  6. Submit a pull request with a clear description

Code Style

  • This project uses ESLint and Prettier for code consistency
  • Run npm run lint:fix and npm run format before committing
  • Husky pre-commit hooks will automatically check your code

License

ISC © Ram

Support


Made with ❤️ for the Node.js community
