Postman API Testing Lab

A portfolio-grade API testing project built with Postman collections and Newman (the Postman CLI runner). This lab tests the api-homelab stack — a Kong-managed gateway with three FastAPI microservices.

Tests run automatically on every push via GitHub Actions and publish HTML reports as build artifacts.


What This Lab Demonstrates

  • Writing structured API tests with Postman (collections, folders, variables, scripts)
  • Using pre-request scripts to generate dynamic test data (unique emails, SKUs)
  • Chaining requests — saving a created resource ID and using it in subsequent tests
  • Asserting status codes, response shapes, field values, and response times
  • Negative testing — validating that the API correctly rejects invalid input
  • Testing Kong gateway features: authentication, rate limiting, and correlation IDs
  • Chaos engineering tests — simulating upstream failures and verifying recovery
  • Running collections from the CLI with Newman and generating HTML reports
  • Automating the full test suite with GitHub Actions CI

Prerequisites

| Tool | Version | Purpose |
|------|---------|---------|
| Node.js | ≥ 18 | Runs Newman |
| npm | ≥ 9 | Installs Newman |
| api-homelab | running locally | Target API stack |

Newman and the HTML reporter are installed locally by npm install; no global install is needed.


Quick Start

1. Start the target stack

The tests run against api-homelab. If you don't have it running:

git clone https://github.com/goweraa/api-homelab.git
cd api-homelab
cp .env.example .env
docker compose up -d --build
# Wait ~30s for health checks to pass
cd ..

2. Clone this repo and install Newman

git clone https://github.com/goweraa/postman-api-testing-lab.git
cd postman-api-testing-lab
npm install

3. Run the tests

# Run all three collections in sequence
npm test

# Or run a single collection
npm run test:users
npm run test:products
npm run test:weather

4. Open the HTML report

# Reports are written to reports/ after each run
open reports/users-api.html
open reports/products-api.html
open reports/weather-api.html

Project Structure

postman-api-testing-lab/
│
├── collections/
│   ├── users-api.collection.json       # 20 tests — user CRUD, auth, Kong features
│   ├── products-api.collection.json    # 18 tests — products, stock management, filters
│   └── weather-api.collection.json     # 18 tests — weather data, forecasts, chaos
│
├── environments/
│   ├── local.environment.json          # Points to http://localhost:8080
│   └── ci.environment.json             # Used by GitHub Actions (same values)
│
├── scripts/
│   └── wait-for-api.sh                 # Polls the gateway until it's ready (used in CI)
│
├── .github/
│   └── workflows/
│       └── api-tests.yml               # CI pipeline — starts api-homelab, runs tests, uploads reports
│
├── reports/                            # Generated at runtime — gitignored
├── package.json                        # Newman scripts
└── .env.example                        # Environment variable reference

Collections

Users API (collections/users-api.collection.json)

Happy Path — tests the full CRUD lifecycle in order:

| # | Request | Assertion highlights |
|---|---------|----------------------|
| 1 | GET /api/v1/users | Pagination envelope, required fields on each user |
| 2 | GET /api/v1/users?role=admin | All returned users have role: admin |
| 3 | GET /api/v1/users?department=Engineering | Partial department match works |
| 4 | GET /api/v1/users?page=1&page_size=2 | pages calculation is correct |
| 5 | POST /api/v1/users | 201 status, correct fields, saves created_user_id to variable |
| 6 | GET /api/v1/users/{{created_user_id}} | Returns the exact user just created |
| 7 | PUT /api/v1/users/{{created_user_id}} | Updated fields reflected; email unchanged |
| 8 | GET /api/v1/users/search/by-email | Exact match returns correct user |
| 9 | DELETE /api/v1/users/{{created_user_id}} | 204 with empty body |
| 10 | GET /api/v1/users/{{created_user_id}} | 404 after deletion (confirms soft-delete) |
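
Request chaining works by saving the id from the create response. Below is a minimal sketch of the "Create User" test-script logic, with pm mocked so it runs outside Postman (the 201 body shown is hypothetical):

```javascript
// Sketch of the "Create User" test script (request 5): save the new id so
// later requests can reference {{created_user_id}}. pm exists only inside
// Postman/Newman, so it is mocked here with a hypothetical 201 body.
const store = new Map();
const pm = {
  response: { json: () => ({ id: 42, email: "test-1700000000000@example.com" }) },
  collectionVariables: { set: (key, value) => store.set(key, value) },
};

const json = pm.response.json();
pm.collectionVariables.set("created_user_id", json.id);
```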

Negative Cases:

| Request | Expected | What's validated |
|---------|----------|------------------|
| GET /users/999999 | 404 | Non-existent ID returns a proper error |
| POST with alice@example.com | 409 | Duplicate email rejected with conflict detail |
| POST with not-a-valid-email | 422 | Invalid email format rejected |
| POST with role: superadmin | 422 | Invalid role value rejected |
| POST missing name field | 422 | Required-field validation works |
| GET search by unknown email | 404 | Email lookup returns 404, not an empty list |

Kong Gateway Features:

| Test | What's validated |
|------|------------------|
| No API key | 401 with error message |
| Wrong API key | 401 with error message |
| Valid request | X-Request-ID header present and is a valid UUID |
| Valid request | X-RateLimit-Limit-Minute: 60 header present |
| Valid request | API key value is not present anywhere in the response body (confirms Kong strips it) |
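
The X-Request-ID check can be done with a UUID regex. A sketch (a standard 8-4-4-4-12 UUID is assumed; the exact format depends on how Kong's correlation-id plugin is configured):

```javascript
// Sketch: checking that a correlation ID header is a well-formed UUID.
// A standard 8-4-4-4-12 hex UUID is assumed here; the exact format depends
// on the Kong correlation-id plugin's configuration.
const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function isUuid(value) {
  return UUID_RE.test(value);
}
// Inside Postman: pm.expect(isUuid(pm.response.headers.get("X-Request-ID"))).to.be.true;
```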

Products API (collections/products-api.collection.json)

Happy Path:

| # | Request | Assertion highlights |
|---|---------|----------------------|
| 1 | GET /api/v1/products | Seed data present (≥ 8 products), pagination, field types |
| 2 | GET /products?category=Networking | All results have category: Networking |
| 3 | GET /products?price_min=50&price_max=150 | All prices within the specified range |
| 4 | GET /products?q=SSD | Results contain "ssd" in name or description |
| 5 | GET /products?in_stock=true | All results have stock > 0 |
| 6 | GET /products/categories/list | Array of strings, sorted alphabetically, includes expected categories |
| 7 | POST /api/v1/products | 201 status, SKU auto-uppercased, saves created_product_id |
| 8 | GET /products/{{created_product_id}} | Correct product returned |
| 9 | PUT /products/{{created_product_id}} | Price and description updated; SKU immutable |
| 10 | PATCH /products/{{id}}/stock (delta: +10) | Stock increases from 5 → 15 |
| 11 | PATCH /products/{{id}}/stock (delta: -3) | Stock decreases from 15 → 12 |
| 12 | DELETE /products/{{created_product_id}} | 204 with empty body |
| 13 | GET /products/{{created_product_id}} | 404 after deletion |
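
The stock-delta assertions (requests 10–11) boil down to new stock = old stock + delta, with negative results rejected. A sketch of that rule (this mirrors the behavior the tests assert, not the service's actual code):

```javascript
// Sketch of the stock-delta rule the tests assert: new stock is old stock
// plus the delta, and a delta that would push stock negative is rejected
// (the API's 422 case). This mirrors expected behavior, not service code.
function applyDelta(stock, delta) {
  const next = stock + delta;
  if (next < 0) throw new RangeError("stock cannot go below zero");
  return next;
}
```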

Negative Cases:

| Request | Expected | What's validated |
|---------|----------|------------------|
| GET /products/999999 | 404 | Non-existent product |
| POST with SKU RPI5-8GB (exists in seed) | 409 | Duplicate SKU rejected |
| POST with price: -10.00 | 422 | Negative price rejected |
| PATCH stock delta: -99999 on a product | 422 | Stock cannot go below zero |
| GET /products (no API key) | 401 | Auth enforced on products too |

Weather API (collections/weather-api.collection.json)

Happy Path:

| # | Request | Assertion highlights |
|---|---------|----------------------|
| 1 | GET /weather/cities/supported | ≥ 8 cities, each has slug/name/country, expected cities present |
| 2 | GET /weather/london | Shape, temperature range (-60 to 60°C), humidity (0–100%) |
| 3 | GET /weather/kigali | Country is Rwanda, tropical temperature range |
| 4 | GET /weather/tokyo/forecast | 5 days, each day has date/temp_high/temp_low/condition, high ≥ low |
| 5 | GET /weather/berlin/forecast?days=14 | Exactly 14 days returned |

Negative Cases:

| Request | Expected | What's validated |
|---------|----------|------------------|
| GET /weather/atlantis | 404 | Unsupported city slug rejected |
| GET /weather/london/forecast?days=99 | 422 | days exceeds the maximum of 14 |
| GET /weather/london (no API key) | 401 | Auth enforced |

Chaos Engineering:

These tests must run in order; the chaos toggle requests depend on each other.

| # | Request | Assertion |
|---|---------|-----------|
| 1 | GET /chaos/slow?delay=3 | 200 status, responseTime > 2900 ms |
| 2 | GET /chaos/error?code=500 | Exactly 500 returned |
| 3 | GET /chaos/error?code=503 | Exactly 503 returned |
| 4 | GET /chaos/error?code=429 | Exactly 429 returned |
| 5 | GET /chaos/flaky?fail_rate=0.5 | Accepts either 200 or 503 (non-deterministic) |
| 6 | POST /chaos/toggle | chaos_mode_enabled: true in response |
| 7 | GET /weather/london | 503 — all weather endpoints fail during chaos |
| 8 | POST /chaos/toggle | chaos_mode_enabled: false in response |
| 9 | GET /weather/london | 200 — service recovers after chaos is disabled |
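
The flaky-endpoint test (request 5) illustrates a useful pattern for non-deterministic responses: assert membership in a small set of acceptable statuses instead of pinning one value. A sketch:

```javascript
// Sketch of the either-or assertion used for the flaky endpoint: the
// response is non-deterministic, so the test accepts any status from a
// small allowed set rather than a single value.
const ACCEPTABLE = [200, 503];

function flakyStatusOk(status) {
  return ACCEPTABLE.includes(status);
}
// Inside Postman: pm.expect(ACCEPTABLE).to.include(pm.response.code);
```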

GitHub Actions CI

The workflow in .github/workflows/api-tests.yml runs on every push to main and every pull request.

What it does:

  1. Checks out this repo and the api-homelab repo
  2. Generates a test .env for the api-homelab stack (CI-safe credentials)
  3. Starts the full 10-service api-homelab stack with docker compose up -d --build
  4. Waits up to 120 seconds for the gateway health check to pass
  5. Installs Newman
  6. Runs all three collections (continue-on-error: true so all collections run even if one fails)
  7. Uploads the HTML reports as a downloadable build artifact (retained for 30 days)
  8. Tears down and cleans all Docker volumes

Viewing reports: After a workflow run completes, go to the run on GitHub → Artifacts → download api-test-reports-<run-number>.zip. Unzip and open the .html files in a browser.


How Variables Work

The collections use collection variables to chain requests together. This avoids hardcoding IDs that change between test runs.

Users collection:

| Variable | Set by | Used by |
|----------|--------|---------|
| test_email | Pre-request script on "Create User" (Date.now() timestamp) | Create User body, Search by Email |
| created_user_id | Test script on "Create User" (saves json.id) | Get, Update, Delete requests |
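
The test_email pre-request script uses a timestamp so the address is unique on every run, which keeps the duplicate-email (409) negative case from firing accidentally. A sketch (the "test-" prefix and example.com domain are assumptions; the table above only specifies the Date.now() timestamp):

```javascript
// Sketch of the test_email pre-request logic: a Date.now() timestamp makes
// each run's address unique. The "test-" prefix and example.com domain are
// assumptions; the collection only guarantees the timestamp part.
function makeTestEmail() {
  return `test-${Date.now()}@example.com`;
}
// Inside Postman: pm.collectionVariables.set("test_email", makeTestEmail());
```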

Products collection:

| Variable | Set by | Used by |
|----------|--------|---------|
| test_sku | Pre-request script on "Create Product" (TEST-<timestamp>) | Create Product body |
| created_product_id | Test script on "Create Product" (saves json.id) | Get, Update, Stock, Delete requests |

Running Against a Different Environment

To test a staging or remote instance, duplicate environments/local.environment.json, change base_url, and pass it with the -e flag:

newman run collections/users-api.collection.json \
  -e environments/staging.environment.json \
  --reporters cli,htmlextra \
  --reporter-htmlextra-export reports/staging-users.html

Newman Reporter Output

Each npm run test:* command produces two files in reports/:

| File | Format | Use |
|------|--------|-----|
| *.html | Interactive HTML (htmlextra) | Browser-based report with pass/fail breakdown, request/response bodies |
| *.json | Machine-readable JSON | Can be parsed for custom dashboards or badge generation |
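
The JSON report can feed a badge or dashboard. A sketch of extracting a pass/fail summary, assuming Newman's run.stats.assertions summary shape (verify against your actual reports/*.json file before relying on it):

```javascript
// Sketch: summarizing a Newman JSON report for a badge. Newman's run
// summary exposes run.stats.assertions with total/failed counts; that
// shape is an assumption to verify against your actual report file.
function summarize(report) {
  const { total, failed } = report.run.stats.assertions;
  return { passed: total - failed, failed, ok: failed === 0 };
}

// Minimal report-shaped example:
const sample = { run: { stats: { assertions: { total: 56, failed: 2 } } } };
const summary = summarize(sample); // { passed: 54, failed: 2, ok: false }
```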

The HTML report from newman-reporter-htmlextra shows:

  • Total requests, assertions passed/failed
  • Per-folder and per-request breakdown
  • Full request and response details for failed tests
  • Response time graphs
