62 changes: 62 additions & 0 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,62 @@
name: CI

on:
push:
branches: ["main", "chore/**", "feature/**", "fix/**"]
pull_request:

jobs:
tests:
runs-on: ubuntu-latest
env:
PGHOST: localhost
PGPORT: 5432
PGUSER: postgres
PGDATABASE: postgres
PGPASSWORD: postgres
services:
postgres:
image: postgres:16
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
POSTGRES_DB: postgres
ports:
- 5432:5432
options: >-
--health-cmd "pg_isready -U postgres"
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- name: Checkout
uses: actions/checkout@v4

- name: Install dependencies
run: |
sudo apt-get update
sudo apt-get install -y shellcheck jq postgresql-client

- name: Wait for PostgreSQL
run: |
for i in {1..10}; do
if pg_isready -h "$PGHOST" -p "$PGPORT" -U "$PGUSER"; then
exit 0
fi
sleep 2
done
echo "PostgreSQL did not become ready in time" >&2
exit 1

- name: ShellCheck automation scripts
run: shellcheck automation/*.sh

      # TODO: enable full test suite when stable
# - name: Fast automation test suite
# run: ./automation/test_pgtools.sh --fast

# - name: HOT checklist JSON validation
# run: ./automation/run_hot_update_report.sh --format json --database "$PGDATABASE" --stdout

# - name: HOT checklist text validation
# run: ./automation/run_hot_update_report.sh --format text --database "$PGDATABASE" --stdout
6 changes: 6 additions & 0 deletions CONTRIBUTING.md
@@ -22,6 +22,9 @@ git checkout -b feature/your-feature-name

# Test current scripts in your environment
./automation/test_pgtools.sh --database your_test_db

# Optional: run the full local pre-commit bundle
./scripts/precommit_checks.sh --database your_test_db
```

## Types of Contributions
@@ -154,6 +157,9 @@ psql -h localhost -p 5432 -U postgres -d postgres -f your_script.sql

# Test your specific changes
psql -d test_db -f your_new_script.sql

# Recommended: mirror CI locally
./scripts/precommit_checks.sh --database test_db
```

4. **Submit Pull Request**
File renamed without changes.
18 changes: 18 additions & 0 deletions README.md
@@ -243,6 +243,24 @@ psql -U postgres -d mydb -f backup/backup_validation.sql
psql -U postgres -d mydb -f monitoring/connection_pools.sql
```

### Automation / HOT report verification
```bash
# Quick automation sanity check (connection, syntax, permissions)
./automation/test_pgtools.sh --fast

# Full automation suite with integration tests
./automation/test_pgtools.sh --full --verbose

# HOT checklist JSON validation
./automation/run_hot_update_report.sh --format json --database my_database --stdout

# HOT checklist text validation
./automation/run_hot_update_report.sh --format text --database my_database --stdout

# Full local pre-commit bundle
./scripts/precommit_checks.sh --database my_database
```

## Script Categories

- **Monitoring** - Database health, locks, replication, bloating
36 changes: 36 additions & 0 deletions automation/README.md
@@ -12,6 +12,8 @@ This directory contains automation scripts for pgtools:
- `cleanup_reports.sh` - Report cleanup and log rotation
- `export_metrics.sh` - Metrics export for monitoring systems
- `test_pgtools.sh` - Testing framework and validation
- `run_hot_update_report.sh` - HOT update checklist (text or JSON, reads connection defaults from pgtools.conf)
- `scripts/precommit_checks.sh` - Local helper mirroring CI sanity checks
- `pgtools.conf.example` - Configuration template

## Quick Start
@@ -25,3 +27,37 @@ cp automation/pgtools.conf.example automation/pgtools.conf
```

For detailed usage and configuration options, please refer to the complete documentation linked above.

## Verification commands

Run these before committing changes to automation scripts or HOT reporting logic:

```bash
# Quick sanity check (connection, syntax, permissions)
./automation/test_pgtools.sh --fast

# Full automation suite with integration tests
./automation/test_pgtools.sh --full --verbose

# Verify HOT JSON workflow
./automation/run_hot_update_report.sh --format json --database my_database --stdout

# Verify HOT text workflow
./automation/run_hot_update_report.sh --format text --database my_database --stdout

# Full local bundle (shellcheck + automation + HOT)
./scripts/precommit_checks.sh --database my_database
```

## Connection configuration

Most automation scripts, including `run_hot_update_report.sh`, source `automation/pgtools.conf` for their database settings.

1. Copy the template: `cp automation/pgtools.conf.example automation/pgtools.conf`.
2. Populate standard libpq variables (`PGHOST`, `PGPORT`, `PGUSER`, `PGDATABASE`, and optionally `PGPASSWORD` or `~/.pgpass`).
3. Override as needed:
- Command-line flags have highest priority (`--database analytics`).
- Environment variables (e.g., `PGHOST=staging-db`) override the config.
- Values in `pgtools.conf` act as defaults when nothing else is provided.

This precedence keeps existing automation jobs stable while still letting ad-hoc runs target alternate servers or databases.
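
The flag > environment > config precedence described above can be sketched as a small resolution function. The names below are purely illustrative (`resolve_database` is a hypothetical helper; the actual scripts inline this logic):

```shell
#!/usr/bin/env bash
# Sketch of the precedence used by the automation scripts:
# command-line flag, then environment, then pgtools.conf default.
# resolve_database is a hypothetical helper, not part of pgtools.

resolve_database() {
  local cli_flag="${1:-}"       # value of a --database flag, if given
  local conf_default="${2:-}"   # value sourced from pgtools.conf

  if [[ -n "$cli_flag" ]]; then
    echo "$cli_flag"            # 1. command-line flag wins
  elif [[ -n "${PGDATABASE:-}" ]]; then
    echo "$PGDATABASE"          # 2. then the environment
  else
    echo "$conf_default"        # 3. then the config-file default
  fi
}

unset PGDATABASE
resolve_database "analytics" "postgres"              # -> analytics
PGDATABASE=staging resolve_database "" "postgres"    # -> staging
resolve_database "" "postgres"                       # -> postgres
```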
11 changes: 8 additions & 3 deletions automation/cleanup_reports.sh
@@ -31,6 +31,8 @@ COMPRESS_OLD="true"
# Load configuration
CONFIG_FILE="$SCRIPT_DIR/pgtools.conf"
if [[ -f "$CONFIG_FILE" ]]; then
# shellcheck disable=SC1091
# shellcheck source=pgtools.conf
source "$CONFIG_FILE"
KEEP_DAYS="${PGTOOLS_KEEP_REPORTS_DAYS:-$KEEP_DAYS}"
fi
@@ -180,7 +182,8 @@ clean_directory() {
while IFS= read -r -d '' file; do
((file_count++))
if command -v stat > /dev/null 2>&1; then
local size=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file" 2>/dev/null || echo "0")
local size
size=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file" 2>/dev/null || echo "0")
total_size=$((total_size + size))
fi

@@ -258,12 +261,14 @@ clean_cron_logs() {

# Keep only last 1000 lines of cron log
if [[ "$DRY_RUN" == "true" ]]; then
local current_lines=$(wc -l < "$cron_log")
local current_lines
current_lines=$(wc -l < "$cron_log")
if [[ "$current_lines" -gt 1000 ]]; then
log "Would truncate cron.log (currently $current_lines lines)"
fi
else
local temp_log=$(mktemp)
local temp_log
temp_log=$(mktemp)
tail -1000 "$cron_log" > "$temp_log" && mv "$temp_log" "$cron_log"
if [[ "$VERBOSE" == "true" ]]; then
log "Truncated cron.log to last 1000 lines"
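
The declare-then-assign pattern applied throughout this diff addresses ShellCheck SC2155: in `local var=$(cmd)`, the exit status of `cmd` is discarded because `local` itself returns 0. A minimal illustration:

```shell
#!/usr/bin/env bash
# Why the diff splits `local x=$(cmd)` into two statements (ShellCheck SC2155):
# `local` returns 0 regardless of the command substitution's status,
# so a failure inside $(...) is silently swallowed.

combined() {
  local rc
  local out=$(false)        # SC2155: `local` returns 0, hiding the failure
  rc=$?
  echo "$rc"
}

split() {
  local out rc=0
  out=$(false) || rc=$?     # failure is visible (guarded for errexit shells)
  echo "$rc"
}

combined   # prints 0 -- the error vanished
split      # prints 1 -- the error was caught
```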
9 changes: 6 additions & 3 deletions automation/export_metrics.sh
@@ -8,7 +8,6 @@
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PGTOOLS_ROOT="$(dirname "$SCRIPT_DIR")"

# Color codes
RED='\033[0;31m'
@@ -68,6 +67,8 @@ EOF
# Load configuration
CONFIG_FILE="$SCRIPT_DIR/pgtools.conf"
if [[ -f "$CONFIG_FILE" ]]; then
# shellcheck disable=SC1091
# shellcheck source=pgtools.conf
source "$CONFIG_FILE"
fi

@@ -129,7 +130,8 @@ check_database_connection() {

# Collect basic metrics
collect_metrics() {
local temp_file=$(mktemp)
local temp_file
temp_file=$(mktemp)

# Basic database metrics
psql -t -c "
@@ -254,7 +256,8 @@ format_json() {
# Format metrics for InfluxDB
format_influx() {
local metrics_file="$1"
local timestamp=$(date +%s)000000000 # nanoseconds
local timestamp
timestamp=$(date +%s)000000000 # nanoseconds

while IFS=$'\t' read -r metric value; do
if [[ -n "$metric" && -n "$value" ]]; then
50 changes: 34 additions & 16 deletions automation/pgtools_health_check.sh
@@ -152,6 +152,8 @@ mkdir -p "$OUTPUT_DIR"
# Load configuration if available
if [[ -f "$CONFIG_FILE" ]]; then
log "Loading configuration from $CONFIG_FILE"
# shellcheck disable=SC1091
# shellcheck source=pgtools.conf
source "$CONFIG_FILE"
else
warn "Configuration file not found: $CONFIG_FILE"
@@ -211,6 +213,7 @@ TIMESTAMP=$(date '+%Y%m%d_%H%M%S')
REPORT_PREFIX="pgtools_health_check_${TIMESTAMP}"

# Define monitoring scripts to run
# shellcheck disable=SC2034 # referenced via nameref when selecting script set
declare -A ESSENTIAL_SCRIPTS=(
["Connection Analysis"]="monitoring/connection_pools.sql"
["Lock Analysis"]="monitoring/locks.sql"
@@ -219,6 +222,7 @@ declare -A ESSENTIAL_SCRIPTS=(
["Backup Validation"]="backup/backup_validation.sql"
)

# shellcheck disable=SC2034 # referenced via nameref when selecting script set
declare -A FULL_SCRIPTS=(
["Table Bloating"]="monitoring/bloating.sql"
["Buffer Performance"]="monitoring/buffer_troubleshoot.sql"
@@ -240,12 +244,19 @@ run_health_checks() {
scripts_to_run="ESSENTIAL_SCRIPTS"
else
log "Running full health check"
scripts_to_run="FULL_SCRIPTS"

# Add essential scripts to full run
# shellcheck disable=SC2034 # referenced through nameref
local -A combined_scripts=()
local key

for key in "${!FULL_SCRIPTS[@]}"; do
# shellcheck disable=SC2034
combined_scripts["$key"]="${FULL_SCRIPTS[$key]}"
done
# shellcheck disable=SC2034
for key in "${!ESSENTIAL_SCRIPTS[@]}"; do
FULL_SCRIPTS["$key"]="${ESSENTIAL_SCRIPTS[$key]}"
combined_scripts["$key"]="${ESSENTIAL_SCRIPTS[$key]}"
done
scripts_to_run="combined_scripts"
fi

local -n scripts_ref=$scripts_to_run
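
The change above builds a temporary `combined_scripts` array and binds a nameref to it instead of mutating `FULL_SCRIPTS` in place. A self-contained sketch of that nameref technique (bash 4.3+; array names and contents here are illustrative):

```shell
#!/usr/bin/env bash
# Selecting one of several associative arrays via nameref (bash 4.3+),
# mirroring how run_health_checks picks the essential vs. merged set.
declare -A ESSENTIAL=( [locks]="monitoring/locks.sql" )
declare -A EXTRA=( [bloat]="monitoring/bloating.sql" )

# Merge both sets into a scratch array, leaving the sources untouched.
declare -A combined=()
for key in "${!ESSENTIAL[@]}"; do combined["$key"]="${ESSENTIAL[$key]}"; done
for key in "${!EXTRA[@]}"; do combined["$key"]="${EXTRA[$key]}"; done

# The nameref lets later code iterate one variable regardless of
# which array was chosen (use `local -n` inside a function).
declare -n selected=combined
echo "${#selected[@]} scripts selected"   # -> 2 scripts selected
echo "${selected[locks]}"                 # -> monitoring/locks.sql
```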
@@ -258,7 +269,8 @@
# Create individual output files
for script_name in "${!scripts_ref[@]}"; do
local script_path="${PGTOOLS_ROOT}/${scripts_ref[$script_name]}"
local output_file="${OUTPUT_DIR}/${REPORT_PREFIX}_$(echo "$script_name" | tr ' ' '_' | tr '[:upper:]' '[:lower:]').txt"
local output_file
output_file="${OUTPUT_DIR}/${REPORT_PREFIX}_$(echo "$script_name" | tr ' ' '_' | tr '[:upper:]' '[:lower:]').txt"

if [[ -f "$script_path" ]]; then
if run_script "$script_path" "$script_name" "$output_file"; then
@@ -318,12 +330,14 @@ EOF
# Append individual script outputs
for output_file in "${OUTPUT_DIR}/${REPORT_PREFIX}"_*.txt; do
if [[ -f "$output_file" ]]; then
echo "--- $(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g') ---" >> "$report_file"
echo >> "$report_file"
cat "$output_file" >> "$report_file"
echo >> "$report_file"
echo "=============================================================================" >> "$report_file"
echo >> "$report_file"
{
echo "--- $(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g') ---"
echo
cat "$output_file"
echo
echo "============================================================================="
echo
} >> "$report_file"
fi
done
}
Expand Down Expand Up @@ -360,11 +374,12 @@ EOF
# Process individual script outputs
for output_file in "${OUTPUT_DIR}/${REPORT_PREFIX}"_*.txt; do
if [[ -f "$output_file" ]]; then
local section_name=$(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g')
local section_name
section_name=$(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g')
cat >> "$report_file" << EOF
<div class="section">
<div class="section-title">$section_name</div>
<div class="content">$(cat "$output_file" | sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g')</div>
<div class="content">$(sed 's/&/\&amp;/g; s/</\&lt;/g; s/>/\&gt;/g' "$output_file")</div>
</div>
EOF
fi
@@ -396,8 +411,10 @@ EOF
fi
first_section=false

local section_name=$(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g')
local content=$(cat "$output_file" | sed 's/\\/\\\\/g; s/"/\\"/g' | sed ':a;N;$!ba;s/\n/\\n/g')
local section_name
section_name=$(basename "$output_file" .txt | sed 's/^.*_//; s/_/ /g')
local content
content=$(sed 's/\\/\\\\/g; s/"/\\"/g' "$output_file" | sed ':a;N;$!ba;s/\n/\\n/g')

cat >> "$report_file" << EOF
{
@@ -424,7 +441,8 @@
fi

local report_file="${OUTPUT_DIR}/${REPORT_PREFIX}_consolidated_report.${FORMAT}"
local subject="PostgreSQL Health Check Report - $DB_NAME - $(date '+%Y-%m-%d %H:%M')"
local subject
subject="PostgreSQL Health Check Report - $DB_NAME - $(date '+%Y-%m-%d %H:%M')"

log "Sending email notifications to: $EMAIL_RECIPIENTS"

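
The `content=$(sed ... | sed ...)` pipeline above escapes backslashes, quotes, and newlines for JSON by hand. Since the CI job installs `jq`, a more robust alternative (a sketch, not what the script currently does) is to let `jq -Rs` produce a fully escaped JSON string in one step:

```shell
#!/usr/bin/env bash
# Alternative to the hand-rolled sed escaping: jq -Rs reads raw input (-R),
# slurps it into a single string (-s), and emits it as a quoted,
# fully escaped JSON value -- including newlines and backslashes.
set -euo pipefail

printf 'line "one"\nline \\two\n' | jq -Rs .
# -> "line \"one\"\nline \\two\n"
```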
11 changes: 8 additions & 3 deletions automation/pgtools_scheduler.sh
@@ -10,7 +10,6 @@
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PGTOOLS_ROOT="$(dirname "$SCRIPT_DIR")"

# Color codes
RED='\033[0;31m'
@@ -62,11 +61,15 @@ EOF
# Load configuration
CONFIG_FILE="$SCRIPT_DIR/pgtools.conf"
if [[ -f "$CONFIG_FILE" ]]; then
# shellcheck disable=SC1091
# shellcheck source=pgtools.conf
source "$CONFIG_FILE"
else
warn "Configuration file not found: $CONFIG_FILE"
warn "Using defaults and example configuration"
if [[ -f "$SCRIPT_DIR/pgtools.conf.example" ]]; then
# shellcheck disable=SC1091
# shellcheck source=pgtools.conf.example
source "$SCRIPT_DIR/pgtools.conf.example"
fi
fi
@@ -108,7 +111,8 @@ install_cron_jobs() {
fi

# Generate temporary cron file
local temp_cron=$(mktemp)
local temp_cron
temp_cron=$(mktemp)

# Get existing crontab (excluding pgtools entries)
if crontab -l > /dev/null 2>&1; then
@@ -147,7 +151,8 @@ remove_cron_jobs() {
crontab -l > "$SCRIPT_DIR/crontab.backup.$(date +%Y%m%d_%H%M%S)"

# Generate temporary cron file without pgtools entries
local temp_cron=$(mktemp)
local temp_cron
temp_cron=$(mktemp)
crontab -l | grep -v "PostgreSQL Tools" | grep -v "pgtools" > "$temp_cron" || true

# Install cleaned crontab