Test your data pipelines like you mean it.
A collection of GitHub Actions for building, testing, and managing cloud-native data workflows on AWS and GCP. Stop mocking cloud services—spin them up for real in your CI/CD pipeline.
Production-grade GitHub Actions that let you:
- Create real infrastructure in your workflows (SQS, SNS, Pub/Sub, Firehose, Glue)
- Test against actual services using LocalStack, Pub/Sub Emulator, or real cloud environments
- Clean up automatically with matching delete actions
- Move fast with simple, focused actions that do one thing well
Want context-aware editing with inline documentation? Try gate.predictr.io - a specialized editor for GitHub Actions workflows with built-in parameter documentation and action discovery.
"Why not just run aws or gcloud commands in a script step?"
Good question! Here's why actions are better:
- Type-safe inputs - Catch mistakes before runtime
- Clear error messages - Know exactly what went wrong, no cryptic cloud provider errors
- Self-documenting - Your workflow explains what it does
- IDE support - Autocomplete and validation in your editor
```yaml
# ❌ Script approach - hard to read, easy to break
- run: |
    aws sqs create-queue --queue-name test-queue \
      --attributes VisibilityTimeout=60,MessageRetentionPeriod=345600 \
      --tags Environment=test,Team=backend
    QUEUE_URL=$(aws sqs get-queue-url --queue-name test-queue --query 'QueueUrl' --output text)
    echo "queue_url=$QUEUE_URL" >> $GITHUB_OUTPUT
```
```yaml
# ✅ Action approach - obvious and maintainable
- uses: predictr-io/aws-sqs-create-queue@v0
  id: queue
  with:
    queue-name: 'test-queue'
    visibility-timeout: '60'
    tags: '{"Environment": "test", "Team": "backend"}'
```

- Same patterns across all cloud services (AWS & GCP)
- No need to remember CLI syntax for each service
- Copy examples from README, adapt, done
- Works the same way in every repository
- Proper error handling and retries
- Input validation before hitting cloud APIs
- Output formatting ready for next steps (see the sketch after this list)
- Emulator/LocalStack support out of the box
- Actions use cloud SDKs directly (faster, smaller)
- No need to install/update CLI tools
- Consistent SDK versions
- Better performance in GitHub Actions runners
- Manually tested against real cloud services and emulators
- Bugs fixed once, benefit everyone
- Version pinning for stability
- Active maintenance and updates
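For example, the `queue-url` output from the create step above feeds straight into the next action with no parsing. A minimal sketch, assuming `aws-sqs-send-message` accepts `queue-url` and `message-body` inputs (names not confirmed here; check that action's README):

```yaml
# Hypothetical follow-up step; input names are assumptions
- uses: predictr-io/aws-sqs-send-message@v0
  with:
    queue-url: ${{ steps.queue.outputs.queue-url }}
    message-body: '{"event": "pipeline-test"}'
```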
- aws-cloudwatch-put-metrics - Publish custom metrics (track deployments, build times, test results)
- aws-sqs-create-queue - Create standard & FIFO queues
- aws-sqs-send-message - Send messages with attributes & deduplication
- aws-sqs-delete-queue - Clean up test queues
- aws-sns-create-topic - Create standard & FIFO topics (see the SNS sketch after this list)
- aws-sns-send-message - Publish messages to topics
- aws-sns-delete-topic - Clean up test topics
- gcp-pubsub-create-topic - Create Pub/Sub topics with labels, retention, schemas
- gcp-pubsub-delete-topic - Delete Pub/Sub topics
- gcp-pubsub-create-subscription - Create pull/push subscriptions
- gcp-pubsub-delete-subscription - Delete subscriptions
- gcp-pubsub-send-message - Publish messages to topics
- aws-firehose-create-stream - Create delivery streams (simplified for testing)
- aws-firehose-send-message - Send records to streams
- aws-firehose-delete-stream - Clean up test streams
- aws-glue-crawler - Run crawlers to discover & catalog data
- aws-glue-partition-manager - Add, delete, check partition existence
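The SNS trio follows the same create/send/delete shape as the SQS actions. A minimal round-trip sketch; the input and output names here are assumptions modeled on the SQS actions, not confirmed by this README, and credentials/endpoint env vars are set as in the workflow examples below:

```yaml
# Input/output names below are assumptions modeled on the SQS actions
- name: Create SNS topic
  id: topic
  uses: predictr-io/aws-sns-create-topic@v0
  with:
    topic-name: 'test-topic'

- name: Publish test message
  uses: predictr-io/aws-sns-send-message@v0
  with:
    topic-arn: ${{ steps.topic.outputs.topic-arn }}
    message: '{"event": "pipeline-test"}'

- name: Clean up topic
  if: always()
  uses: predictr-io/aws-sns-delete-topic@v0
  with:
    topic-arn: ${{ steps.topic.outputs.topic-arn }}
```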
```yaml
name: Test Data Pipeline
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      localstack:
        image: localstack/localstack
        ports:
          - 4566:4566
        env:
          SERVICES: sqs,sns

    steps:
      - uses: actions/checkout@v4

      # Create test infrastructure
      - name: Create SQS queue
        id: queue
        uses: predictr-io/aws-sqs-create-queue@v0
        env:
          AWS_ENDPOINT_URL: http://localhost:4566
          AWS_ACCESS_KEY_ID: test
          AWS_SECRET_ACCESS_KEY: test
          AWS_DEFAULT_REGION: us-east-1
        with:
          queue-name: 'test-queue'

      # Run your tests
      - name: Test pipeline
        run: |
          export QUEUE_URL="${{ steps.queue.outputs.queue-url }}"
          npm test

      # Clean up (always runs)
      - name: Delete test queue
        if: always()
        uses: predictr-io/aws-sqs-delete-queue@v0
        env:
          AWS_ENDPOINT_URL: http://localhost:4566
          AWS_ACCESS_KEY_ID: test
          AWS_SECRET_ACCESS_KEY: test
          AWS_DEFAULT_REGION: us-east-1
        with:
          queue-url: ${{ steps.queue.outputs.queue-url }}
```

The same pattern works against the GCP Pub/Sub emulator:

```yaml
name: Test GCP Pipeline
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      pubsub-emulator:
        image: gcr.io/google.com/cloudsdktool/google-cloud-cli:emulators
        ports:
          - 8085:8085

    steps:
      - uses: actions/checkout@v4

      # Create Pub/Sub topic
      - name: Create topic
        id: topic
        uses: predictr-io/gcp-pubsub-create-topic@v0
        env:
          PUBSUB_EMULATOR_HOST: localhost:8085
        with:
          project-id: 'test-project'
          topic-name: 'test-topic'

      # Create subscription
      - name: Create subscription
        uses: predictr-io/gcp-pubsub-create-subscription@v0
        env:
          PUBSUB_EMULATOR_HOST: localhost:8085
        with:
          project-id: 'test-project'
          topic-name: 'test-topic'
          subscription-name: 'test-sub'

      # Run your tests
      - name: Test pipeline
        run: npm test
```

- Simple & Focused - Each action does one thing well
- Test-Friendly - Works with emulators and LocalStack out of the box
- Production-Ready - All actions work with real cloud environments too
- Clean Workflows - Matching create/delete actions for easy cleanup
- Multi-Cloud - Same patterns across AWS and GCP services
Found a bug? Want a new action? Contributions welcome!
- Open an issue to discuss your idea
- Submit a PR with tests
- Follow our existing patterns for consistency
All actions are MIT licensed. Use them freely!
Built with ☕ for teams who ship data products.
