A comprehensive automated vulnerability scanner with AI-powered recursive investigation, subdomain enumeration, and vulnerability assessment
Features • Installation • Recent Updates • Configuration • Usage
- Project Overview
- Recent Major Updates
- Key Features
- Architecture
- Security Tools Integrated
- Prerequisites
- Quick Start
- Asset Management System
- Authentication System
- API Keys Setup
- Configuration
- Subfinder Pipeline Setup
- ZAP Setup
- Running the Application
- Development Features
- Troubleshooting
This Vulnerability Scanner is a full-stack application designed for comprehensive security reconnaissance and vulnerability assessment. It combines multiple industry-standard security tools with AI-powered analysis to provide deep insights into your target's security posture.
Frontend:
- ⚛️ React 18 + TypeScript
- 🎨 TailwindCSS for modern UI
- 📱 Vite 6.3.5 for fast builds with HMR
- 🗺️ Mapbox GL for geolocation visualization
- 🔔 react-hot-toast for notifications
- 🎯 lucide-react for icons
Backend:
- 🐍 Python 3.11 + Flask 3.1.0
- 🔐 JWT authentication with HTTP-only cookies
- 🗄️ MongoDB for data persistence
- 🔄 Celery for asynchronous task execution
- 📨 Redis as message broker
- 📊 Real-time scan streaming
DevOps:
- 🐳 Docker + Docker Compose
- 🔄 Hot-reload enabled for development
- 🌐 Multi-container architecture
- 📦 Environment-aware tool execution
⚠️ LEGAL WARNING: This tool is intended for authorized security testing only. Scanning targets without explicit permission is illegal and unethical. Always obtain proper authorization before conducting any security assessments.
New Features:
- Full CRUD Operations: Create, Read, Update, Delete assets
- Dark Theme UI: Matching application's #1a1f3a background with #252b48 containers
- Click-to-View Modal: Detailed asset information display with organized sections
- Sidebar Navigation: Quick access to all saved assets with counts
- Specific API Keys: Separate fields for Shodan and FOFA keys with hints
- Optional Field Labels: Clear indication of required vs optional fields
Technical Implementation:

Frontend: Frontend/src/components/asset/AssetForm.tsx (627 lines)
- Cookie-based authentication using credentials: 'include'
- Comprehensive null/undefined safety checks
- Fallback handling for legacy database schemas
- Real-time asset list updates after operations

Backend: Backend/main.py
- POST /api/assets (lines 489-540): Create assets with validation
- GET /api/assets (lines 543-558): Fetch user's assets
- DELETE /api/assets/<id> (lines 559-580): Delete with authorization
- All endpoints use the @jwt_required() decorator
Data Structure:
{
"_id": ObjectId,
"user_id": String,
"companyName": String (required),
"domains": Array<String>,
"ipAddresses": Array<String>,
"endpoints": Array<String>,
"shodanKey": String (optional),
"fofaKey": String (optional),
"created_at": DateTime,
"updated_at": DateTime
}

Previous Issue: The application was looking for localStorage tokens that don't exist.
Solution:
- ✅ Converted to HTTP-only cookie authentication
- ✅ All fetch requests now use credentials: 'include'
- ✅ Removed localStorage token logic completely
- ✅ Automatic cookie management by browser
Files Modified:
- Frontend/src/components/asset/AssetForm.tsx: All API calls use cookies
- Frontend/src/context/AuthContext.tsx: Uses the /auth/me endpoint
- Backend/main.py: JWT stored in HTTP-only cookies
Benefits:
- 🛡️ More secure (XSS-resistant)
- 🔄 Automatic token refresh
- 🚫 No manual token management needed
All Tools Now Fully Functional:
| Tool | Version | Purpose | Status |
|---|---|---|---|
| Nmap | 7.95 | Port scanning & service detection | ✅ Working |
| Nikto | Latest | Web server vulnerability scanning | ✅ Working |
| SQLMap | Latest | SQL injection detection | ✅ Working |
| FFUF | 2.1.0 | Directory bruteforcing | ✅ Working |
| testssl.sh | Latest | SSL/TLS security testing | ✅ Working |
| WPScan | Latest | WordPress vulnerability scanning | ✅ Working |
| WhatWeb | Latest | Technology fingerprinting | ✅ Working |
| Subfinder | Latest | Subdomain enumeration | ✅ Working |
| DNSx | Latest | DNS validation | ✅ Working |
| HTTPx | Latest | HTTP probing | ✅ Working |
| Nuclei | Latest | Vulnerability scanning | ✅ Working |
Installation Methods:
# Dockerfile.backend
# System packages (apt)
RUN apt update && apt install -y \
curl unzip git wget \
ruby ruby-dev \
whatweb \
dnsutils iputils-ping nmap
# GitHub installations
RUN git clone --depth 1 https://github.com/sullo/nikto /opt/nikto
RUN git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git /opt/sqlmap
RUN git clone --depth 1 https://github.com/drwetter/testssl.sh.git /opt/testssl.sh
# Binary installations
RUN curl -L https://github.com/ffuf/ffuf/releases/download/v2.1.0/ffuf_2.1.0_linux_amd64.tar.gz | \
tar -xzf - -C /usr/local/bin

Cross-Platform Tool Execution:
Created Backend/utils/automated_tools/env_helper.py for automatic environment detection:
import os

def is_docker():
    """Detects if running in a Docker container."""
    if os.path.exists('/.dockerenv'):
        return True
    try:
        with open('/proc/1/cgroup', 'r') as f:
            return 'docker' in f.read()
    except FileNotFoundError:
        return False

def get_command_prefix():
    """Returns the appropriate command prefix for the current environment."""
    if is_docker():
        return []  # Direct execution inside the container
    elif os.name == 'nt':
        return ['wsl', '-d', 'Ubuntu-22.04']  # Windows via WSL
    else:
        return []  # Native Linux

Benefits:
- ✅ Works in Docker containers
- ✅ Works on Windows with WSL
- ✅ Works on native Linux
- ✅ No hardcoded paths
- ✅ Automatic path detection
Files Using Environment Detection:
- Backend/nmap.py: Nmap execution with 300s timeout
- Backend/ffuf.py: FFUF path detection
- Backend/utils/automated_tools/wpscan.py: WPScan wrapper
- Backend/utils/automated_tools/nikto.py: Nikto wrapper
- Backend/utils/automated_tools/testssl_runner.py: testssl.sh wrapper
- Backend/utils/automated_tools/sqlmap.py: SQLMap wrapper (NEW)
New File: Backend/utils/automated_tools/sqlmap.py
Features:
- ✅ Environment-aware execution
- ✅ 120-second timeout with threading
- ✅ ANSI color code stripping
- ✅ Intelligent result summarization
- ✅ Vulnerability detection in output
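The ANSI-stripping step can be illustrated with a small sketch. This regex-based approach is an assumption for illustration, not necessarily the wrapper's exact implementation:

```python
import re

# Matches common 7-bit ANSI escape sequences (colors, cursor moves).
ANSI_RE = re.compile(r'\x1b\[[0-9;]*[A-Za-z]')

def strip_ansi(text: str) -> str:
    """Remove ANSI color codes so SQLMap output can be parsed and stored cleanly."""
    return ANSI_RE.sub('', text)

print(strip_ansi('\x1b[1;32m[INFO]\x1b[0m testing connection'))  # → [INFO] testing connection
```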
Usage:
from Backend.utils.automated_tools.sqlmap import run_sqlmap_scan
result = run_sqlmap_scan("sqlmap -u 'https://example.com' --batch --risk=1")

Integration:
- Added to Backend/utils/tool_executor.py
- Accessible via LLM command suggestions
- Automatic timeout handling
Nmap Timeout Extended:
- Before: 60 seconds (too short for comprehensive scans)
- After: 300 seconds (5 minutes)
- File: Backend/nmap.py, line ~35
Why This Matters:
- Large networks need time to scan
- Prevents false "timeout" errors
- Allows thorough service detection
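As a hedged sketch (not the repo's actual Backend/nmap.py code), a timeout-aware tool wrapper could look like this:

```python
import subprocess

def run_tool(cmd: list[str], timeout: int = 300) -> str:
    """Run an external scanner with a hard timeout (300s matches the new Nmap limit)."""
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        return result.stdout
    except subprocess.TimeoutExpired:
        # Surface a clear message instead of a stack trace on slow targets
        return f"[!] command timed out after {timeout}s: {' '.join(cmd)}"
```

Usage would be along the lines of `run_tool(['nmap', '-sV', 'example.com'])`.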
Enabled Flask Debug Mode:
# Dockerfile.backend
ENV FLASK_ENV=development
ENV FLASK_DEBUG=1

In main.py:

app.run(host="0.0.0.0", port=5000, debug=True)

Benefits:
- ✅ Code changes reload automatically
- ✅ No need to restart containers
- ✅ Faster development cycle
- ✅ Better error messages
New File: Frontend/src/config/api.ts
Purpose: Single source of truth for API endpoints
export const API_CONFIG = {
baseURL: import.meta.env.VITE_API_BASE_URL || 'http://localhost:5000'
};
// Usage in components:
fetch(`${API_CONFIG.baseURL}/api/assets`)

Files Using API_CONFIG:
- Frontend/src/components/asset/AssetForm.tsx
- Frontend/src/components/scan/*.tsx
- Frontend/src/context/AuthContext.tsx
Benefits:
- ✅ Easy environment switching
- ✅ No hardcoded URLs
- ✅ Type-safe imports
Complete Asynchronous Task Processing System
Infrastructure Added:
- ✅ Redis - Message broker on port 6379
- ✅ Celery Worker - Background task executor (12 concurrent workers)
- ✅ Celery Beat - Task scheduler for periodic scans
Docker Services:
# docker-compose.yml
redis:
image: redis:7-alpine
ports:
- "6379:6379"
celery-worker:
command: celery -A tasks worker --loglevel=info
depends_on:
- redis
- mongo
celery-beat:
command: celery -A tasks beat --loglevel=info
depends_on:
- redis
- celery-worker

Scheduled Tasks:
- 📅 Daily Subdomain Scan - Runs at 3:40 PM IST
- 🎯 Domain: Configurable via AUTO_SCAN_DOMAIN in .env
- 📊 Auto-saves to the MongoDB collection scan_results_subfinder
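A Celery Beat schedule for the 3:40 PM IST job might look like the following sketch; the app and task names are assumptions based on the file names mentioned in this README:

```python
# Hypothetical excerpt of Backend/tasks.py -- names are assumptions
from celery import Celery
from celery.schedules import crontab

app = Celery("tasks", broker="redis://redis:6379/0")
app.conf.timezone = "Asia/Kolkata"  # matches TZ=Asia/Kolkata in .env
app.conf.beat_schedule = {
    "daily-subdomain-scan": {
        "task": "tasks.periodic_subdomain_scan",
        "schedule": crontab(hour=15, minute=40),  # 3:40 PM IST
    },
}
```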
New API Endpoints:
1. Trigger Background Scan
POST /api/trigger_background_scan
Authorization: Bearer <jwt_token>
Content-Type: application/json
{
"domain": "example.com" // Optional, uses AUTO_SCAN_DOMAIN if not provided
}
Response (202):
{
"status": "success",
"message": "Background scan queued for domain: example.com",
"task_id": "6356dd8d-ff14-4e79-bb2e-4091e1699638"
}

2. Check Scan Status
GET /api/background_scan_status/<task_id>
Authorization: Bearer <jwt_token>
Response:
{
"status": "completed",
"result": {
"status": "completed",
"domain": "ds.study.iitm.ac.in",
"subdomains_found": 1,
"subdomains_stored": 1,
"scan_id": "20260115151710_ce5b67d6-60ac-43ac-b2d2-5eb624ff2cf2"
}
}

Improved Logging:
[AUTO-SCAN] ✅ Starting scan session: 20260115151710_ce5b... for domain: example.com
[AUTO-SCAN] 📍 Using tools from: /app/tools
[AUTO-SCAN] 🔍 Running subfinder for domain: example.com
[AUTO-SCAN] ✅ Found 5 subdomains
[AUTO-SCAN] 🔄 Processing subdomain 1/5: api.example.com
[AUTO-SCAN] ✅ DNSX completed for api.example.com
[AUTO-SCAN] ✅ HTTPX completed for api.example.com
[AUTO-SCAN] 🔍 Running Nuclei scan for https://api.example.com
[AUTO-SCAN] ✅ Nuclei found 2 vulnerabilities
[AUTO-SCAN] ✅ Saved to MongoDB (1/5)
[AUTO-SCAN] 🎉 SCAN COMPLETED for example.com
[AUTO-SCAN] 📊 Results: 5/5 subdomains stored successfully
Files Modified:
- Backend/tasks.py - Celery task definitions
- Backend/utils/scanner_task_runner.py - Improved logging, fixed subprocess calls
- Backend/main.py - Added background scan API endpoints
- docker-compose.yml - Added redis, celery-worker, celery-beat services
- .env - Added AUTO_SCAN_DOMAIN, CELERY_BROKER_URL, CELERY_RESULT_BACKEND
Configuration:
# .env file
AUTO_SCAN_DOMAIN=ds.study.iitm.ac.in
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/0
TZ=Asia/Kolkata

Monitoring Commands:
# Check running containers
docker ps
# Watch Celery worker logs
docker logs celery-worker -f
# Watch Celery beat (scheduler) logs
docker logs celery-beat -f
# Test Redis connectivity
docker exec -it redis redis-cli ping
# Manually trigger a scan
docker exec celery-worker python -c "from tasks import periodic_subdomain_scan; periodic_subdomain_scan.delay('example.com')"

Benefits:
- ✅ Non-blocking scans - API returns immediately
- ✅ Scheduled periodic scans
- ✅ Task status tracking
- ✅ Scalable architecture
- ✅ Clear error messages with emojis
- ✅ Result persistence in Redis
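The trigger-then-poll pattern for these endpoints can be sketched as a small client helper; `fetch` here is an injected function (e.g. a requests wrapper that carries the JWT cookie), not part of the project:

```python
import time

def poll_scan_status(fetch, task_id: str, interval: float = 5.0, max_polls: int = 60) -> dict:
    """Poll /api/background_scan_status/<task_id> until the task reaches a terminal state.

    `fetch(path) -> dict` performs the authenticated GET and returns parsed JSON.
    """
    path = f"/api/background_scan_status/{task_id}"
    for _ in range(max_polls):
        status = fetch(path)
        if status.get("status") in ("completed", "failed"):
            return status
        time.sleep(interval)  # back off between polls
    raise TimeoutError(f"task {task_id} still pending after {max_polls} polls")
```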
- ✅ Subdomain Enumeration
  - Knockpy integration for comprehensive subdomain discovery
  - Subfinder + DNSx + HTTPx Pipeline - modern automated workflow
    - Subfinder: Multi-source passive subdomain discovery
    - DNSx: Fast DNS resolution and validation
    - HTTPx: HTTP/HTTPS probing with technology detection
    - Nuclei: Automated vulnerability scanning
  - Certificate transparency log analysis
  - DNS record enumeration
  - Real-time streaming results
- ✅ Network Analysis
  - Nmap port scanning with service detection
  - HTTPx for HTTP/HTTPS probing
  - DNSx for DNS enumeration
  - Banner grabbing and service fingerprinting
- ✅ Technology Detection
  - WhatWeb for CMS and technology identification
  - Wappalyzer integration
  - HTTP header analysis
  - SSL/TLS certificate inspection
- ✅ Infrastructure Mapping
  - IP geolocation and ASN tracking
  - DNS and mail server diagnostics via MXToolbox API
  - Certificate chain validation
  - WHOIS information gathering
- 🔒 OWASP ZAP Integration
  - Spider scan for comprehensive crawling
  - Active vulnerability scanning
  - Passive vulnerability detection
  - Custom scan policies
- ⚙️ Automated Scanning
  - Celery-based asynchronous task execution
  - Redis message broker for task queuing
  - Parallel scan execution
  - Scheduled periodic scans
- 🎯 Security Testing
  - Nikto web server scanner
  - Directory brute-forcing with FFuF
  - SSL/TLS security testing with testssl.sh
  - WordPress vulnerability scanning with WPScan
- 📄 Comprehensive PDF Reports
  - Executive summary with risk ratings
  - Detailed findings with CVE references
  - Remediation recommendations
  - Evidence screenshots and proof-of-concept
- 📈 Historical Analysis Dashboard
  - Subdomain growth tracking over time
  - Service and port change detection
  - Technology stack evolution
  - Certificate lifecycle monitoring
  - Vulnerability trend analysis
- 🔔 Email Notifications
  - Scan completion alerts
  - Critical vulnerability notifications
  - Account verification
  - Password reset functionality
- 🔐 Authentication System
  - JWT-based authentication
  - Email verification with OTP
  - Secure password hashing
  - Session management
- 📊 Multi-tenant Support
  - User-specific scan history
  - Role-based access control
  - Asset management per user
  - Private scan results
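The Subfinder → DNSx → HTTPx stages above pass hosts between tools via stdin/stdout. A generic chaining helper can sketch that flow; the `-silent` flags in the commented example are commonly used defaults for these tools, assumed here rather than taken from the repo:

```python
import subprocess

def chain(commands: list[list[str]], initial_input: str = "") -> str:
    """Pipe each command's stdout into the next, like `a | b | c` in a shell."""
    data = initial_input
    for cmd in commands:
        data = subprocess.run(cmd, input=data, capture_output=True,
                              text=True, check=True).stdout
    return data

# Hypothetical pipeline invocation:
# live_hosts = chain([["subfinder", "-silent", "-d", "example.com"],
#                     ["dnsx", "-silent"],
#                     ["httpx", "-silent"]])
```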
┌─────────────────────────────────────────────────────────────────┐
│ Frontend (React) │
│ ┌──────────────┐ ┌──────────────┐ ┌────────────────────┐ │
│ │ Dashboard │ │ Scan UI │ │ Report Viewer │ │
│ └──────────────┘ └──────────────┘ └────────────────────┘ │
└────────────────────────────┬────────────────────────────────────┘
│ REST API
┌────────────────────────────┴────────────────────────────────────┐
│ Backend (Flask) │
│ ┌──────────────┐ ┌──────────────┐ ┌────────────────────┐ │
│ │ API Routes │ │ Auth System │ │ PDF Generator │ │
│ └──────┬───────┘ └──────────────┘ └────────────────────┘ │
│ │ │
│ ┌──────┴───────────────────────────────────────────────────┐ │
│ │ Celery Task Queue │ │
│ │ ┌────────┐ ┌────────┐ ┌────────┐ ┌────────┐ │ │
│ │ │ Nmap │ │Knockpy │ │ ZAP │ │ FFuF │ ... │ │
│ │ └────────┘ └────────┘ └────────┘ └────────┘ │ │
│ └──────────────────────────────────────────────────────────┘ │
└────────────┬────────────────┬──────────────────┬────────────────┘
│ │ │
┌────────┴────────┐ ┌───┴─────┐ ┌────────┴─────────┐
│ MongoDB │ │ Redis │ │ External APIs │
│ (Database) │ │ (Broker)│ │ (Cohere, etc.) │
└─────────────────┘ └─────────┘ └──────────────────┘
The application now includes centralized configuration modules for better maintainability and security.
Frontend/
├── .env.example # Environment variable template
├── .env # Your local config (not in git)
├── src/
│ ├── config/
│ │ ├── api.ts # API endpoint configuration
│ │ └── mapbox.ts # Mapbox token configuration
│ ├── components/
│ └── ...
Centralizes all API endpoints and provides helper functions:
import { getApiUrl, API_CONFIG } from '@/config/api';
// Use centralized configuration
const response = await fetch(
getApiUrl(API_CONFIG.endpoints.scanSubdomain),
{ method: 'POST', body: JSON.stringify(data) }
);

Benefits:
- ✅ Single source of truth for API URLs
- ✅ Environment-based configuration (dev/staging/prod)
- ✅ Easy to update all endpoints at once
- ✅ Type-safe endpoint references
Manages Mapbox token and map settings:
import { MAPBOX_CONFIG } from '@/config/mapbox';
import mapboxgl from 'mapbox-gl';
// Use configured token
mapboxgl.accessToken = MAPBOX_CONFIG.accessToken;

Benefits:
- ✅ Secure token management via environment variables
- ✅ Centralized map configuration (style, zoom, defaults)
- ✅ Token validation helpers
- ✅ Easy to switch between development/production tokens
For developers updating existing code, see Frontend/MIGRATION_EXAMPLE.ts for conversion examples from hardcoded URLs to the new configuration system.
The AI-driven loop makes reconnaissance iterative and focused while maintaining safety controls:
graph LR
A[Initial Recon] --> B[LLM Analysis]
B --> C[Generate Commands]
C --> D[Sandboxed Execution]
D --> E[Results Analysis]
E --> B
- 📥 Initial Analysis: Feed reconnaissance output (subdomains, open ports, headers, certificates, tech stack) into an LLM
- 💡 Smart Suggestions: LLM suggests next-step investigative commands (targeted HTTP probes, specific ZAP scans, banner grabs)
- 🔐 Sandboxed Execution: Execute suggested commands in a controlled, rate-limited environment
- 🔄 Iterative Refinement: Return results to LLM for analysis and refined recommendations
- ♻️ Recursive Loop: Each iteration becomes smarter and more focused based on previous findings
🔒 Critical Safety Note: Run the AI loop only in a tightly controlled sandbox with strict network egress controls, resource limits, and explicit scope/rate restrictions to prevent accidental or abusive scanning. Logging, human approval gates, and kill-switches are strongly recommended.
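In code, the loop reduces to a simple shape; `suggest` (the LLM call) and `execute` (the sandbox runner) are injected placeholders in this sketch, and are also where scope checks, rate limits, and approval gates belong:

```python
def recon_loop(initial_findings: str, suggest, execute, max_iters: int = 3) -> list[str]:
    """Iterate: feed accumulated findings to the LLM, run its suggested commands,
    and append the results for the next round."""
    history = [initial_findings]
    for _ in range(max_iters):
        commands = suggest("\n".join(history))        # LLM proposes next probes
        if not commands:                              # nothing left to investigate
            break
        results = [execute(cmd) for cmd in commands]  # sandboxed, rate-limited
        history.extend(results)
    return history
```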
Before you begin, ensure you have the following installed on your system:
| Software | Minimum Version | Purpose | Installation Link |
|---|---|---|---|
| Python | 3.9+ | Backend runtime | Download Python |
| Node.js | 16+ | Frontend runtime | Download Node.js |
| npm | 8+ | Package manager | Comes with Node.js |
| MongoDB | 5.0+ | Database | Download MongoDB |
| Redis | 6.0+ | Task queue broker | Download Redis |
| Docker | 20+ | Container runtime (optional) | Download Docker |
| Git | 2.30+ | Version control | Download Git |
These tools should be installed and accessible in your system PATH or specified in TOOLS_DIR:
| Tool | Purpose | Installation |
|---|---|---|
| Nmap | Port scanning | sudo apt install nmap or Download |
| Subfinder | Subdomain enumeration | See Subfinder Setup below |
| HTTPx | HTTP probing | See Subfinder Setup below |
| DNSx | DNS enumeration | See Subfinder Setup below |
| Nuclei | Vulnerability scanner | See Subfinder Setup below |
| Knockpy | DNS reconnaissance | pip install knockpy |
| WhatWeb | Technology detection | sudo apt install whatweb |
| Nikto | Web vulnerability scanner | sudo apt install nikto |
| FFuF | Fuzzing tool | GitHub |
| testssl.sh | SSL/TLS testing | GitHub |
| WPScan | WordPress scanner | GitHub |
| OWASP ZAP | Vulnerability scanner | See ZAP Setup below |
You'll need API keys from the following services:
- MXToolbox API - DNS and mail server diagnostics
  - Sign up: https://mxtoolbox.com/api/
  - Free tier available
- DNSDumpster API - DNS reconnaissance
  - Sign up: https://dnsdumpster.com/
  - API documentation available on site
- Cohere API - AI-powered analysis
  - Sign up: https://cohere.ai/
  - Free trial available with generous limits
- SMTP Server - Email notifications
  - Gmail App Password (recommended)
  - Or any SMTP provider
This is the easiest way to get started with all services pre-configured.
git clone https://github.com/Omjee73/Vulnerability_Scanner.git
cd Vulnerability_Scanner

Create a .env file in the root directory:
# Copy the example file
cp .envexample .env
# Edit with your preferred editor
nano .env  # or notepad .env on Windows

Fill in your configuration (see the Configuration section for details).
# Build and start all services
docker-compose up --build
# Or run in detached mode
docker-compose up -d

This will start:
- Frontend on http://localhost:3000
- Backend API on http://localhost:5000
- MongoDB on localhost:27017
- Redis on localhost:6379
Open your browser and navigate to:
http://localhost:3000
Create an account and start scanning!
💡 Quick Tip: For Subfinder scanning, you'll need to install additional tools. See the Subfinder Pipeline Setup section for detailed instructions.
For development or customization, you can install components individually.
git clone https://github.com/Omjee73/Vulnerability_Scanner.git
cd Vulnerability_Scanner

# Navigate to Backend directory
cd Backend
# Create virtual environment
python -m venv venv
# Activate virtual environment
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
# Install Python dependencies
pip install --upgrade pip
pip install -r requirements.txt
# Create .env file
cp .envexample .env
# Edit .env with your configuration (see the Configuration section)

Open a new terminal:
# Navigate to Frontend directory
cd Frontend
# Install Node.js dependencies
npm install
# Create production build (optional)
npm run build
# Or start development server
npm run dev

# On Linux/macOS:
sudo systemctl start mongod
# On Windows (if installed as service):
net start MongoDB
# Or using Docker:
docker run -d -p 27017:27017 --name mongodb mongo:latest

# On Linux:
sudo systemctl start redis
# On macOS:
brew services start redis
# On Windows or using Docker:
docker run -d -p 6379:6379 --name redis redis:latest

In your backend terminal (with the virtual environment activated):
# Start Flask application
python main.py

The backend API will be available at http://localhost:5000
Open another terminal in the Backend directory:
# Activate virtual environment
source venv/bin/activate # or venv\Scripts\activate on Windows
# Start Celery worker for task processing
celery -A celery_worker.celery_app worker --loglevel=info --pool=solo

# Optional: Start Celery Beat for scheduled tasks (in another terminal)
celery -A celery_worker.celery_app beat --loglevel=info

Note for Windows Users: Use the --pool=solo flag with the Celery worker on Windows, since Windows doesn't support the default prefork pool.
Open your browser and navigate to:
http://localhost:3000
The application now includes a comprehensive Asset Management system for organizing and tracking your security testing targets.
- ✅ Create, Read, Update, Delete (CRUD) operations
- ✅ Organized Storage - All assets linked to your user account
- ✅ Quick Access - Click any asset to view full details
- ✅ API Key Management - Store Shodan and FOFA keys per asset
- ✅ Multi-Target Support - Multiple domains, IPs, and endpoints per asset
Each asset can contain:
| Field | Type | Required | Description |
|---|---|---|---|
| Company Name | String | ✅ Yes | Organization or project name |
| Domains | Array | ❌ Optional | List of domain names (e.g., example.com) |
| IP Addresses | Array | ❌ Optional | List of IP addresses to scan |
| Endpoints | Array | ❌ Optional | Specific URLs or API endpoints |
| Shodan Key | String | ❌ Optional | Your Shodan API key for this asset |
| FOFA Key | String | ❌ Optional | Your FOFA API key for this asset |
From the dashboard, click "Assets" in the navigation menu or visit:
http://localhost:3000/assets
Click the "Add New Asset" button and fill in the form:
Company Name: Acme Corporation [Required]
Domains: acme.com, www.acme.com [Optional - comma-separated]
IP Addresses: 192.168.1.1, 10.0.0.1 [Optional - comma-separated]
Endpoints: https://api.acme.com/v1 [Optional - comma-separated]
Shodan API Key: ********************** [Optional]
FOFA API Key: ************************ [Optional]
Click "Save Asset" to store.
Click on any asset card in the list to open a modal with full details:
- All domains, IPs, and endpoints displayed
- API keys shown (with masking for security)
- Creation and update timestamps
- Quick copy buttons for values
Click the trash icon (🗑️) on an asset card and confirm deletion.
┌─────────────────────────────────────────────────────────────┐
│ Frontend UI │
│ ┌────────────┐ ┌────────────┐ ┌────────────┐ │
│ │ Asset Form │ │ Asset List │ │ Asset Modal│ │
│ └─────┬──────┘ └─────┬──────┘ └─────┬──────┘ │
└────────┼───────────────┼───────────────┼──────────────────┘
│ │ │
▼ ▼ ▼
POST /api/assets GET /api/assets View Details
│ │
▼ ▼
┌─────────────────────────────────────────────────────────────┐
│ Backend API (Flask) │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ @jwt_required() - Validates HTTP-only cookie │ │
│ └──────────────────────────────────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Asset Operations: │ │
│ │ - Create: user_id + asset data → MongoDB │ │
│ │ - Read: filter by user_id │ │
│ │ - Delete: verify user_id matches before deletion │ │
│ └──────────────────────────────────────────────────────┘ │
└───────────────────────────┬─────────────────────────────────┘
│
▼
┌─────────────────┐
│ MongoDB │
│ "assets" coll │
└─────────────────┘
Create Asset:
POST /api/assets
Content-Type: application/json
Cookie: access_token_cookie=<jwt>
{
"companyName": "Example Corp",
"domains": ["example.com", "www.example.com"],
"ipAddresses": ["192.168.1.1"],
"endpoints": ["https://api.example.com"],
"shodanKey": "optional-shodan-key",
"fofaKey": "optional-fofa-key"
}

Get All User Assets:
GET /api/assets
Cookie: access_token_cookie=<jwt>
Response: [
{
"_id": "65abc123...",
"user_id": "user123",
"companyName": "Example Corp",
"domains": ["example.com"],
...
}
]

Delete Asset:
DELETE /api/assets/<asset_id>
Cookie: access_token_cookie=<jwt>

Main Component: Frontend/src/components/asset/AssetForm.tsx
Key Features:
- Cookie-based authentication with credentials: 'include'
- Real-time validation and error handling
- Toast notifications for user feedback
- Automatic asset list refresh after operations
- Modal view for detailed asset information
- Dark theme matching application style (#1a1f3a, #252b48)
File: Backend/main.py
Create Asset (Lines 489-540):
@app.route('/api/assets', methods=['POST'])
@jwt_required()
def create_asset():
current_user = get_jwt_identity()
data = request.get_json()
asset = {
"user_id": current_user,
"companyName": data.get('companyName'),
"domains": data.get('domains', []),
"ipAddresses": data.get('ipAddresses', []),
"endpoints": data.get('endpoints', []),
"shodanKey": data.get('shodanKey'),
"fofaKey": data.get('fofaKey'),
"created_at": datetime.utcnow(),
"updated_at": datetime.utcnow()
}
result = collection_assets.insert_one(asset)
return jsonify({"message": "Asset created", "id": str(result.inserted_id)})

Get Assets (Lines 543-558):
@app.route('/api/assets', methods=['GET'])
@jwt_required()
def get_assets():
current_user = get_jwt_identity()
assets = list(collection_assets.find({"user_id": current_user}))
# Convert ObjectId to string
for asset in assets:
asset['_id'] = str(asset['_id'])
return jsonify(assets)

Delete Asset (Lines 559-580):
@app.route('/api/assets/<asset_id>', methods=['DELETE'])
@jwt_required()
def delete_asset(asset_id):
current_user = get_jwt_identity()
# Verify ownership before deletion
result = collection_assets.delete_one({
"_id": ObjectId(asset_id),
"user_id": current_user
})
if result.deleted_count == 0:
return jsonify({"error": "Asset not found"}), 404
return jsonify({"message": "Asset deleted successfully"})

- 🔐 JWT Authentication Required - All endpoints protected
- 🍪 HTTP-only Cookies - XSS-resistant token storage
- 👤 User Isolation - Users can only access their own assets
- ✅ Ownership Verification - DELETE operations verify user_id
- 🛡️ Input Validation - All data sanitized before storage
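A validation pass consistent with the asset schema above might look like the following sketch; the helper name and exact rules are assumptions, not the repo's code:

```python
def validate_asset(data: dict) -> tuple[dict, list[str]]:
    """Require companyName, coerce list fields, and drop unknown keys."""
    errors = []
    name = (data.get("companyName") or "").strip()
    if not name:
        errors.append("companyName is required")

    def as_list(value):
        # Only accept real lists; anything else becomes an empty list
        return [str(v).strip() for v in value] if isinstance(value, list) else []

    clean = {
        "companyName": name,
        "domains": as_list(data.get("domains")),
        "ipAddresses": as_list(data.get("ipAddresses")),
        "endpoints": as_list(data.get("endpoints")),
        "shodanKey": data.get("shodanKey"),
        "fofaKey": data.get("fofaKey"),
    }
    return clean, errors
```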
The application uses JWT (JSON Web Tokens) stored in HTTP-only cookies for secure authentication.
┌──────────────┐
│ Register │
│ /signup │
└──────┬───────┘
│
▼
┌──────────────────────────┐
│ Email Verification OTP │
│ /verify-email │
└──────┬───────────────────┘
│
▼
┌──────────────┐
│ Login │
│ /login │
└──────┬───────┘
│
▼
┌──────────────────────────────────────┐
│ JWT Token Set in HTTP-only Cookie │
│ Set-Cookie: access_token_cookie=... │
└──────┬───────────────────────────────┘
│
▼
┌──────────────────────────────────────┐
│ All Requests Include Cookie │
│ fetch(url, {credentials: 'include'}) │
└───────────────────────────────────────┘
| Method | Security | Auto-Management | XSS Protection |
|---|---|---|---|
| localStorage | ❌ Low | ❌ Manual | ❌ Vulnerable |
| HTTP-only Cookie | ✅ High | ✅ Automatic | ✅ Protected |
Benefits:
- Cannot be accessed by JavaScript (XSS protection)
- Automatically sent with every request
- Managed by browser
- More secure than localStorage
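On the Flask side, cookie-based JWTs hinge on a few Flask-JWT-Extended settings. A minimal configuration sketch (values illustrative, not copied from the repo):

```python
from flask import Flask
from flask_jwt_extended import JWTManager

app = Flask(__name__)
app.config["JWT_TOKEN_LOCATION"] = ["cookies"]  # read tokens from cookies, not headers
app.config["JWT_COOKIE_SECURE"] = False         # set True in production (HTTPS only)
app.config["JWT_COOKIE_CSRF_PROTECT"] = True    # double-submit CSRF protection
app.config["JWT_SECRET_KEY"] = "change-me"      # from JWT_SECRET_KEY in .env
jwt = JWTManager(app)
```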
All API calls must include credentials: 'include':
const response = await fetch(`${API_CONFIG.baseURL}/api/assets`, {
method: 'GET',
headers: {
'Content-Type': 'application/json',
},
credentials: 'include' // Critical for cookie authentication
});

Protected routes use the @jwt_required() decorator:
from flask_jwt_extended import jwt_required, get_jwt_identity
@app.route('/api/protected', methods=['GET'])
@jwt_required()
def protected_route():
current_user = get_jwt_identity()
return jsonify({"user": current_user})

Purpose: DNS and mail server diagnostics, checking email security configurations
Steps to obtain:
- Visit https://mxtoolbox.com/api/
- Click "Sign Up" or "Get API Key"
- Choose a plan (Free tier available with 100 requests/month)
- Complete registration
- Navigate to API dashboard to get your API key
- Copy the API key to your .env file

MXTOOLBOX_API_KEY=your_mxtoolbox_api_key_here

Purpose: Comprehensive DNS reconnaissance and subdomain discovery
Steps to obtain:
- Visit https://dnsdumpster.com/
- Create an account or sign in
- Navigate to API settings
- Generate a new API key
- Copy it to your .env file

DNSDUMPSTER_API_KEY=your_dnsdumpster_api_key_here

Purpose: AI-powered vulnerability analysis, CVE lookup, and intelligent threat assessment
Steps to obtain:
- Visit https://cohere.ai/
- Click "Get Started" or "Sign Up"
- Complete registration (can use Google/GitHub)
- Navigate to Dashboard → API Keys
- Click "Create API Key"
- Name your key (e.g., "Vulnerability Scanner")
- Copy the API key immediately (it won't be shown again)
- Add it to your .env file

COHERE_API_KEY=your_cohere_api_key_here

Free Tier Details:
- 100 API calls per minute
- 1000 API calls per month
- Perfect for testing and small-scale scans
Purpose: Send verification emails, scan completion notifications, and alerts
1. Enable 2-Factor Authentication on your Google account
2. Generate an App Password:
   - Go to https://myaccount.google.com/security
   - Click "2-Step Verification"
   - Scroll down to "App passwords"
   - Select "Mail" and your device
   - Click "Generate"
   - Copy the 16-character password
3. Add the credentials to .env:
MAIL_SERVER=smtp.gmail.com
MAIL_PORT=587
MAIL_USE_TLS=True
MAIL_USERNAME=your_email@gmail.com
MAIL_PASSWORD=your_16_character_app_password

- Outlook/Hotmail: smtp-mail.outlook.com:587
- Yahoo: smtp.mail.yahoo.com:587
- SendGrid: smtp.sendgrid.net:587
- Mailgun: smtp.mailgun.org:587
Purpose: Enable programmatic control of ZAP vulnerability scanner
Steps to configure:
- Start OWASP ZAP application
- Go to Tools → Options → API
- Either:
- Check "Enable API" and note the auto-generated key
- Or set a custom API key
- Add to .env:
ZAP_ENABLED=true
ZAP_ADDRESS=localhost # or 'zap' for Docker
ZAP_PORT=8080
ZAP_API_KEY=your_zap_api_key_here

Note: To disable ZAP scanning, set ZAP_ENABLED=false in your .env file.
You need to configure environment variables in two locations:
Location: ./.env (root of repository)
# ==========================================
# ROOT ENVIRONMENT FOR DOCKER-COMPOSE
# ==========================================
# Flask-Mail Configuration
MAIL_SERVER=smtp.gmail.com
MAIL_PORT=587
MAIL_USE_TLS=True
MAIL_USERNAME=your_email@gmail.com
MAIL_PASSWORD=your_app_password_here
# JWT Secret Key (generate a strong random string)
JWT_SECRET_KEY=your_super_secret_jwt_key_here_change_this
# MongoDB Connection (for Docker use 'mongo' as hostname)
MONGO_URI=mongodb://mongo:27017/subdomain_scanner
# Tools Directory (absolute path inside container)
TOOLS_DIR=/app/tools
# API Keys
MXTOOLBOX_API_KEY=your_mxtoolbox_api_key
DNSDUMPSTER_API_KEY=your_dnsdumpster_api_key
COHERE_API_KEY=your_cohere_api_key
# OWASP ZAP Configuration (optional)
ZAP_ENABLED=true
ZAP_ADDRESS=zap # Docker service name
ZAP_PORT=8080
ZAP_API_KEY=your_zap_api_key

Location: Backend/.env
# ==========================================
# BACKEND LOCAL DEVELOPMENT ENVIRONMENT
# ==========================================
# Flask-Mail Configuration
MAIL_SERVER=smtp.gmail.com
MAIL_PORT=587
MAIL_USE_TLS=True
MAIL_USERNAME=your_email@gmail.com
MAIL_PASSWORD=your_app_password_here
# JWT Secret Key
JWT_SECRET_KEY=your_super_secret_jwt_key_here_change_this
# MongoDB Connection (local installation)
MONGO_URI=mongodb://localhost:27017/subdomain_scanner
# Tools Directory (absolute path on your system)
TOOLS_DIR=C:/path/to/your/tools # Windows
# TOOLS_DIR=/usr/local/bin # Linux/macOS
# API Keys
MXTOOLBOX_API_KEY=your_mxtoolbox_api_key
DNSDUMPSTER_API_KEY=your_dnsdumpster_api_key
COHERE_API_KEY=your_cohere_api_key
# OWASP ZAP Configuration
ZAP_ENABLED=true
ZAP_ADDRESS=localhost
ZAP_PORT=8080
ZAP_API_KEY=your_zap_api_key

Location: Frontend/.env
The frontend now supports environment-based configuration for better security and flexibility.
# ==========================================
# FRONTEND ENVIRONMENT CONFIGURATION
# ==========================================
# Backend API URL
# For development: http://localhost:5000
# For production: https://your-production-api.com
VITE_API_URL=http://localhost:5000
# Mapbox Access Token (for IP geolocation maps)
# Get your token from: https://account.mapbox.com/access-tokens/
VITE_MAPBOX_TOKEN=your_mapbox_public_token_here

Setup Instructions:
- Copy the example file:
  cd Frontend
  cp .env.example .env
- Get a Mapbox token (required for map features):
  - Visit https://account.mapbox.com/
  - Sign up or log in (free tier available)
  - Navigate to "Access Tokens"
  - Create a new public token or use the default
  - Copy the token to VITE_MAPBOX_TOKEN in .env
- Configure the API URL:
  - For local development: keep http://localhost:5000
  - For production: update to your deployed backend URL
Note: After changing .env, restart the Vite dev server (npm run dev) for changes to take effect.
| Variable | Description | Example | Required |
|---|---|---|---|
| `MAIL_SERVER` | SMTP server hostname | `smtp.gmail.com` | Yes |
| `MAIL_PORT` | SMTP server port | `587` | Yes |
| `MAIL_USE_TLS` | Enable TLS encryption | `True` | Yes |
| `MAIL_USERNAME` | Email address for sending | `your_email@gmail.com` | Yes |
| `MAIL_PASSWORD` | Email password/app password | `abcd efgh ijkl mnop` | Yes |
| `JWT_SECRET_KEY` | Secret for JWT tokens | Random string (32+ chars) | Yes |
| `MONGO_URI` | MongoDB connection string | `mongodb://localhost:27017/db_name` | Yes |
| `TOOLS_DIR` | Path to security tools | `/usr/local/bin` or `C:/tools` | Yes |
| `MXTOOLBOX_API_KEY` | MXToolbox API key | uuid-format-key | Yes |
| `DNSDUMPSTER_API_KEY` | DNSDumpster API key | hex-string | Yes |
| `COHERE_API_KEY` | Cohere AI API key | alphanumeric-key | Yes |
| `ZAP_ENABLED` | Enable/disable ZAP scanning | `true` or `false` | No |
| `ZAP_ADDRESS` | ZAP proxy address | `localhost` or `zap` | If ZAP enabled |
| `ZAP_PORT` | ZAP proxy port | `8080` | If ZAP enabled |
| `ZAP_API_KEY` | ZAP API authentication key | random-string | If ZAP enabled |
| **Frontend Variables** | NEW ✨ | | |
| `VITE_API_URL` | Backend API endpoint URL | `http://localhost:5000` | Yes |
| `VITE_MAPBOX_TOKEN` | Mapbox public access token | `pk.eyJ1Ijo...` | Yes (for maps) |
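Since a single missing variable from the table above can fail a scan halfway through, it can help to validate the environment once at startup. The helper below is a hedged sketch (not part of the actual codebase); the variable names come from the table, and the ZAP variables are only required when `ZAP_ENABLED=true`:

```python
import os

# Required variables from the table above (ZAP_* are conditional).
REQUIRED_VARS = [
    "MAIL_SERVER", "MAIL_PORT", "MAIL_USE_TLS", "MAIL_USERNAME",
    "MAIL_PASSWORD", "JWT_SECRET_KEY", "MONGO_URI", "TOOLS_DIR",
    "MXTOOLBOX_API_KEY", "DNSDUMPSTER_API_KEY", "COHERE_API_KEY",
]
ZAP_VARS = ["ZAP_ADDRESS", "ZAP_PORT", "ZAP_API_KEY"]

def missing_vars(env: dict) -> list:
    """Return the names of required variables absent from env."""
    needed = list(REQUIRED_VARS)
    if env.get("ZAP_ENABLED", "false").lower() == "true":
        needed += ZAP_VARS  # ZAP vars only matter when ZAP is enabled
    return [v for v in needed if not env.get(v)]

if __name__ == "__main__":
    gaps = missing_vars(dict(os.environ))
    if gaps:
        raise SystemExit(f"Missing required settings: {', '.join(gaps)}")
```

Calling this early in the backend entry point turns a mid-scan crash into a clear startup error.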
Use one of these methods to generate a strong JWT secret:
# Python
python -c "import secrets; print(secrets.token_urlsafe(32))"
# OpenSSL
openssl rand -base64 32
# Node.js
node -e "console.log(require('crypto').randomBytes(32).toString('base64'))"

- Never commit .env files - They're already in .gitignore
- Use strong JWT secrets - Minimum 32 characters, random
- Use app passwords - For Gmail, generate app-specific passwords
- Rotate keys regularly - Change API keys and secrets periodically
- Limit API key permissions - Use least-privilege principle
- Use environment-specific configs - Different keys for dev/prod
The Subfinder pipeline is a modern, automated subdomain reconnaissance workflow that combines multiple ProjectDiscovery tools for comprehensive results. This section provides detailed setup instructions for first-time users.
The Subfinder pipeline is an integrated workflow that performs:
- Subfinder - Passive subdomain discovery from 50+ sources
- DNSx - Fast DNS resolution and validation
- HTTPx - HTTP/HTTPS probing and technology detection
- Nuclei - Automated vulnerability scanning (optional)
Workflow:
Domain → Subfinder → DNSx → HTTPx → Nuclei → MongoDB
(Discover) (Resolve) (Probe) (Scan) (Store)
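The workflow above can be sketched in Python using `subprocess`. This is a hedged illustration, not the project's actual orchestration code (that lives in `Backend/utils/subfinder_runner.py`), and the Nuclei and MongoDB stages are omitted for brevity; tool names and flags match the commands used later in this guide:

```python
import shutil
import subprocess

def build_pipeline_cmds(domain):
    """argv lists for each stage, mirroring the diagram above."""
    return [
        ["subfinder", "-d", domain, "-silent"],  # discover
        ["dnsx", "-silent", "-a"],               # resolve
        ["httpx", "-silent", "-title"],          # probe
    ]

def run_pipeline(domain):
    """Pipe each stage's stdout into the next, like the shell one-liner."""
    data = None
    for cmd in build_pipeline_cmds(domain):
        if shutil.which(cmd[0]) is None:
            raise FileNotFoundError(f"{cmd[0]} not found in PATH")
        proc = subprocess.run(cmd, input=data, capture_output=True, check=True)
        data = proc.stdout
    return data.decode()

if __name__ == "__main__":
    print(run_pipeline("example.com"))
```

Running it requires the tools installed as described below; the `shutil.which` check fails fast with a clear error when a stage's binary is missing.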
Before installing the tools, ensure you have:
- Go 1.19+ installed (Download Go)
- Git for cloning repositories
- Linux/macOS/WSL (recommended) or Windows with PowerShell
- Administrator/sudo access for system-wide installation
# Download and install Go
wget https://go.dev/dl/go1.21.5.linux-amd64.tar.gz
sudo tar -C /usr/local -xzf go1.21.5.linux-amd64.tar.gz
# Add to PATH (add to ~/.bashrc or ~/.zshrc)
export PATH=$PATH:/usr/local/go/bin
export GOPATH=$HOME/go
export PATH=$PATH:$GOPATH/bin
# Reload shell
source ~/.bashrc # or source ~/.zshrc

macOS:
# Using Homebrew
brew install go
# Or download from https://golang.org/dl/

Windows:
# Download installer from https://golang.org/dl/
# Run the .msi installer
# Go will be automatically added to PATH

Verify Go installation:
go version
# Should output: go version go1.21.5 linux/amd64 (or similar)

🚀 Quick Installation for Docker/Linux Users: We provide an automated script that downloads all required tools! Skip to Method 3: Automated Script if you're using Docker or Linux.

Method 1: Install via Go (Recommended for Local Development)
This method installs the latest versions and is best for local development.
Subfinder:
# Install latest version
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
# Verify installation
subfinder -version

DNSx:
# Install latest version
go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest
# Verify installation
dnsx -version

HTTPx:
# Install latest version
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
# Verify installation
httpx -version

Nuclei:
# Install latest version
go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest
# Verify installation
nuclei -version
# Update nuclei templates (important!)
nuclei -update-templates

Method 2: Manual Binary Download
Download pre-compiled binaries manually if you don't have Go installed.

Install Subfinder
Method 1: Using Go (Recommended)
# Install latest version
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
# Verify installation
subfinder -version

Method 2: Download Binary
# Linux
wget https://github.com/projectdiscovery/subfinder/releases/latest/download/subfinder_2.6.3_linux_amd64.zip
unzip subfinder_2.6.3_linux_amd64.zip
sudo mv subfinder /usr/local/bin/
sudo chmod +x /usr/local/bin/subfinder
# macOS
wget https://github.com/projectdiscovery/subfinder/releases/latest/download/subfinder_2.6.3_macOS_amd64.zip
unzip subfinder_2.6.3_macOS_amd64.zip
sudo mv subfinder /usr/local/bin/
sudo chmod +x /usr/local/bin/subfinder

Configure Subfinder API Keys (Optional but Recommended):
Subfinder works better with API keys for passive sources:
# Create config directory
mkdir -p ~/.config/subfinder
# Create provider-config.yaml
nano ~/.config/subfinder/provider-config.yaml

Add API keys (get free keys from the respective providers):
binaryedge:
- your_binaryedge_api_key_here
censys:
- your_censys_api_id:your_censys_api_secret
shodan:
- your_shodan_api_key_here
github:
- ghp_your_github_personal_access_token_here
virustotal:
  - your_virustotal_api_key_here

Test Subfinder:
subfinder -d example.com -silent
# Should return subdomains of example.com

Install DNSx

Method 1: Using Go (Recommended)
# Install latest version
go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest
# Verify installation
dnsx -version

Method 2: Download Binary
# Linux
wget https://github.com/projectdiscovery/dnsx/releases/latest/download/dnsx_1.1.6_linux_amd64.zip
unzip dnsx_1.1.6_linux_amd64.zip
sudo mv dnsx /usr/local/bin/
sudo chmod +x /usr/local/bin/dnsx
# macOS
wget https://github.com/projectdiscovery/dnsx/releases/latest/download/dnsx_1.1.6_macOS_amd64.zip
unzip dnsx_1.1.6_macOS_amd64.zip
sudo mv dnsx /usr/local/bin/
sudo chmod +x /usr/local/bin/dnsx

Test DNSx:
echo "example.com" | dnsx -silent
# Should resolve example.com and show IP addresses

Install HTTPx
# Linux
wget https://github.com/projectdiscovery/httpx/releases/latest/download/httpx_1.3.7_linux_amd64.zip
unzip httpx_1.3.7_linux_amd64.zip
sudo mv httpx /usr/local/bin/
sudo chmod +x /usr/local/bin/httpx
# macOS
wget https://github.com/projectdiscovery/httpx/releases/latest/download/httpx_1.3.7_macOS_amd64.zip
unzip httpx_1.3.7_macOS_amd64.zip
sudo mv httpx /usr/local/bin/
sudo chmod +x /usr/local/bin/httpx

Test HTTPx:
echo "https://example.com" | httpx -silent
# Should probe example.com and return HTTP details

Install Nuclei
# Linux
wget https://github.com/projectdiscovery/nuclei/releases/latest/download/nuclei_3.1.5_linux_amd64.zip
unzip nuclei_3.1.5_linux_amd64.zip
sudo mv nuclei /usr/local/bin/
sudo chmod +x /usr/local/bin/nuclei
# Update templates
nuclei -update-templates
# macOS
wget https://github.com/projectdiscovery/nuclei/releases/latest/download/nuclei_3.1.5_macOS_amd64.zip
unzip nuclei_3.1.5_macOS_amd64.zip
sudo mv nuclei /usr/local/bin/
sudo chmod +x /usr/local/bin/nuclei
nuclei -update-templates

Test Nuclei:
echo "https://example.com" | nuclei -silent
# Should scan example.com with nuclei templates

🎯 Best for: Docker deployments, Linux servers, first-time setup
We provide a bash script that automatically downloads all ProjectDiscovery tools (DNSx, HTTPx, Nuclei) as Linux binaries. This is the fastest and easiest method!
The script is automatically included in your Docker container. Tools are downloaded when you build the container.
No action needed! Just run:
docker-compose up --build

The Dockerfile will execute download_tools.sh during build, and all tools will be available at /app/tools/.
If you're not using Docker, you can run the script manually:
Step 1: Navigate to the Backend directory
cd Vulnerability_Scanner/Backend

Step 2: Make the script executable
chmod +x download_tools.sh

Step 3: Create the tools directory
mkdir -p tools

Step 4: Run the download script
./download_tools.sh

What the script does:
- ✅ Downloads DNSx (latest stable version)
- ✅ Downloads HTTPx (latest stable version)
- ✅ Downloads Nuclei (latest stable version)
- ✅ Extracts binaries to the Backend/tools/ directory
- ✅ Sets executable permissions automatically
- ✅ Verifies installation
Script Output:
Downloading Linux binaries for scanning tools...
Downloading dnsx...
✓ dnsx installed
Downloading httpx...
✓ httpx installed
Downloading nuclei...
✓ nuclei installed
All tools installed successfully!
-rwxr-xr-x 1 root root 12M Jan 12 14:30 dnsx
-rwxr-xr-x 1 root root 15M Jan 12 14:30 httpx
-rwxr-xr-x 1 root root 45M Jan 12 14:30 nuclei
Step 5: Verify tools are working
# Test DNSx
./tools/dnsx -version
# Test HTTPx
./tools/httpx -version
# Test Nuclei
./tools/nuclei -version
# Update Nuclei templates
./tools/nuclei -update-templates

Step 6: Add to Backend/.env
# Add this line to your .env file
echo 'TOOLS_DIR=/app/tools' >> .env

The script itself (download_tools.sh):
#!/bin/bash
# Download Linux binaries for ProjectDiscovery tools
echo "Downloading Linux binaries for scanning tools..."
cd /app/tools || exit 1
# Download dnsx
echo "Downloading dnsx..."
curl -L https://github.com/projectdiscovery/dnsx/releases/download/v1.2.3/dnsx_1.2.3_linux_amd64.zip -o dnsx.zip
unzip -o dnsx.zip
chmod +x dnsx
rm dnsx.zip
echo "✓ dnsx installed"
# Download httpx
echo "Downloading httpx..."
curl -L https://github.com/projectdiscovery/httpx/releases/download/v1.3.7/httpx_1.3.7_linux_amd64.zip -o httpx.zip
unzip -o httpx.zip
chmod +x httpx
rm httpx.zip
echo "✓ httpx installed"
# Download nuclei
echo "Downloading nuclei..."
curl -L https://github.com/projectdiscovery/nuclei/releases/download/v3.1.5/nuclei_3.1.5_linux_amd64.zip -o nuclei.zip
unzip -o nuclei.zip
chmod +x nuclei
rm nuclei.zip
echo "✓ nuclei installed"
echo ""
echo "All tools installed successfully!"
ls -lh /app/tools/

Problem: "Permission denied" when running the script
# Solution: Make it executable
chmod +x download_tools.sh
./download_tools.sh

Problem: "curl: command not found"
# Solution: Install curl
# Ubuntu/Debian:
sudo apt update && sudo apt install curl unzip
# CentOS/RHEL:
sudo yum install curl unzip

Problem: "tools directory not found"
# Solution: Create the directory
mkdir -p Backend/tools
cd Backend
./download_tools.sh

Problem: "Failed to download"
# Solution: Check internet connection and try again
# Or download manually:
cd Backend/tools
# Download DNSx manually
wget https://github.com/projectdiscovery/dnsx/releases/download/v1.2.3/dnsx_1.2.3_linux_amd64.zip
unzip dnsx_1.2.3_linux_amd64.zip
chmod +x dnsx
# Repeat for httpx and nuclei

To update to the latest versions, simply edit the script to change the version numbers and re-run:
- Open download_tools.sh
- Update the version numbers in the URLs (check GitHub releases)
- Run ./download_tools.sh
- Tools will be re-downloaded at the latest versions
| Method | Best For | Pros | Cons | Time |
|---|---|---|---|---|
| Method 1: Go Install | Local development, frequent updates | Latest versions, easy updates via `go install` | Requires Go, slower first install | ~5 min |
| Method 2: Manual Binary | Specific version control, no Go | No Go required, specific versions | Manual updates, platform-specific | ~10 min |
| Method 3: Automated Script | Docker, Linux servers, quick setup | Fastest, no Go needed, automated | Linux only, fixed versions in script | ~2 min |
Recommendation:
- 🐳 Docker users: Method 3 (automatic during build)
- 🐧 Linux/WSL users: Method 3 (run script manually)
- 💻 Local development: Method 1 (go install for easy updates)
- 🍎 macOS users: Method 1 or 2 (script is Linux-only)
Note: Subfinder still needs to be installed separately using Method 1 or 2 above, as it's not included in the automated script. Its optional API keys go in ~/.config/subfinder/provider-config.yaml, as described in the Subfinder installation section.
After using any of the above methods, you should have:
- ✅ Subfinder - For subdomain discovery
- ✅ DNSx - For DNS resolution
- ✅ HTTPx - For HTTP probing
- ✅ Nuclei - For vulnerability scanning
Quick verification:
# Check all tools are installed
which subfinder && which dnsx && which httpx && which nuclei
# Or if using script method:
ls Backend/tools/
# Should show: dnsx, httpx, nuclei
# Test each tool
subfinder -version
dnsx -version
httpx -version
nuclei -version

Run this comprehensive test to ensure everything works:
# Test Subfinder
echo "Testing Subfinder..."
subfinder -d example.com -silent | head -n 5
# Test DNSx
echo "Testing DNSx..."
echo "example.com" | dnsx -silent -a -resp
# Test HTTPx
echo "Testing HTTPx..."
echo "https://example.com" | httpx -silent -title -tech-detect
# Test Nuclei (if installed)
echo "Testing Nuclei..."
echo "https://example.com" | nuclei -silent -tags cve
# Test full pipeline
echo "Testing Full Pipeline..."
subfinder -d example.com -silent | dnsx -silent -a | httpx -silent -title

If all commands work without errors, your setup is complete!
The application expects tools to be in your PATH or in a specified TOOLS_DIR.
Option 1: System PATH (Recommended)
Tools are already in PATH if installed via go install:
which subfinder
which dnsx
which httpx
which nuclei

Option 2: Custom Tools Directory
If you prefer a custom directory:
# Create tools directory
mkdir -p ~/security-tools/bin
# Move or symlink tools
ln -s $(which subfinder) ~/security-tools/bin/
ln -s $(which dnsx) ~/security-tools/bin/
ln -s $(which httpx) ~/security-tools/bin/
ln -s $(which nuclei) ~/security-tools/bin/
# Add to .env file
echo 'TOOLS_DIR=/home/yourusername/security-tools/bin' >> Backend/.env
Ensure your Backend/.env file has the correct configuration:
# Tools Directory (if not in PATH)
TOOLS_DIR=/usr/local/bin
# Or for custom directory
# TOOLS_DIR=/home/yourusername/security-tools/bin
# Subfinder Config (optional - uncomment if you created provider-config.yaml)
# SUBFINDER_CONFIG=/home/yourusername/.config/subfinder/provider-config.yaml

To run your first scan from the UI:
- Navigate to the scanner: http://localhost:3000
- Login or create an account
- Select "Subfinder" from the tool dropdown menu
- Enter target domain: e.g., example.com
- Click "Start Scan"
- Watch real-time results stream in the terminal widget
- View results in the dashboard after scan completes
# Get your JWT token first by logging in
# Start Subfinder scan
curl -X GET "http://localhost:5000/rescan/stream_subfinder_dnsx_httpx?domain=example.com" \
-H "Cookie: access_token_cookie=YOUR_JWT_TOKEN" \
--no-buffer

# Full pipeline manually
subfinder -d example.com -silent | \
dnsx -silent -a -resp | \
httpx -silent -title -tech-detect -status-code | \
tee results.txt

Problem: "subfinder: command not found" after installation
Solution:
# Check if Go bin is in PATH
echo $PATH | grep go
# If not, add to ~/.bashrc or ~/.zshrc:
export PATH=$PATH:$HOME/go/bin
# Reload shell
source ~/.bashrc
# Or reinstall
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest

Problem: "Permission denied" when running the tools
Solution:
# Make tools executable
chmod +x $(which subfinder)
chmod +x $(which dnsx)
chmod +x $(which httpx)
chmod +x $(which nuclei)

Problem: Subfinder finds no (or very few) subdomains
Possible Causes & Solutions:
- No API keys configured: Add keys to ~/.config/subfinder/provider-config.yaml
- Internet connection issues: Check your network connection
- Domain has few subdomains: Some domains genuinely have very few subdomains
- Rate limiting: Try again after some time
Debug Mode:
# Run with verbose output to see what's happening
subfinder -d example.com -v

Problem: HTTPx probing is slow or times out
Solution:
# Increase timeout and retries
httpx -silent -timeout 10 -retry 2 -rate-limit 50
# Or edit Backend/utils/subfinder_runner.py to add these flags

Problem: Nuclei templates are outdated or missing
Solution:
# Update templates regularly
nuclei -update-templates
# Force update if above doesn't work
nuclei -update-templates -ut
# Check template directory
ls ~/.nuclei-templates/

Problem: The pipeline fails when launched from the application
Solution:
- Check tool installation:
  which subfinder && which dnsx && which httpx
- Check Backend logs:
  docker logs flask-backend # Or if running manually, check terminal output
- Verify .env configuration:
  cat Backend/.env | grep TOOLS_DIR
- Restart the backend:
  docker-compose restart backend # Or Ctrl+C and restart python main.py
Edit Backend/utils/subfinder_runner.py to customize subfinder behavior:
# Example customizations:
cmd = [
'subfinder',
'-d', domain,
'-all', # Use all sources (slower but more comprehensive)
'-recursive', # Find recursive subdomains
'-timeout', '30', # Increase timeout
'-o', output_file
]

# In Backend/utils/subfinder_runner.py
dnsx_cmd = [
'dnsx',
'-silent',
'-a', # Get A records
'-aaaa', # Get AAAA records
'-cname', # Get CNAME records
'-mx', # Get MX records
'-resp', # Show response
'-retry', '3',
'-rate-limit', '100' # Limit DNS queries per second
]

# In Backend/utils/subfinder_runner.py
httpx_cmd = [
'httpx',
'-silent',
'-title', # Get page title
'-tech-detect', # Detect technologies
'-status-code', # Show HTTP status
'-content-length', # Show content length
'-follow-redirects', # Follow redirects
'-timeout', '10',
'-threads', '50' # Parallel threads
]

# Use rate limiting to avoid getting blocked
subfinder -d example.com -rate-limit 50
# Use multiple threads for HTTPx
httpx -threads 50 -rate-limit 100
# Use custom resolvers for DNSx
dnsx -resolver 8.8.8.8,1.1.1.1,8.8.4.4 -rate-limit 100

# Disable some checks
httpx -silent -title -status-code
# (Skip tech detection for speed)
# Use fewer nuclei templates
nuclei -tags cve,exposure -exclude-severity low,info

Step 1: Subfinder (Discovery)
- Searches 50+ passive sources for subdomains
- Sources include:
- Certificate Transparency logs (crt.sh, Censys)
- DNS aggregators (VirusTotal, SecurityTrails, AlienVault)
- Search engines (Google, Bing, Yahoo)
- APIs (Shodan, GitHub, etc.)
- Output: List of potential subdomains
Step 2: DNSx (Resolution)
- Validates subdomains by resolving DNS
- Filters out:
- Non-existent domains
- Dead/inactive domains
- Wildcard responses
- Output: Live subdomains with IP addresses
Step 3: HTTPx (Probing)
- Probes for web servers on HTTP/HTTPS
- Detects:
- Live web services
- HTTP vs HTTPS
- Technologies used (frameworks, CMS, libraries)
- Server headers
- Page titles
- Output: Web-enabled subdomains with metadata
Step 4: Nuclei (Vulnerability Scanning)
- Scans for vulnerabilities using 7000+ templates
- Checks for:
- Known CVEs
- Misconfigurations
- Exposed services (databases, admin panels)
- Missing security headers
- Output: Security findings with severity ratings
| Component | Config Location | Description |
|---|---|---|
| Subfinder | `~/.config/subfinder/provider-config.yaml` | API keys for data sources |
| Nuclei | `~/.nuclei-templates/` | Vulnerability templates |
| DNSx | No config needed | Uses system DNS by default |
| HTTPx | No config needed | Uses default options |
| Backend Code | `Backend/utils/subfinder_runner.py` | Pipeline orchestration |
| Backend Config | `Backend/.env` | Tool paths and settings |
┌─────────────────────────────────────────────────────────────────┐
│ User Input │
│ (example.com) │
└──────────────────────────┬──────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Step 1: Subfinder - Passive Subdomain Discovery │
│ • Queries 50+ sources (crt.sh, VirusTotal, Shodan, etc.) │
│ • Output: api.example.com, blog.example.com, dev.example.com │
└──────────────────────────┬──────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Step 2: DNSx - DNS Resolution & Validation │
│ • Resolves each subdomain │
│ • Filters out dead/invalid entries │
│ • Output: api.example.com → 192.168.1.10 │
│ blog.example.com → 192.168.1.20 │
└──────────────────────────┬──────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Step 3: HTTPx - HTTP/HTTPS Probing │
│ • Tests for web servers on each subdomain │
│ • Detects technologies (WordPress, React, Nginx, etc.) │
│ • Gets titles, status codes, headers │
│ • Output: api.example.com [200] [Nginx, Node.js] │
└──────────────────────────┬──────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Step 4: Nuclei - Vulnerability Scanning (Optional) │
│ • Runs 7000+ security templates │
│ • Checks for CVEs, misconfigurations, exposures │
│ • Output: [CVE-2023-1234] Detected on api.example.com │
└──────────────────────────┬──────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Step 5: MongoDB Storage │
│ • Structured JSON storage │
│ • Searchable and filterable │
│ • Historical tracking │
│ • PDF report generation │
└─────────────────────────────────────────────────────────────────┘
Subfinder Output:
api.example.com
blog.example.com
dev.example.com
admin.example.com
mail.example.com
DNSx Output:
api.example.com [192.168.1.10]
blog.example.com [192.168.1.20]
dev.example.com [192.168.1.30]
HTTPx Output:
https://api.example.com [200] [API Gateway] [Nginx, Node.js]
https://blog.example.com [200] [Example Blog] [WordPress, PHP]
https://dev.example.com [401] [Dev Server] [React, Nginx]
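An HTTPx line like those above has to be parsed into a structured record before storage. The parser below is a hedged sketch (the real one lives in the backend); the field names mirror the stored document shown in this section:

```python
import re
from datetime import datetime, timezone

# Matches lines like: https://api.example.com [200] [API Gateway] [Nginx, Node.js]
HTTPX_LINE = re.compile(
    r"^(?P<url>\S+)\s+\[(?P<status>\d+)\]\s+\[(?P<title>[^\]]*)\]\s+\[(?P<tech>[^\]]*)\]$"
)

def httpx_line_to_doc(line: str) -> dict:
    """Convert one HTTPx output line into a MongoDB-style document."""
    m = HTTPX_LINE.match(line.strip())
    if m is None:
        raise ValueError(f"unrecognized httpx line: {line!r}")
    url = m.group("url")
    return {
        "subdomain": url.split("://", 1)[-1],
        "http_status": int(m.group("status")),
        "http_title": m.group("title"),
        "technologies": [t.strip() for t in m.group("tech").split(",")],
        "ssl": url.startswith("https://"),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```

In practice such documents would be inserted with `pymongo`'s `insert_one`; the sketch stops at the parsing step.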
Final MongoDB Document:
{
"subdomain": "api.example.com",
"ip": "192.168.1.10",
"http_status": 200,
"http_title": "API Gateway",
"technologies": ["Nginx", "Node.js", "Express"],
"ssl": true,
"ssl_issuer": "Let's Encrypt",
"ports": [80, 443],
"timestamp": "2026-01-12T19:54:00Z",
"nuclei_findings": [
{
"template": "CVE-2023-1234",
"severity": "high",
"description": "Example vulnerability found"
}
]
}

OWASP ZAP (Zed Attack Proxy) is a powerful vulnerability scanner integrated into this platform.
Add ZAP service to your docker-compose.yml:
services:
  zap:
    image: owasp/zap2docker-stable
    command: zap.sh -daemon -host 0.0.0.0 -port 8080 -config api.key=your_api_key
    ports:
      - "8080:8080"
    networks:
      - scanner-network

Then:
docker-compose up zap -d

Windows:
- Download installer from https://www.zaproxy.org/download/
- Run the installer
- Add ZAP to system PATH
- Launch ZAP:
zap.bat -daemon -host localhost -port 8080
Linux:
# Using apt (Debian/Ubuntu)
sudo apt update
sudo apt install zaproxy
# Using snap
sudo snap install zaproxy --classic
# From source
wget https://github.com/zaproxy/zaproxy/releases/download/v2.14.0/ZAP_2.14.0_Linux.tar.gz
tar -xvf ZAP_2.14.0_Linux.tar.gz
cd ZAP_2.14.0
./zap.sh -daemon -host localhost -port 8080

macOS:
# Using Homebrew
brew install --cask owasp-zap
# Or download from website
# https://www.zaproxy.org/download/

For automated scanning, start ZAP in headless daemon mode:
# Windows
zap.bat -daemon -host localhost -port 8080 -config api.key=your_api_key
# Linux/macOS
zap.sh -daemon -host localhost -port 8080 -config api.key=your_api_key

- Launch the ZAP GUI (for initial setup):
  zap.sh # or zap.bat on Windows
- Enable the API:
  - Go to Tools → Options → API
  - Check "Enable API"
  - Set an API key or use the auto-generated one
  - Note: Save the key immediately!
- Configure network settings:
  - Go to Tools → Options → Local Proxies
  - Ensure Address is localhost (or 0.0.0.0 for Docker)
  - Port: 8080 (default)
- Security settings:
  - Go to Tools → Options → API
  - Uncheck "Disable the API key" (keep the key enabled for security)
  - Add permitted addresses if needed
The scanner uses custom policies for different scan types:
- Spider Scan: Crawls target to map all pages
- Active Scan: Tests for vulnerabilities (can be intrusive)
- Passive Scan: Analyzes traffic without active probing
Configure scan intensity in ZAP:
Tools → Options → Active Scan → Policy
Verify ZAP is running and accessible:
# Using curl
curl "http://localhost:8080/JSON/core/view/version/?apikey=your_api_key"
# Using Python
python -c "import requests; print(requests.get('http://localhost:8080/JSON/core/view/version/?apikey=your_api_key').json())"

Expected response:
{
"version": "2.14.0"
}Problem: ZAP not starting
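The same health check can be wrapped in a small Python helper. This is a hedged sketch, not part of the codebase; it only builds and calls the endpoint used in the curl example above:

```python
def zap_version_url(address: str, port: int, api_key: str) -> str:
    """Build the same core version endpoint the curl check above uses."""
    return f"http://{address}:{port}/JSON/core/view/version/?apikey={api_key}"

def check_zap(address: str = "localhost", port: int = 8080,
              api_key: str = "your_api_key") -> str:
    """Return the running ZAP version, raising if ZAP is unreachable."""
    import requests  # imported lazily so the URL helper has no dependency
    resp = requests.get(zap_version_url(address, port, api_key), timeout=5)
    resp.raise_for_status()
    return resp.json()["version"]
```

A backend could call `check_zap()` at startup and fall back to `ZAP_ENABLED=false` behavior when it raises.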
- Solution: Check if port 8080 is already in use
  # Windows
  netstat -ano | findstr :8080
  # Linux/macOS
  lsof -i :8080
Problem: API key errors
- Solution: Ensure the API key in .env matches the ZAP configuration
- Verify the ZAP API is enabled in settings
Problem: Connection refused
- Solution: Check ZAP is running in daemon mode
- Verify firewall isn't blocking port 8080
Problem: Scans taking too long
- Solution: Adjust scan policy to "Low" intensity
- Reduce thread count in Active Scan settings
- Use smaller scope for testing
If you don't want to use ZAP scanning:
# In your .env file
ZAP_ENABLED=false

The scanner will skip all ZAP-related vulnerability assessments.
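How such a flag is typically honored can be sketched as follows (a hypothetical helper, not the project's actual code; only the `ZAP_ENABLED` variable name comes from this guide):

```python
import os

TRUTHY = {"1", "true", "yes", "on"}

def zap_enabled(env=os.environ) -> bool:
    """Treat ZAP_ENABLED as a boolean flag; anything else disables ZAP."""
    return env.get("ZAP_ENABLED", "false").strip().lower() in TRUTHY

def run_zap_stage(target: str, env=os.environ) -> str:
    """Skip the ZAP stage entirely when the flag is off."""
    if not zap_enabled(env):
        return f"ZAP disabled; skipping scan of {target}"
    return f"would start ZAP scan of {target}"  # real ZAP call would go here
```

Defaulting to "disabled" when the variable is unset keeps a misconfigured deployment from accidentally running intrusive scans.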
# Start all services
docker-compose up -d
# View logs
docker-compose logs -f
# Stop all services
docker-compose down
# Rebuild after code changes
docker-compose up --build

You need 5 terminal windows:
Terminal 1 - MongoDB:
mongod --dbpath=/path/to/data/db
# Or if running as a service: sudo systemctl start mongod

Terminal 2 - Redis:
redis-server
# Or if running as a service: sudo systemctl start redis

Terminal 3 - Backend:
cd Backend
source venv/bin/activate # or venv\Scripts\activate on Windows
python main.py

Terminal 4 - Celery Worker:
cd Backend
source venv/bin/activate
celery -A celery_worker.celery_app worker --loglevel=info --pool=solo

Terminal 5 - Frontend:
cd Frontend
npm run dev

Optional - Celery Beat (for scheduled scans):
cd Backend
source venv/bin/activate
celery -A celery_worker.celery_app beat --loglevel=info

Optional - OWASP ZAP:
zap.sh -daemon -host localhost -port 8080 -config api.key=your_api_key

Once everything is running, the services are available at:
- Frontend: http://localhost:3000
- Backend API: http://localhost:5000
- MongoDB: localhost:27017
- Redis: localhost:6379
- ZAP Proxy: localhost:8080
To successfully register on the platform, you need to provide:
Mandatory Fields:
- ✅ Email - Valid email address (used for login and OTP verification)
- ✅ Password - Secure password (minimum 8 characters recommended)
- ✅ Confirm Password - Must match the password field
Optional Fields (for advanced features - currently commented out in code):
- 🔑 Shodan API Key - For enhanced network scanning capabilities
- 🔑 FOFA API Key - For additional reconnaissance data
- 📁 Subdomains File - Custom subdomain wordlist (optional upload)
- 📁 Endpoints File - Known endpoints for fuzzing (optional upload)
- 📁 IPs File - Target IP addresses (optional upload)
- 📁 Naming Rules File - Custom naming conventions (optional upload)
Note: Currently, only email and password are required for registration. The optional API keys and file uploads are prepared for future features but not yet active in the registration flow.
Registration Flow:
- Navigate to http://localhost:3000/register
- Enter required fields:
- Valid email address
- Strong password
- Confirm password (must match)
- Click "Register" button
- System automatically generates:
  - `organization` field (extracted from the email domain, e.g., @gmail.com → gmail.com)
  - `name` field (extracted from the email prefix, e.g., user@domain.com → user)
- Email verification:
- Check your email for 6-digit OTP (One-Time Password)
- OTP expires in 10 minutes
- If SMTP not configured, auto-verification happens (check logs)
- Enter OTP on verification page
- Account activated! You can now login
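The organization/name derivation described in the flow above amounts to splitting the email at the `@` sign. A minimal sketch (hypothetical helper, not the actual backend code):

```python
def derive_profile(email: str) -> dict:
    """Split user@domain.com into name='user', organization='domain.com'."""
    local, _, domain = email.partition("@")
    if not local or not domain:
        raise ValueError(f"invalid email: {email!r}")
    return {"name": local, "organization": domain}
```

For example, `derive_profile("user@gmail.com")` yields name `user` and organization `gmail.com`, matching the behavior described above.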
Important Notes:
- ⚠️ Email must be unique - Cannot register with an already registered email
- 🔒 Password is hashed - Stored securely using bcrypt
- 📧 Email verification required - Account won't be fully active until OTP verification
- ⏰ OTP expires in 10 minutes - Request a new OTP if expired
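A 6-digit OTP with a 10-minute expiry, as described in the notes above, can be sketched like this (hypothetical helper, not the project's actual implementation):

```python
import secrets
from datetime import datetime, timedelta, timezone

OTP_TTL = timedelta(minutes=10)  # OTPs expire in 10 minutes, per the notes above

def generate_otp() -> dict:
    """Return a zero-padded 6-digit code and its expiry timestamp."""
    code = f"{secrets.randbelow(10**6):06d}"
    return {"code": code, "expires_at": datetime.now(timezone.utc) + OTP_TTL}

def otp_valid(otp: dict, submitted: str, now=None) -> bool:
    """Check the submitted code (constant-time) and that it has not expired."""
    now = now or datetime.now(timezone.utc)
    return secrets.compare_digest(otp["code"], submitted) and now <= otp["expires_at"]
```

Using `secrets` rather than `random` and comparing with `compare_digest` avoids predictable codes and timing side channels.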
Login:
- Go to http://localhost:3000/login
- Enter credentials
- Access dashboard
Password Reset:
- Click "Forgot Password" on login page
- Enter email
- Check email for reset link
- Set new password
Using Knockpy Scanner:
- Log in to dashboard
- Click "New Scan" → "Knockpy Scan"
- Enter target domain (e.g., example.com)
- Click "Start Scan"
- Monitor real-time progress
- View results when complete
Using Subfinder Scanner:
- Navigate to "Subfinder Scan"
- Enter target domain
- Configure scan options:
- Passive only (fast, safe)
- Include active enumeration (slower, more results)
- Enable certificate transparency lookup
- Start scan
- Download results as JSON/PDF
Scan Results Include:
- All discovered subdomains
- IP addresses and geolocation
- HTTP status codes
- Open ports and services
- Technology stack
- SSL/TLS certificates
- DNS records
- Security headers
- Vulnerability findings (if ZAP enabled)
Automated ZAP Scanning: When ZAP is enabled, each subdomain automatically goes through:
- Spider Scan - Crawls all pages and resources
- Passive Scan - Analyzes traffic for vulnerabilities
- Active Scan - Tests for security issues (optional)
Results categorized by:
- 🔴 Critical (CVSS 9.0-10.0)
- 🟠 High (CVSS 7.0-8.9)
- 🟡 Medium (CVSS 4.0-6.9)
- 🔵 Low (CVSS 0.1-3.9)
- ⚪ Informational
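The CVSS bands above translate directly into a bucketing function (a sketch using the boundaries as listed):

```python
def severity_bucket(cvss: float) -> str:
    """Map a CVSS score to the severity buckets listed above."""
    if cvss >= 9.0:
        return "critical"       # 🔴 CVSS 9.0-10.0
    if cvss >= 7.0:
        return "high"           # 🟠 CVSS 7.0-8.9
    if cvss >= 4.0:
        return "medium"         # 🟡 CVSS 4.0-6.9
    if cvss >= 0.1:
        return "low"            # 🔵 CVSS 0.1-3.9
    return "informational"      # ⚪ no score
```

Note the boundaries are inclusive at the bottom of each band, so a score of exactly 7.0 is "high".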
Cohere Integration:
- Automatic CVE analysis for discovered vulnerabilities
- Natural language explanations of security issues
- Remediation recommendations
- Risk assessment and prioritization
- Exploitability scoring
Usage:
- Complete a vulnerability scan
- Click on any vulnerability finding
- View AI-generated analysis
- Get remediation steps
- Export findings to PDF
Analytics Available:
Scan History:
- Total scans performed
- Success/failure rate
- Average scan duration
- Most scanned domains
Subdomain Trends:
- Subdomain count over time
- New vs. existing subdomains
- Subdomain growth rate
- Geographic distribution
Vulnerability Metrics:
- Total vulnerabilities found
- Severity distribution
- Most common vulnerabilities
- Remediation progress tracking
Service Analysis:
- Open ports distribution
- Running services detected
- Technology stack analysis
- Certificate expiration tracking
Access Dashboard:
http://localhost:3000/dashboard/statistics
Generate Reports:
- Navigate to scan results
- Click "Download PDF Report"
- Report includes:
- Executive Summary
- Scan metadata (date, duration, target)
- Subdomain inventory
- Port and service details
- Technology fingerprints
- Vulnerability findings
- CVE details with CVSS scores
- Remediation recommendations
- Appendices and references
Report Customization:
- Include/exclude sections
- Filter by severity
- Add custom notes
- Branding options
Live Progress Tracking:
- WebSocket-based updates
- Progress percentage
- Current task being executed
- Subdomains discovered count
- Vulnerabilities found count
- Estimated time remaining
Monitor Active Scans:
http://localhost:3000/dashboard/active-scans
Organize Your Targets:
- Create asset groups
- Tag domains by category
- Set scan schedules
- Configure scan preferences per asset
- Track asset ownership
Asset Features:
- Bulk import/export
- Custom metadata
- Scan history per asset
- Notification preferences
- Access control
Configurable Alerts:
- Scan completion
- Critical vulnerabilities found
- Scan failures
- Certificate expiration warnings
- Scheduled scan reminders
Configure Notifications:
Dashboard → Settings → Notifications
RESTful API Endpoints:
```
# Authentication
POST /api/auth/register
POST /api/auth/login
POST /api/auth/verify-otp

# Scanning
POST /api/scan/knockpy
POST /api/scan/subfinder
GET  /api/scan/results/:scan_id

# Assets
GET    /api/assets
POST   /api/assets
PUT    /api/assets/:id
DELETE /api/assets/:id

# Statistics
GET /api/stats/overview
GET /api/stats/trends
GET /api/stats/vulnerabilities

# Reports
GET  /api/reports/:scan_id/pdf
POST /api/reports/generate
```

API Documentation:
http://localhost:5000/api/docs
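Because authentication uses HTTP-only cookies, API clients must persist the session cookie between calls. A stdlib-only sketch (endpoint paths are from the list above; the login payload fields are assumptions):

```python
import json
import urllib.request
from http.cookiejar import CookieJar

BASE = "http://localhost:5000"

def make_session() -> urllib.request.OpenerDirector:
    """Opener that stores the HTTP-only auth cookie between requests."""
    return urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))

def post_json(opener, path: str, payload: dict) -> dict:
    req = urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})
    with opener.open(req) as resp:
        return json.load(resp)

# opener = make_session()
# post_json(opener, "/api/auth/login", {"email": "...", "password": "..."})
# scan = post_json(opener, "/api/scan/subfinder", {"domain": "example.com"})
```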
The AI-driven investigation loop leverages Cohere's LLM to make reconnaissance intelligent and adaptive.
```mermaid
graph TB
    A[Initial Reconnaissance] --> B[Data Collection]
    B --> C[Cohere AI Analysis]
    C --> D[Generate Insights]
    D --> E[Suggest Next Steps]
    E --> F[Execute Safe Commands]
    F --> G[Collect Results]
    G --> C
    G --> H[Generate Report]
```
1. **Intelligent CVE Lookup**
   - Automatic vulnerability database queries
   - Natural language explanations
   - Severity assessment
   - Remediation recommendations
2. **Context-Aware Analysis**
   - Analyzes the technology stack
   - Identifies potential attack vectors
   - Suggests targeted scans
   - Prioritizes findings
3. **Adaptive Scanning**
   - Learns from previous results
   - Adjusts scan intensity
   - Focuses on high-value targets
   - Reduces false positives
4. **Safety Controls**
   - Rate limiting (10 requests/minute by default)
   - Scope validation (wildcards supported)
   - Command whitelist
   - Human approval gates
   - Audit logging
```python
# Backend/scanner/config.py
AI_RECON_CONFIG = {
    "enabled": True,
    "provider": "cohere",
    "model": "command",
    "max_iterations": 5,
    "rate_limit": "10/minute",
    "scope": ["*.target.com"],
    "require_approval": False,
    "sandbox": True
}
```

- Network isolation (Docker containers)
- Strict scope definitions
- Rate limiting enabled
- Command whitelisting
- Comprehensive logging
- Kill-switch mechanism
- Human oversight for production
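Wildcard scope patterns like `*.target.com` can be validated with the stdlib `fnmatch` module; a minimal sketch (not the project's actual validator):

```python
from fnmatch import fnmatch

def in_scope(host: str, scope: list[str]) -> bool:
    """Return True if host matches any allowed pattern (e.g. '*.target.com')."""
    return any(fnmatch(host.lower(), pattern.lower()) for pattern in scope)
```

Note one design wrinkle: `*.target.com` matches subdomains only, not the apex `target.com` itself, so a real scope list usually includes both patterns explicitly.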
Problem: When saving assets at /assets, page showed black screen.
Root Cause:
- Incorrect toast notification syntax (`toast({type, message})` instead of `toast.success()`)
- Authentication not using cookies
- Missing null/undefined checks for legacy data

Solution Applied:
- Fixed toast syntax to use `toast.success()` and `toast.error()`
- Added `credentials: 'include'` to all fetch calls
- Implemented data normalization: `companyName || company_name || 'Unknown Company'`
- Added array safety checks: `domains || []`, `ipAddresses || ip_addresses || []`

Files Modified:
- `Frontend/src/components/asset/AssetForm.tsx` (627 lines)
Problem: App was searching for tokens in localStorage that don't exist.
Root Cause: Backend uses HTTP-only cookies, but frontend was checking localStorage.
Solution Applied:
- Removed all localStorage token logic
- Added `credentials: 'include'` to all API calls
- Browser now automatically handles cookie authentication
Benefits:
- More secure (XSS-resistant)
- No manual token management
- Automatic cookie refresh
Problem: Multiple tools showing "Tool not found" or "Unknown tool" errors.
Root Cause: Tools not installed in Docker container.
Solution Applied:
Dockerfile.backend additions:
```dockerfile
# Install from GitHub
RUN git clone --depth 1 https://github.com/sullo/nikto /opt/nikto && \
    ln -s /opt/nikto/program/nikto.pl /usr/local/bin/nikto

RUN git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git /opt/sqlmap && \
    ln -s /opt/sqlmap/sqlmap.py /usr/local/bin/sqlmap

RUN git clone --depth 1 https://github.com/drwetter/testssl.sh.git /opt/testssl.sh && \
    ln -s /opt/testssl.sh/testssl.sh /usr/local/bin/testssl.sh && \
    ln -s /opt/testssl.sh/testssl.sh /usr/local/bin/testssl

# Install FFUF binary
RUN curl -L https://github.com/ffuf/ffuf/releases/download/v2.1.0/ffuf_2.1.0_linux_amd64.tar.gz | \
    tar -xzf - -C /usr/local/bin
```

New Files Created:
- `Backend/utils/automated_tools/sqlmap.py` (69 lines)
Verification:
```bash
docker exec flask-backend nikto -Version
docker exec flask-backend sqlmap --version
docker exec flask-backend ffuf -V
docker exec flask-backend testssl.sh --version
```

Problem: Nmap was trying to execute WSL commands inside the Docker container.
Root Cause: Hardcoded command execution without environment detection.
Solution Applied:
Created environment detection in `Backend/nmap.py`:

```python
import os

def is_docker():
    """Detect if running in a Docker container"""
    if os.path.exists('/.dockerenv'):
        return True
    try:
        with open('/proc/1/cgroup', 'r') as f:
            return 'docker' in f.read()
    except OSError:
        return False

# Conditional command building
if is_docker():
    command = ["nmap", "-Pn", target]  # Direct execution
elif os.name == 'nt':
    command = ["wsl", "-d", "Ubuntu-22.04", "nmap", "-Pn", target]  # Windows
else:
    command = ["nmap", "-Pn", target]  # Linux
```

Files Modified:
- `Backend/nmap.py`
- `Backend/ffuf.py`
- `Backend/utils/automated_tools/wpscan.py`
- `Backend/utils/automated_tools/env_helper.py` (NEW - 45 lines)
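The same detection pattern generalizes to any tool wrapper. A hedged sketch of what a shared helper like `env_helper.py` might contain (the file's real contents are not shown in this README):

```python
import os

def is_docker() -> bool:
    """Best-effort container detection (same heuristics as Backend/nmap.py)."""
    if os.path.exists("/.dockerenv"):
        return True
    try:
        with open("/proc/1/cgroup") as f:
            return "docker" in f.read()
    except OSError:
        return False

def build_command(tool: str, *args: str) -> list[str]:
    """Prefix with WSL only when running natively on Windows."""
    if os.name == "nt" and not is_docker():
        return ["wsl", "-d", "Ubuntu-22.04", tool, *args]
    return [tool, *args]
```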
Problem: Nmap scans timing out for large networks.
Solution Applied:
- Increased timeout from 60s to 300s (5 minutes)
- Added debug logging for timeout tracking
```python
# In Backend/nmap.py
import subprocess

result = subprocess.run(
    command,
    capture_output=True,
    text=True,
    timeout=300  # Changed from 60
)
```

Problem: Had to run `docker-compose restart` after every code change.
Solution Applied:
`Dockerfile.backend`:

```dockerfile
ENV FLASK_ENV=development
ENV FLASK_DEBUG=1
```

`docker-compose.yml`:

```yaml
services:
  flask-backend:
    volumes:
      - ./Backend:/app  # Hot reload enabled
```

`Backend/main.py`:

```python
app.run(host="0.0.0.0", port=5000, debug=True)
```

Result: Code changes now reload automatically without a container restart.
Error: MongoConnectionError: Cannot connect to MongoDB
Solution:
```bash
# Check MongoDB is running in Docker
docker-compose ps

# Check MongoDB logs
docker-compose logs mongo-db

# Restart services
docker-compose down
docker-compose up -d
```

Error: `ImportError: No module named 'flask'`
Solution:
```bash
# Rebuild Docker images
docker-compose down
docker-compose build --no-cache flask-backend
docker-compose up -d
```

Error: Celery worker not processing tasks
Solution:
```bash
# Check Redis is running
docker-compose exec flask-backend redis-cli ping
# Should return "PONG"

# Check Celery worker logs
docker-compose logs -f flask-backend

# Restart Celery (included in the flask-backend container)
docker-compose restart flask-backend
```

Error: `npm ERR! Cannot find module`
Solution:
```bash
cd Frontend
rm -rf node_modules package-lock.json
npm install
npm run dev
```

Error: CORS policy blocked
Solution:
Verify backend CORS in `Backend/main.py`:

```python
CORS(app, origins=["http://localhost:3000"], supports_credentials=True)
```

Error: FFUF results showing "N/A"
Potential Cause: Missing wordlist file.
Solution:
```bash
# Check if wordlist exists
docker exec flask-backend ls -la /app/tools/wordlists/

# Download common wordlist
docker exec flask-backend wget \
  https://raw.githubusercontent.com/danielmiessler/SecLists/master/Discovery/Web-Content/common.txt \
  -O /app/tools/wordlists/common.txt

# Or mount your wordlists
# In docker-compose.yml:
# volumes:
#   - ./Backend:/app
#   - ./wordlists:/app/tools/wordlists
```

Error: "Tool not found" for custom tools
Solution:
```bash
# Verify tool is installed
docker exec flask-backend which <tool-name>

# Check environment detection
docker exec flask-backend python3 -c "from Backend.utils.automated_tools.env_helper import is_docker; print(f'Docker: {is_docker()}')"

# Manually install a missing tool
docker exec -it flask-backend bash
apt update && apt install <tool-name>
```

Error: `SMTPAuthenticationError`
Solution:
- Verify you are using a Gmail App Password (not your regular password)
- Enable 2FA and generate an App Password: https://myaccount.google.com/apppasswords
- Check SMTP settings in `Backend/.env`: `MAIL_USERNAME=your-email@gmail.com`, `MAIL_PASSWORD=your-app-password`
- Verify the firewall allows port 587
Enable Debug Logging:
Backend:
```python
# Backend/main.py
app.config['DEBUG'] = True
```

Celery:

```bash
celery -A celery_worker.celery_app worker --loglevel=debug
```

Frontend:

```bash
npm run dev -- --debug
```

Check Logs:
```bash
# Docker logs
docker-compose logs -f flask-backend
docker-compose logs -f frontend

# System logs
tail -f Backend/logs/app.log
tail -f /var/log/mongodb/mongod.log
```

Database Inspection:
```bash
# Connect to MongoDB
mongosh
use subdomain_scanner
db.scans.find().pretty()
db.users.find().pretty()
```

❌ NEVER:
- Scan targets without written permission
- Test production systems without approval
- Share vulnerability findings publicly before disclosure
- Use the tool for illegal purposes
✅ ALWAYS:
- Obtain explicit authorization
- Document permission (contracts, emails)
- Follow responsible disclosure practices
- Respect bug bounty program rules
- Comply with local laws
```bash
# Rotate keys periodically
# Every 90 days recommended

# Use environment-specific keys
# prod.env, staging.env, dev.env

# Never commit secrets
git add .env  # ❌ NEVER!

# Use secret managers for production
# AWS Secrets Manager, HashiCorp Vault, etc.
```

```python
# Implement aggressive rate limiting
RATE_LIMITS = {
    "subdomain_enum": "100/hour",
    "port_scan": "50/hour",
    "vuln_scan": "10/hour",
    "api_requests": "1000/day"
}
```

```bash
# Use a VPN for scanning
# Prevents IP exposure

# Implement egress filtering
# Block outbound traffic to unintended targets

# Use proxy chains
# Add an anonymity layer
```

```python
# Encrypt sensitive data at rest
from cryptography.fernet import Fernet

# Encrypt API keys in the database
# Hash passwords (never store plaintext)
# Sanitize user inputs
# Implement access controls
```

```python
# Log all critical operations
import logging

logging.info(f"User {user_id} started scan on {target}")
logging.warning(f"Critical vulnerability found: {vuln}")
logging.error(f"Scan failed: {error}")
```

Have a plan for:
- Accidental unauthorized scans
- API key compromise
- Data breaches
- Service abuse
- Legal inquiries
```bash
# Use HTTPS only
# Implement a WAF
# Apply regular security updates
# Back up data regularly
# Monitor for anomalies
# Implement intrusion detection
```

We welcome contributions from the community!
1. **Fork the Repository**
   ```bash
   # Click "Fork" on GitHub, then clone your fork
   git clone https://github.com/YOUR_USERNAME/Vulnerability_Scanner.git
   ```
2. **Create a Feature Branch**
   ```bash
   git checkout -b feature/amazing-feature
   ```
3. **Make Changes**
   - Write clean, documented code
   - Follow the existing code style
   - Add tests for new features
   - Update documentation
4. **Test Thoroughly**
   ```bash
   # Run tests
   python -m pytest
   npm test
   ```
5. **Commit Changes**
   ```bash
   git add .
   git commit -m "feat: add amazing feature"
   ```
6. **Push to Your Fork**
   ```bash
   git push origin feature/amazing-feature
   ```
7. **Create a Pull Request**
   - Go to the original repository
   - Click "New Pull Request"
   - Describe your changes
   - Link related issues
Code Style:
- Python: Follow PEP 8
- TypeScript/React: Follow Airbnb style guide
- Use meaningful variable names
- Comment complex logic
- Write self-documenting code
Commit Messages:
```
feat: add new feature
fix: fix bug
docs: update documentation
style: format code
refactor: refactor code
test: add tests
chore: update dependencies
```

Pull Request Template:
```markdown
## Description
Brief description of changes

## Type of Change
- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation update

## Testing
Describe testing performed

## Screenshots
If applicable, add screenshots
```

- 🐛 Bug fixes
- ✨ New features
- 📝 Documentation improvements
- 🧪 Additional tests
- 🎨 UI/UX enhancements
- 🔧 Performance optimizations
- 🌍 Internationalization
- 🔌 New tool integrations
- Be respectful and inclusive
- Provide constructive feedback
- Focus on the code, not the person
- Help newcomers learn
- Follow ethical security practices
This project is licensed under the MIT License - see the LICENSE file for details.
MIT License
Copyright (c) 2026 Vulnerability Scanner Project
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
- 📖 Documentation: Read this README thoroughly
- 🐛 Bug Reports: Create an issue
- 💡 Feature Requests: Create an issue
- 💬 Discussions: GitHub Discussions
- ⭐ Star this repository
- 🍴 Fork and contribute
- 📢 Share with security community
- 🐦 Follow for updates
This project integrates and builds upon amazing open-source tools:
- OWASP ZAP - Web application security scanner
- Nmap - Network exploration and security auditing
- Subfinder - Subdomain discovery by ProjectDiscovery
- HTTPx - HTTP toolkit by ProjectDiscovery
- DNSx - DNS toolkit by ProjectDiscovery
- Knockpy - DNS reconnaissance
- WhatWeb - Web fingerprinting
- Nikto - Web vulnerability scanner
- FFuF - Web fuzzer
- testssl.sh - TLS/SSL testing
- WPScan - WordPress security scanner
- Cohere AI - Language model for intelligent analysis
- Flask - Python web framework
- React - Frontend library
- MongoDB - Database
- Redis - Message broker
- Celery - Distributed task queue
Special thanks to the entire security research community!
This tool is provided for educational and authorized security testing purposes only.
Important:
- 🚫 Unauthorized scanning is illegal
- ⚠️ Always obtain written permission
- 📋 Follow responsible disclosure
- 🔒 Respect privacy and data protection laws
- ⚖️ Comply with local regulations
- 🎯 Use ethically and responsibly
The authors and contributors are not responsible for misuse of this tool. Users are solely responsible for ensuring legal compliance.
Made with ❤️ for the security community
⭐ Star this repository if you find it helpful!
Last Updated: January 12, 2026