Investigator-grade digital forensics platform for acquiring, preserving, and analyzing web and local evidence with an immutable chain of custody.
The Forensic Evidence Acquisition System (FEAS) is a secure, full-stack solution designed for law enforcement and digital forensic investigators. It automates the acquisition of evidence from social media URLs and local files, ensuring strict integrity through SHA-256 hashing and automated PDF reporting.
Unlike standard downloaders, FEAS maintains a legally admissible Chain of Custody log for every action taken on a piece of evidence, from the moment of acquisition to final storage.
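To make the integrity model concrete, here is a minimal sketch of the two core operations described above: streaming SHA-256 hashing of an evidence file and emitting an append-only chain-of-custody record. The function names and JSON field layout are illustrative assumptions, not the actual FEAS implementation (see `app/services/hashing.py` and `app/services/chain_of_custody.py` for the real code).

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large evidence never loads fully into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def custody_entry(event: str, evidence_path: Path, actor: str) -> str:
    """One JSON line per event (Acquisition, Hashing, Storage, Access, Verification).

    Appending lines to a log file (never rewriting them) is what makes the
    trail append-only; the hash binds each entry to the file's content.
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "file": evidence_path.name,
        "sha256": sha256_file(evidence_path),
        "actor": actor,
    }
    return json.dumps(record, sort_keys=True)
```

Verifying integrity later is the same operation in reverse: re-hash the file and compare against the digest recorded at acquisition.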
- **Secure User Authentication**
  - NEW: Automatic database initialization on first run
  - NEW: Default admin user created from environment variables
  - NEW: Bcrypt password hashing for maximum security
  - JWT token-based authentication with configurable expiration
  - User registration and login system
  - User profile management with editable information
  - Role-based access control (Admin vs Analyst roles)
  - See AUTHENTICATION_GUIDE.md for details
- **Universal Acquisition**
  - Capture videos and metadata from Twitter (X), YouTube, and direct URLs.
  - Secure Local File Upload for existing evidence.
- **Evidence Integrity**
  - Automated SHA-256 hashing upon acquisition.
  - Verify Integrity tools to detect file tampering.
- **Chain of Custody**
  - Immutable, append-only logs for every event (Acquisition, Hashing, Storage, Access, Verification).
  - Full audit trail exportable in reports.
  - Stored in both the database (PostgreSQL) and a file-based log.
- **Deep Metadata Extraction**
  - Extracts EXIF data, video codecs, bitrates, duration, resolution, and platform-specific metadata.
  - Uses `ffmpeg` for video analysis and `exifread` for image metadata.
  - MIME type detection via `python-magic`.
- **Real-time Analytics Dashboard**
  - Live statistics and metrics for all forensic operations.
  - Period-based analytics (24h, 7d, 30d, 90d).
  - Success/failure rate tracking and performance metrics.
- **Automated Reporting**
  - Generates professional PDF forensic reports containing all case details, hashes, and custody logs.
- **Real-time Monitoring**
  - Live job tracking with React Query polling.
  - Background processing with Celery Worker and Celery Beat scheduler.
  - Progress tracking through multiple stages (pending, downloading, processing, hashing, extracting metadata, generating report, completed).
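The stage progression listed above can be modeled as an ordered enum with a derived progress percentage. This is an illustrative sketch; the enum and helper names are assumptions (FEAS keeps its real enumerations in `app/models/enums.py`):

```python
from enum import Enum


class JobStage(str, Enum):
    """Ordered lifecycle stages of a forensic job (illustrative names)."""
    PENDING = "pending"
    DOWNLOADING = "downloading"
    PROCESSING = "processing"
    HASHING = "hashing"
    EXTRACTING_METADATA = "extracting_metadata"
    GENERATING_REPORT = "generating_report"
    COMPLETED = "completed"


# Enum members preserve definition order, so progress falls out of the index.
ORDERED_STAGES = list(JobStage)


def progress_percent(stage: JobStage) -> int:
    """Map a stage to a 0-100 progress value for the monitoring UI."""
    return round(ORDERED_STAGES.index(stage) * 100 / (len(ORDERED_STAGES) - 1))
```

A polling frontend only needs the stage string from the status endpoint to render a progress bar this way.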
Supported platforms:
- Twitter (X): `twitter.com`, `x.com`
- YouTube: `youtube.com`, `youtu.be`

Supported file types:
- Images: JPEG (`.jpg`, `.jpeg`), PNG (`.png`), HEIC/HEIF (`.heic`, `.heif`)
- Videos: MP4 (`.mp4`), QuickTime (`.mov`), AVI (`.avi`)
- Audio: MP3 (`.mp3`), WAV (`.wav`)
- Documents: PDF, text files, and archives (ZIP)

Maximum file size: 500 MB (configurable via `MAX_FILE_SIZE`)
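A sketch of how an upload validator might enforce these limits, using only the extension allowlist and size cap above. This is a simplified assumption of the checks in `app/services/validator.py` (the real service also inspects MIME types with `python-magic`):

```python
from pathlib import Path

MAX_FILE_SIZE = 524_288_000  # 500 MB, mirroring the MAX_FILE_SIZE setting

ALLOWED_EXTENSIONS = {
    ".jpg", ".jpeg", ".png", ".heic", ".heif",   # images
    ".mp4", ".mov", ".avi",                      # videos
    ".mp3", ".wav",                              # audio
    ".pdf", ".txt", ".zip",                      # documents & archives
}


def validate_upload(filename: str, size_bytes: int) -> tuple[bool, str]:
    """Return (accepted, reason). Reject unknown extensions and oversized files."""
    ext = Path(filename).suffix.lower()
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"unsupported file type: {ext or '(none)'}"
    if size_bytes > MAX_FILE_SIZE:
        return False, "file exceeds the 500 MB limit"
    return True, "ok"
```

Extension checks alone are easy to spoof, which is why pairing them with content-based MIME detection (as FEAS does) matters in a forensic context.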
- Python 3.11 & FastAPI 0.115 - High-performance async API framework.
- PostgreSQL 15 & SQLAlchemy 2.0 - Robust relational database with ORM.
- Celery 5.3 & Redis 7 - Distributed task queue with Celery Beat scheduler.
- Forensic Tools:
  - `yt-dlp` - Video download from social platforms
  - `ffmpeg` - Media processing and metadata extraction
  - `exifread` - EXIF data extraction
  - `python-magic` - File type detection
- ReportLab - Dynamic PDF forensic report generation.
- Playwright - Web scraping and browser automation.
- Pydantic 2.9 - Data validation and settings management.
- python-jose & passlib - JWT tokens and password hashing.
- React 18.2 - Modern UI library with hooks.
- Styled Components 6.1 - Component-based theming (Cyber/Dark/Light themes).
- React Query 3.39 & Axios 1.6 - Efficient data fetching and caching.
- React Icons 4.12 - Visual indicators for file types and status.
- Framer Motion 10.16 - Smooth animations.
- React Dropzone 14.2 - Drag-and-drop file uploads.
- React Toastify 10.0 - User notifications.
- Zustand 4.4 - Lightweight state management.
- Docker & Docker Compose - Multi-container orchestration.
- PostgreSQL, Redis, Backend, Frontend, Celery Worker, Celery Beat - 6 containerized services.
- Docker & Docker Compose (Recommended)
- OR Python 3.11+ and Node.js 18+ (for manual setup)
For detailed setup instructions, see QUICKSTART.md
1. **Clone the repository**

   ```bash
   git clone https://github.com/Dynamo2k/FEAS.git
   cd FEAS
   ```

2. **Configure Environment**

   ```bash
   cd backend
   cp .env.example .env
   # Edit .env and change default admin credentials!
   # DEFAULT_ADMIN_EMAIL=admin@feas.local
   # DEFAULT_ADMIN_PASSWORD=change-this-password
   ```

3. **Launch All Services**

   ```bash
   # From the backend directory
   docker-compose up --build -d
   ```

   This starts 6 services:
   - PostgreSQL (port 5432) - Database
   - Redis (port 6379) - Message broker
   - Backend API (port 8000) - FastAPI server
   - Frontend (port 3000) - React application
   - Celery Worker - Background task processor
   - Celery Beat - Scheduler for periodic tasks

4. **Access the Application**
   - Frontend Dashboard: http://localhost:3000
   - API Documentation: http://localhost:8000/docs
   - API Health Check: http://localhost:8000/health

5. **Login with Default Admin**
   - Email: `admin@feas.local` (or your configured email)
   - Password: `admin123` (or your configured password)

   **Warning:** Change the password immediately after first login!
cd backend
python -m venv venv
source venv/bin/activate # or venv\Scripts\activate on Windows
pip install -r requirements.txt
# Create .env file
cp .env.example .env
# Edit .env with your configuration
# For quick testing with SQLite:
export USE_SQLITE=true # or set USE_SQLITE=true in .env
# Install system dependencies (Linux/Mac)
# ffmpeg, libmagic1 are required
sudo apt-get install ffmpeg libmagic1 # Ubuntu/Debian
# brew install ffmpeg libmagic # macOS
# Start the backend (database tables created automatically on first run)
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
# In separate terminals, start Celery:
celery -A app.workers.celery_app.celery worker --loglevel=info
celery -A app.workers.celery_app.celery beat --loglevel=info

For PostgreSQL setup, see POSTGRESQL_SETUP.md
cd frontend
npm install
npm start

The React app will open at http://localhost:3000.
```
FEAS/
├── backend/                        # FastAPI Backend
│   ├── app/
│   │   ├── api/v1/endpoints/       # REST API routes
│   │   │   ├── jobs.py             # Job submission & monitoring
│   │   │   ├── auth.py             # Authentication (login/register)
│   │   │   ├── dashboard.py        # Analytics & statistics
│   │   │   ├── profile.py          # User profile management
│   │   │   ├── social.py           # Social media links
│   │   │   ├── links.py            # Link management
│   │   │   └── health.py           # Health check endpoint
│   │   ├── core/                   # Core utilities
│   │   │   ├── config.py           # Pydantic settings
│   │   │   ├── logger.py           # Forensic logging
│   │   │   └── security.py         # Authentication & authorization
│   │   ├── db/                     # Database
│   │   │   ├── base.py             # SQLAlchemy base
│   │   │   └── session.py          # DB session management
│   │   ├── models/                 # Data models
│   │   │   ├── schemas.py          # Pydantic schemas (API)
│   │   │   ├── sql_models.py       # SQLAlchemy models (DB)
│   │   │   └── enums.py            # Enumerations
│   │   ├── pipelines/              # Processing pipelines
│   │   │   ├── url_pipeline.py     # URL acquisition
│   │   │   ├── upload_pipeline.py  # File upload processing
│   │   │   └── unified_pipeline.py # Unified forensic flow
│   │   ├── services/               # Business logic
│   │   │   ├── downloader.py       # yt-dlp wrapper
│   │   │   ├── metadata.py         # EXIF/ffmpeg extraction
│   │   │   ├── hashing.py          # SHA-256 hashing
│   │   │   ├── chain_of_custody.py # Audit logging
│   │   │   ├── pdf_generator.py    # ReportLab PDF creation
│   │   │   ├── pdf_service.py      # Playwright PDF service
│   │   │   ├── validator.py        # File validation
│   │   │   └── storage.py          # Storage abstraction
│   │   ├── storage/                # Storage backends
│   │   │   ├── local_storage.py    # Local filesystem
│   │   │   └── s3_storage.py       # AWS S3 (optional)
│   │   ├── workers/                # Celery tasks
│   │   │   ├── celery_app.py       # Celery configuration
│   │   │   └── tasks.py            # Async job tasks
│   │   └── main.py                 # FastAPI application entry
│   ├── Dockerfile                  # Backend container
│   ├── docker-compose.yml          # Multi-service orchestration
│   └── requirements.txt            # Python dependencies
│
├── frontend/                       # React Frontend
│   ├── src/
│   │   ├── components/
│   │   │   ├── common/             # Reusable components
│   │   │   │   ├── LoadingSpinner.jsx
│   │   │   │   ├── ThemeSwitcher.jsx
│   │   │   │   └── ErrorBoundary.jsx
│   │   │   ├── evidence/           # Evidence display
│   │   │   │   ├── MediaPreview.jsx
│   │   │   │   ├── MetadataTable.jsx
│   │   │   │   ├── SHA256Display.jsx
│   │   │   │   └── VerifyIntegrityButton.jsx
│   │   │   ├── layout/             # Layout components
│   │   │   │   ├── Header.jsx
│   │   │   │   ├── Sidebar.jsx
│   │   │   │   ├── Footer.jsx
│   │   │   │   └── Layout.jsx
│   │   │   ├── monitoring/         # Job monitoring
│   │   │   │   └── JobMonitorTable.jsx
│   │   │   └── submission/         # Evidence submission
│   │   │       ├── URLInput.jsx
│   │   │       ├── FileUpload.jsx
│   │   │       └── SubmissionTabs.jsx
│   │   ├── pages/                  # Page components
│   │   │   ├── Dashboard.jsx
│   │   │   ├── SubmissionPage.jsx
│   │   │   ├── JobMonitorPage.jsx
│   │   │   ├── EvidenceDetailPage.jsx
│   │   │   ├── SettingsPage.jsx
│   │   │   ├── LoginPage.jsx
│   │   │   ├── RegisterPage.jsx
│   │   │   ├── ProfilePage.jsx
│   │   │   ├── AnalyticsPage.jsx
│   │   │   └── PlaceholderPage.jsx
│   │   ├── services/               # API & utilities
│   │   │   ├── api.js              # Axios instance
│   │   │   ├── validation.js       # Form validation
│   │   │   └── theme.js            # Theme helpers
│   │   ├── store/                  # State management
│   │   │   ├── jobStore.js         # Zustand job state
│   │   │   ├── themeStore.js       # Zustand theme state
│   │   │   └── authStore.js        # Zustand auth state
│   │   ├── styles/                 # Global styles
│   │   │   ├── GlobalStyles.js
│   │   │   ├── theme.js            # Theme definitions
│   │   │   └── components.css
│   │   ├── App.jsx                 # Main app component
│   │   └── index.js                # React entry point
│   ├── package.json                # Node dependencies
│   └── public/                     # Static assets
│
└── README.md                       # This file
```
```
┌──────────────┐
│    Client    │  (React Frontend)
│   Browser    │
└──────┬───────┘
       │ HTTP/REST
       ▼
┌──────────────────────────────────────────┐
│        FastAPI Backend (Port 8000)       │
│  ┌────────────────────────────────────┐  │
│  │        API Endpoints (v1)          │  │
│  │    /jobs/url, /jobs/upload         │  │
│  │    /jobs/{id}, /dashboard          │  │
│  └─────────────────┬──────────────────┘  │
│                    │                     │
│  ┌─────────────────▼──────────────────┐  │
│  │          Pipelines Layer           │  │
│  │  • URL Pipeline (yt-dlp)           │  │
│  │  • Upload Pipeline                 │  │
│  │  • Unified Forensic Pipeline       │  │
│  └─────────────────┬──────────────────┘  │
│                    │                     │
│  ┌─────────────────▼──────────────────┐  │
│  │           Services Layer           │  │
│  │  • Downloader (yt-dlp)             │  │
│  │  • Metadata (ffmpeg, exifread)     │  │
│  │  • Hashing (SHA-256)               │  │
│  │  • Chain of Custody Logger         │  │
│  │  • PDF Generator (ReportLab)       │  │
│  │  • Storage (Local/S3)              │  │
│  └─────────────────┬──────────────────┘  │
└────────────────────┼─────────────────────┘
                     │
              ┌──────┴──────┐
              ▼             ▼
        ┌──────────┐  ┌──────────────┐
        │  Redis   │  │  PostgreSQL  │
        │ (Cache)  │  │  (Database)  │
        └────┬─────┘  └──────────────┘
             │
  ┌──────────▼──────────────┐
  │     Celery Workers      │
  │  • process_url_job      │
  │  • process_upload_job   │
  └─────────────────────────┘
  ┌─────────────────────────┐
  │       Celery Beat       │
  │    (Scheduled Tasks)    │
  └─────────────────────────┘
```
- FastAPI Backend: RESTful API server handling all forensic operations
- React Frontend: Modern SPA with three theme options (Cyber/Dark/Light)
- PostgreSQL: Stores job metadata, chain of custody, and user profiles
- Redis: Message broker for Celery and caching
- Celery Worker: Processes heavy tasks asynchronously (downloads, metadata extraction, PDF generation)
- Celery Beat: Scheduler for periodic tasks (cleanup, monitoring)
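The queue-and-worker pattern behind the Celery layer can be sketched with only the standard library. This toy stand-in mirrors what `USE_CELERY=false` roughly does via FastAPI BackgroundTasks (jobs run in-process on a background thread instead of in a separate Celery container); the class and job names are illustrative, not FEAS's actual code:

```python
import queue
import threading


class InProcessWorker:
    """Toy stand-in for the Celery worker: pulls jobs off a queue and runs
    them on a daemon thread, recording (status, result) per job id."""

    def __init__(self) -> None:
        self.jobs: queue.Queue = queue.Queue()
        self.results: dict = {}
        threading.Thread(target=self._run, daemon=True).start()

    def submit(self, job_id: str, fn, *args) -> None:
        """Enqueue a job; the API endpoint returns immediately after this."""
        self.jobs.put((job_id, fn, args))

    def _run(self) -> None:
        while True:
            job_id, fn, args = self.jobs.get()
            try:
                self.results[job_id] = ("completed", fn(*args))
            except Exception as exc:  # a real worker would also log custody events
                self.results[job_id] = ("failed", str(exc))
            finally:
                self.jobs.task_done()


worker = InProcessWorker()
worker.submit("job-1", lambda: "sha256-ok")
worker.jobs.join()  # block until the queue drains (for demo purposes only)
```

Celery generalizes this same shape across processes and machines, with Redis as the queue and PostgreSQL holding the job state.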
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/jobs/url` | Submit URL for evidence acquisition (Twitter/X, YouTube) |
| POST | `/api/v1/jobs/upload` | Upload local file as evidence |
| GET | `/api/v1/jobs` | List all jobs |
| GET | `/api/v1/jobs/{job_id}/status` | Get detailed job status |
| GET | `/api/v1/jobs/{job_id}/details` | Get detailed job metadata and chain of custody |
| POST | `/api/v1/jobs/{job_id}/verify` | Verify file integrity (SHA-256) |
| GET | `/api/v1/jobs/{job_id}/report` | Generate and download PDF forensic report |
| GET | `/api/v1/analytics` | Get analytics data (total jobs, completed, failed, etc.) |
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/auth/register` | Register new user account |
| POST | `/api/v1/auth/login` | Login and receive JWT token |
| GET | `/api/v1/auth/me` | Get current authenticated user |
| POST | `/api/v1/auth/logout` | Logout (client-side token removal) |
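The login endpoint returns a JWT. FEAS uses python-jose and passlib for this, but the token mechanics can be illustrated with only the standard library. This is a hedged, self-contained HS256 sketch, not the project's actual security module (`app/core/security.py`):

```python
import base64
import hashlib
import hmac
import json
import time


def _b64url(data: bytes) -> str:
    """Base64url without padding, as used in JWTs (RFC 7515)."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def create_access_token(sub: str, secret: str, expire_minutes: int = 11_520) -> str:
    """Build a signed header.payload.signature token; 11 520 min = 8 days,
    matching the ACCESS_TOKEN_EXPIRE_MINUTES default."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(
        {"sub": sub, "exp": int(time.time()) + expire_minutes * 60}
    ).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


def verify_token(token: str, secret: str) -> dict:
    """Recompute the signature, compare in constant time, then check expiry."""
    header, payload, sig = token.split(".")
    expected = _b64url(
        hmac.new(secret.encode(), f"{header}.{payload}".encode(), hashlib.sha256).digest()
    )
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims
```

In production, prefer a vetted library (as FEAS does with python-jose) over hand-rolled token code; the sketch only shows why a leaked `SECRET_KEY` compromises every issued token.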
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/dashboard/cards` | Get dashboard statistics cards |
| GET | `/api/v1/dashboard/activity` | Get recent chain of custody events |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/profile/` | Get user profile information |
| PATCH | `/api/v1/profile/` | Update profile information |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/social` | Get social media links |
| POST | `/social` | Add new social link |
| DELETE | `/social/{id}` | Delete social link |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/health` | API health check |
| GET | `/` | API version info |
| Dashboard | Evidence Details |
|---|---|
| Real-time monitoring of all forensic jobs | Deep dive into metadata and custody logs |
Create a `.env` file in the `backend/` directory:
# Database
POSTGRES_SERVER=localhost
POSTGRES_USER=forensic
POSTGRES_PASSWORD=password
POSTGRES_DB=forensic_db
POSTGRES_PORT=5432
DATABASE_URL=postgresql://forensic:password@localhost:5432/forensic_db
# Redis
REDIS_HOST=localhost
REDIS_PORT=6379
# Celery Configuration
# Set USE_CELERY=false to use FastAPI BackgroundTasks instead of Celery
# Useful for simple development setups without Redis/Celery
USE_CELERY=true
# Storage
STORAGE_TYPE=local # or 's3'
LOCAL_STORAGE_PATH=./evidence_storage
MAX_FILE_SIZE=524288000 # 500MB in bytes
# S3 (Optional)
S3_ENDPOINT=https://s3.amazonaws.com
S3_ACCESS_KEY=your_access_key
S3_SECRET_KEY=your_secret_key
S3_BUCKET_NAME=forensic-evidence
S3_REGION=us-east-1
# Security
SECRET_KEY=your-secret-key-change-in-production
ACCESS_TOKEN_EXPIRE_MINUTES=11520 # 8 days
# Logging
LOG_LEVEL=INFO
CHAIN_OF_CUSTODY_LOG_PATH=./chain_of_custody.log
# Rate Limiting
RATE_LIMIT_PER_MINUTE=60
# Allowed Domains for URL Acquisition
ALLOWED_URL_DOMAINS=["twitter.com","x.com","youtube.com","youtu.be","facebook.com","fb.watch","fb.com","instagram.com"]

For URL acquisition setup details, see URL_SETUP.md
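A domain allowlist like `ALLOWED_URL_DOMAINS` is typically enforced by parsing the submitted URL's hostname. Here is a minimal sketch of such a check (the function name and the subdomain-matching behavior are illustrative assumptions, not FEAS's exact validation logic):

```python
from urllib.parse import urlparse

# Subset of the ALLOWED_URL_DOMAINS setting shown above
ALLOWED_URL_DOMAINS = {"twitter.com", "x.com", "youtube.com", "youtu.be"}


def is_allowed_url(url: str) -> bool:
    """Accept exact domain matches and subdomains (e.g. www.youtube.com),
    while rejecting lookalikes such as notyoutube.com."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in ALLOWED_URL_DOMAINS)
```

Matching on the parsed hostname (rather than substring-searching the raw URL) is what prevents `https://evil.com/?u=youtube.com` from slipping through.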
The frontend uses environment variables prefixed with REACT_APP_:
REACT_APP_API_URL=http://localhost:8000

Currently, the project does not include automated tests. Manual testing is performed through:
- API Testing: Use FastAPI's built-in Swagger UI at http://localhost:8000/docs
- Frontend Testing: Manual UI testing in the React app
- Integration Testing: End-to-end workflow testing with real URL submissions and file uploads
Error: docker-compose: command not found
Solution:
- Install Docker Compose: `sudo apt-get install docker-compose` (Linux)
- Or use Docker Compose V2: `docker compose up` (instead of `docker-compose up`)
Error: Bind for 0.0.0.0:8000 failed: port is already allocated
Solution:
# Find and kill process using the port
sudo lsof -i :8000
sudo kill -9 <PID>

Error: `sqlalchemy.exc.OperationalError: could not connect to server`
Solution:
- Ensure PostgreSQL is running: `docker-compose ps`
- Check database credentials in `.env`
- Wait a few seconds for PostgreSQL to fully initialize
Error: Jobs stuck in "pending" status with 0% progress
Solutions:
1. If using Celery mode (`USE_CELERY=true`):

   ```bash
   # Check Celery worker logs
   docker-compose logs celery-worker
   # Restart Celery services
   docker-compose restart celery-worker celery-beat
   # Verify Redis is running
   redis-cli ping  # Should return PONG
   ```

2. Switch to BackgroundTasks mode (simpler for development):
   - Set `USE_CELERY=false` in your `.env` file
   - Restart the backend server
   - Jobs will process in-process without requiring Redis/Celery
Error: Network Error or CORS errors in browser console
Solution:
- Verify backend is running: `curl http://localhost:8000/health`
- Check `REACT_APP_API_URL` in frontend environment
- Ensure CORS is properly configured in `backend/app/main.py`
Error: yt-dlp error or download timeout
Solution:
- Update yt-dlp: `pip install --upgrade yt-dlp`
- Check if the URL is from a supported platform (Twitter/X, YouTube)
- Verify internet connectivity from Docker container
Error: Playwright or ReportLab errors
Solution:
# Install Playwright browsers (if manual setup)
playwright install chromium
# For Docker, rebuild the container
docker-compose build backend

Error: `Permission denied: './evidence_storage'`
Solution:
# Create storage directory with proper permissions
mkdir -p backend/evidence_storage
chmod 777 backend/evidence_storage
# Or use Docker volumes (already configured in docker-compose.yml)

- WebSocket support for real-time job updates
- Instagram and Facebook evidence acquisition
- Multi-user authentication and role-based access control
- Advanced search and filtering in evidence database
- Export chain of custody as blockchain records
- Mobile app for field evidence collection
- Automated testing suite (unit, integration, E2E)
- Cloud deployment guides (AWS, Azure, GCP)
- Evidence comparison and deduplication
- Machine learning for content classification
A huge thanks to the team that made this project possible:
- Rana Uzair Ahmad - Dynamo2k1
- Muhammad Usman - Prof.Paradox
- Hoor ul Ain - hurrainjhl
- Umae Habiba - ZUNATIC
- Bilal Badar - devdas36
This project is licensed under the MIT License.
Disclaimer: This software is intended for authorized forensic investigations only. Ensure compliance with all local laws regarding data privacy and evidence handling. The developers are not responsible for any misuse of this software.