Clone any website to your local machine. Fast and simple.
Quick Start · Features · Installation
```shell
# Install
go install github.com/eyverg8/cpweb/cmd/cpweb@latest

# Clone a website
cpweb https://example.com
```

## Features

- Fast crawling - Download entire websites in seconds
- Complete assets - Get HTML, CSS, JS, images, and other files
- Link preservation - Maintain site structure for offline browsing
- Parallel cloning - Clone multiple sites simultaneously with worker pool
- URL file input - Clone multiple sites from a text file (one URL per line)
- Duplicate detection - Automatically skips duplicate URLs
- Claude Code Skill - Use `/cpweb` directly in Claude Code
- Cross-platform - Works on Windows, macOS, and Linux
- Offline browsing - Browse cloned sites without internet
## Installation

Windows (automatic installer, recommended):

```shell
# Download the repo and run as Administrator:
cd cpweb
install-go.bat
```

Windows (manual, if the automatic install fails):

- Download Go from https://go.dev/dl/
- Run the `.msi` installer; Go will be installed to `C:\Program Files\Go\`
- Add `%USERPROFILE%\go\bin` to your PATH
macOS:
```shell
# Using Homebrew
brew install go

# Or download from https://go.dev/dl/
```

Linux:

```shell
# Ubuntu/Debian
sudo apt update && sudo apt install golang-go

# Fedora
sudo dnf install golang

# Arch
sudo pacman -S go
```

Verify the installation:

```shell
go version
```

Then install cpweb:

```shell
go install github.com/eyverg8/cpweb/cmd/cpweb@latest
```

Add to PATH:
```shell
# Windows (PowerShell)
[Environment]::SetEnvironmentVariable("Path", $env:Path + ";" + (go env GOPATH) + "\bin", "User")

# macOS/Linux (add to ~/.bashrc or ~/.zshrc)
export PATH="$PATH:$(go env GOPATH)/bin"
```

Build from source:

```shell
# Clone the repository
git clone https://github.com/eyverg8/cpweb.git
cd cpweb

# Build and run
go build -o cpweb cmd/cpweb/main.go

# Move to PATH (optional)
mv cpweb /usr/local/bin/
```

## Usage

```shell
# Basic clone
cpweb https://example.com

# Clone to a specific directory
cpweb https://example.com -o C:\Users\eyver\Documents\

# Clone multiple sites in parallel
cpweb https://example.com https://another.com https://third.com -o ./downloads

# Clone from a file (one URL per line)
cpweb -f urls.txt -o ./downloads

# Clone and serve locally
cpweb -s https://example.com

# Clone and open in browser (uppercase -O)
cpweb -O https://example.com

# Custom user agent
cpweb -u "Mozilla/5.0" https://example.com

# Use a proxy
cpweb -p "http://proxy.example.com:8080" https://example.com

# Set cookies
cpweb -C "session=abc123" https://example.com

# Control parallel workers (default: 3)
cpweb -f urls.txt -w 5 -o ./downloads

# Serve on a custom port
cpweb -s -P 8080 https://example.com
```

Create a text file with one URL per line:
```text
# urls.txt
https://example.com
https://another-site.com
# Comments are supported
https://third-site.org
```
Flags:

```text
  -C, --cookie strings       Pre-set these cookies
  -f, --file string          File containing URLs to clone (one per line)
  -h, --help                 Help for cpweb
  -O, --open                 Automatically open project in default browser (uppercase O)
  -o, --output string        Output directory for cloned sites (default: current directory)
  -p, --proxy_string string  Proxy connection string (http or socks5)
  -s, --serve                Serve the generated files using Echo
  -P, --servePort int        Serve port number (default 5000)
  -u, --user_agent string    Custom User Agent
  -w, --workers int          Number of parallel workers (default 3)
```
## Claude Code Skill

Use the `/cpweb` command directly in Claude Code with short syntax:

```shell
# Clone to current directory
/cpweb https://example.com

# Clone to a specific directory (Windows)
/cpweb https://example.com -o C:\Users\eyver\Documents\

# Clone to a specific directory (Mac/Linux)
/cpweb https://example.com -o ~/Documents/

# Clone a local network site
/cpweb 192.168.1.1 -o C:\Users\eyver\Documents\

# Clone multiple sites to a specific folder
/cpweb https://example.com https://site2.com -o ./downloads

# Clone from a file
/cpweb -f urls.txt -o ./output

# Serve locally after cloning
/cpweb -s https://example.com

# Clone with custom output and serve
/cpweb -s -o ./my-site https://example.com
```
Note: In Claude Code, the command is automatically converted to run the cpweb CLI tool.
How it works:

```mermaid
graph TD
    A[URL Input] --> B[URL Parser]
    B --> C[Web Crawler]
    C --> D[Asset Extractor]
    D --> E[File Writer]
    E --> F[Link Restructure]
    F --> G[Local Website]
```
Project structure:

```text
cpweb/
  cmd/cpweb/    # Entry point
  pkg/
    crawler/    # Web crawling logic
    file/       # File operations
    html/       # HTML processing
    parser/     # URL parsing
    server/     # Local server
  docs/media/   # Documentation assets
```
MIT License - Copyright (c) 2025 eyverg8