AI-slop single-script Git LFS alternative in Python. Public files via HTTP, private files via Google Drive.
Vlfs is vibe-coded by AI with only minor review. It is intended to be a throwaway script. Do not use.
Content-addressable storage: SHA-256 hashing, 2-level directory sharding (`ab/cd/hash`), zstd compression.
```
.vlfs/
  config.toml          # Repo config (public_base_url, compression)
  index.json           # File manifest (committed)
~/.config/vlfs/
  config.toml          # User secrets (Drive OAuth)
  rclone.conf          # Generated rclone config
  gdrive-token.json    # OAuth token
.vlfs-cache/
  objects/             # Local cache
```
Commit .vlfs/ to track your large files. Add .vlfs-cache/ to your .gitignore to keep the data blobs out of Git.
```shell
# Pull (no auth needed for public files)
python vlfs.py pull

# Push to R2 (requires credentials)
python vlfs.py push tools/clang.exe
python vlfs.py push tools/
python vlfs.py push --glob "**/*.dll"
python vlfs.py push --all

# Push to Drive (private)
python vlfs.py push --private assets/art.psd

# Status
python vlfs.py status
python vlfs.py verify
python vlfs.py clean
```

Most commands keep default output intentionally terse: bracketed progress and a final summary line. Use `-v` when you want step-by-step detail and transfer chatter.
`.vlfs/config.toml` (committed):

```toml
[remotes.r2]
public_base_url = "https://pub-abc123.r2.dev/vlfs"
bucket = "my-project-assets"       # Optional (default: "vlfs")

[remotes.gdrive]
bucket = "my-project-drive-root"   # Optional (default: "vlfs")

[defaults]
compression_level = 3
```

```shell
python vlfs.py auth gdrive
```

This opens a browser for you to authorise access. Done.
The CLI uses environment variables for authentication. Set these in your shell (or .env file):
```shell
export RCLONE_CONFIG_R2_ACCESS_KEY_ID="your-access-key"
export RCLONE_CONFIG_R2_SECRET_ACCESS_KEY="your-secret-key"
export RCLONE_CONFIG_R2_ENDPOINT="https://<accountid>.r2.cloudflarestorage.com"
```

If you prefer not to use environment variables, you can create a persistent config file at `~/.config/vlfs/rclone.conf`:
```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = your-access-key
secret_access_key = your-secret-key
endpoint = https://<accountid>.r2.cloudflarestorage.com
```

CMake integration:

```cmake
include(VLFSSync.cmake)
set(VLFSSYNC_AUTO ON)  # Auto-pull on configure
```

| Operation | Auth | Method |
|---|---|---|
| pull (R2) | None | HTTP GET |
| pull (Drive) | Token | rclone |
| push (R2) | Env vars | rclone |
| push --private | Token | rclone |
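The "None | HTTP GET" row is the whole trick for public pulls: an object is fetched by plain HTTP from `public_base_url` plus its sharded path, with no credentials involved. A hedged sketch (function names are illustrative, not vlfs's internals):

```python
from urllib.request import urlopen

def object_url(base_url: str, digest: str) -> str:
    """2-level sharded URL: <base>/<ab>/<cd>/<full hex digest>."""
    return f"{base_url}/{digest[:2]}/{digest[2:4]}/{digest}"

def pull_object(base_url: str, digest: str) -> bytes:
    """Download one object; no auth headers needed for public buckets."""
    with urlopen(object_url(base_url, digest)) as resp:
        return resp.read()
```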
**How does sharding work?**
It uses directory sharding to keep folders clean (`vlfs/ab/cd/<hash>`), not file chunking. One local file equals one cloud object.
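In code form, the sharded layout amounts to hashing the content and splitting the first two byte-pairs of the hex digest into directories — a sketch under the assumption that the local cache mirrors the cloud layout (`object_path` is an illustrative name):

```python
import hashlib
from pathlib import Path

def object_path(cache_root: str, data: bytes) -> Path:
    """Content-addressed cache path: objects/<ab>/<cd>/<full digest>."""
    digest = hashlib.sha256(data).hexdigest()
    # e.g. .vlfs-cache/objects/ab/cd/abcd... (one local file = one object)
    return Path(cache_root, "objects", digest[:2], digest[2:4], digest)

p = object_path(".vlfs-cache", b"hello vlfs")
```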
**What compression is used?**
Zstandard (zstd) at level 3. It's extremely fast for real-time compression and makes decompression feel instant.
**Is it multithreaded?**
Mostly. Hashing and downloading (HTTP/rclone) run in parallel. Uploading is currently sequential, and Google Drive transfers are single-threaded to respect API rate limits.
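Parallel hashing of the kind described above can be sketched with a stdlib thread pool (worker count and names are illustrative; the real script hashes files rather than in-memory payloads):

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def hash_blobs(blobs: list[bytes], workers: int = 8) -> list[str]:
    """Hash many payloads concurrently; results keep input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda b: hashlib.sha256(b).hexdigest(), blobs))

digests = hash_blobs([b"a", b"b", b"a"])
```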
- Python 3.10+
- zstandard
- rclone (push only)
> **Important:** on Windows, winget often fails to create the required symlinks when run from PowerShell. To ensure rclone is correctly added to your PATH, run the installation from an Administrator Command Prompt (cmd.exe).
```shell
# Windows (Run as Administrator in cmd.exe)
winget install Rclone.Rclone

# macOS
brew install rclone

# Linux
curl https://rclone.org/install.sh | sudo bash
```

Development:

```shell
pip install -e ".[dev]"
pytest
```