A Cloudflare Workers-compatible runtime based on Node.js.
This project lets you self-host your Workers locally or on any Node.js-compatible infrastructure, free from Cloudflare Workers vendor lock-in.
Global install:

```sh
npm install -g node-cf-worker
```
Assume your worker service has the following structure:

- src/
- wrangler.toml

You can start the service by running `node-cf-worker ./wrangler.toml`.

Options:

- `--port`: the port to listen on (default: 8787)
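For reference, a minimal worker module served by this runtime might look like the sketch below. The entry-point path `src/index.ts` and the handler body are illustrative assumptions; the `export default { fetch }` shape is the standard Workers module format.

```ts
// src/index.ts (assumed entry point): a minimal Workers-style module.
// The runtime invokes the default export's fetch() for each incoming request.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/health") {
      return new Response("ok");
    }
    return new Response(`Hello from ${url.pathname}`);
  },
};
```

With this in place, `node-cf-worker ./wrangler.toml` serves the worker on port 8787 by default.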
Supported bindings:

- R2
- D1
- KV
- AI (via an OpenAI-compatible service)
D1: simulated via a local sqlite3 database.
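Worker code reaches the simulated database through the usual D1 query interface. A sketch, assuming a D1 binding named `DB` in your `wrangler.toml` (the binding name and table are illustrative; how much of the D1 surface the simulation covers is not specified here):

```ts
// Hypothetical D1 usage; "DB" is an assumed binding name.
// D1Database comes from @cloudflare/workers-types.
interface Env {
  DB: D1Database;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // prepare/bind/all is the standard D1 query pattern.
    const { results } = await env.DB
      .prepare("SELECT id, title FROM posts WHERE author = ?")
      .bind("alice")
      .all();
    return Response.json(results);
  },
};
```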
KV: just a sqlite3 table named `kv`.
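Since KV is backed by a single table, the standard get/put calls map onto simple reads and writes. A sketch, assuming a KV binding named `MY_KV` (an illustrative name):

```ts
// Hypothetical KV usage; "MY_KV" is an assumed binding name.
// KVNamespace comes from @cloudflare/workers-types.
interface Env {
  MY_KV: KVNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    await env.MY_KV.put("greeting", "hello");
    const value = await env.MY_KV.get("greeting");
    return new Response(value ?? "miss");
  },
};
```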
AI: simulated via an OpenAI-compatible API. Configure it in `wrangler.toml`:
```toml
[ai]
binding = "AI"
base_url = "http://localhost:11434" # Optional, defaults to OPENAI_BASE_URL env var or https://api.openai.com/v1
api_key = "sk-xxx" # Optional, defaults to OPENAI_API_KEY env var

# Optional: map Cloudflare model IDs to your provider's model names
[ai.modelMapping]
"@cf/meta/llama-3.1-8b-instruct" = "llama3.1:8b"
"@cf/meta/llama-3.3-70b-instruct" = "llama3.3:70b"
```

Priority: `api_key`/`base_url` in the toml > the `OPENAI_API_KEY`/`OPENAI_BASE_URL` env vars > the built-in defaults.
`modelMapping` translates Cloudflare model IDs to your provider's model names. Requests with unmapped IDs are forwarded as-is.
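On the worker side, the binding is called like Workers AI; with the mapping above, the Cloudflare model ID is translated before the request is forwarded to the OpenAI-compatible endpoint. A sketch (the exact response shape returned by the simulation is an assumption, so the binding is typed loosely here):

```ts
// Hypothetical AI usage against the [ai] binding configured above.
// The binding is typed loosely because the simulated response shape
// depends on the upstream OpenAI-compatible provider.
interface Env {
  AI: { run(model: string, options: unknown): Promise<unknown> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const answer = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      messages: [{ role: "user", content: "Say hello in one sentence." }],
    });
    return Response.json(answer);
  },
};
```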
R2: a sqlite3 table named `r2` combined with the local filesystem.
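Objects are read and written through the standard R2 bucket interface. A sketch, assuming an R2 binding named `BUCKET` (an illustrative name):

```ts
// Hypothetical R2 usage; "BUCKET" is an assumed binding name.
// R2Bucket comes from @cloudflare/workers-types.
interface Env {
  BUCKET: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    await env.BUCKET.put("notes/hello.txt", "some object contents");
    const object = await env.BUCKET.get("notes/hello.txt");
    if (object === null) {
      return new Response("not found", { status: 404 });
    }
    return new Response(object.body); // body is a ReadableStream of the object
  },
};
```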