RAG Command Center
Real estate intelligence platform — Victoria, BC operations · Canada-wide public listings
If you find this useful, consider giving it a ⭐ — it helps others discover the project.
RAG Command Center is a full-stack real estate intelligence platform that combines a public consumer-facing listing site with an internal CRM, lead pipeline, and AI-powered signal engine for real estate professionals.
R = Ricki Kohli · A = Amit Khatkar · G = Gary Doman
Public Site — Canada-wide listings that anyone can browse, search, filter, and inquire on. Open access, hosted on GitHub Pages.
Command Center — Operations dashboard, CRM, pipeline, AI signals, commission tracking. Currently focused on Victoria, BC as the primary licensed market, with Vancouver as secondary. Internal access via SHA-256 auth.
The public site serves all of Canada. The command center concentrates on Victoria/Vancouver — lead routing, signal priority, auto-lead generation, and listing sort weight all focus on the licensed markets first. Licensed areas are configurable from the command center settings.
- Public Site: https://garebear99.github.io/RG-Command-Center/
- API Backend: https://rag-command-center-api.admension.workers.dev/api/health
Anyone visiting the .github.io page can:
- Browse listings across all Canadian provinces (BC, AB, ON, QC, MB, SK, NS, NB, and more)
- Search & filter by city, price, beds, baths, property type
- View deal scores — every listing is scored 0–100% on price positioning, days on market, price drops, and comparables
- Read resources — buyer guide, seller guide, mortgage calculator (CMHC-aware), blog, and team profiles
- Submit inquiries — contact forms post submissions to the Cloudflare Worker backend for capture
- Subscribe to listing alerts — enter criteria and get matched when new listings land
- Share listings — one-click social share to Facebook, Twitter, LinkedIn
- `index.html` — Homepage with featured listings, hero CTA, inline lead capture
- `deals.html` — Top deals ranked by composite deal score
- `directory.html` — Province-aware listing browser with filters
- `listing-detail.html` — Full detail view with GPS map, price history, score breakdown
- `team.html` — Ricki, Amit, and Gary profiles and contact info
- `mortgage.html` — Canadian mortgage calculator with CMHC insurance
- `buyer-resources.html` — BC buyer's guide
- `seller-resources.html` — BC seller's guide
- `blog.html` / `blog-post.html` — JSON-driven blog with posts by team members
The internal dashboard is where the team runs daily operations. It's currently configured for Victoria, BC as the primary licensed market, with Vancouver as secondary. All intelligence, lead routing, and priority scoring key off the licensed area setting.
- `command-center.html` — Main dashboard: stats (hot / warm / cold / stale leads), call queue, top deals, signal feed, pipeline health
- `contacts.html` — CRM contact management with buyer/seller profiles, interaction history, tags
- `pipeline.html` — Kanban-style deal pipeline: new lead → qualified → showing → offer → closed
- `signals.html` — AI Signal Paste: paste a Facebook group post and the AI scores intent and extracts lead data; also auto-compiles signals from Reddit + Victoria Open Data
- `commission.html` — Commission tracker tied to partnership agreement tiered splits
- `analytics.html` — Market analytics: price trends, DOM averages, listing volume
- `email-templates.html` — Email template builder with personalization tokens
- `settings.html` — Licensed area config, data pipeline controls, SMTP settings
- `leads.html` — Lead operations view with hot/warm/cold/stale tabs
- `listings.html` — Internal listing management with source conflict resolution
- `add.html` — Manual listing and lead entry with auto-scoring
The command center lets you set your licensed province and cities. This affects:
- Lead routing — leads in licensed areas go to priority queues and assigned agents
- Listing sort priority — Victoria/Vancouver listings surface first in internal views
- Auto-lead generation — the engine only generates actionable leads (motivated seller, below market, investor signal, new listing, price drop) for listings in licensed cities
- Signal compilation — Reddit and open data scraping targets Victoria-area sources
Licensed areas are stored in `localStorage` and can be changed at any time from the dashboard or settings page.
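The localStorage-backed licensed-area setting can be sketched roughly as follows. The storage key (`rag_licensed_areas`) and the object shape are assumptions for illustration; the real names live in `settings.js`. The `storage` parameter accepts any localStorage-like object so the sketch runs outside a browser too.

```javascript
// Minimal sketch of licensed-area storage and the check that drives
// lead routing, listing sort priority, and auto-lead generation.
// Key name and object shape are assumptions, not the real settings.js code.
const DEFAULT_AREAS = { province: 'BC', cities: ['Victoria', 'Vancouver'] };

function loadLicensedAreas(storage) {
  // storage: any localStorage-like object exposing getItem/setItem
  const raw = storage.getItem('rag_licensed_areas');
  return raw ? JSON.parse(raw) : DEFAULT_AREAS;
}

function saveLicensedAreas(storage, areas) {
  storage.setItem('rag_licensed_areas', JSON.stringify(areas));
}

function isLicensed(listing, areas) {
  // Case-insensitive city match against the configured licensed cities
  return listing.province === areas.province &&
    areas.cities.some(c => c.toLowerCase() === (listing.city || '').toLowerCase());
}
```
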
The Cloudflare Worker scrapes public sources on a cron schedule (hourly) and on-demand:
- Reddit — `r/VictoriaBC` and `r/canadahousing` for real estate intent posts
- Victoria Open Data — building permit activity (investor/flipper signals)
Each post is scored for buyer/seller/investor intent, neighbourhood mentions, budget signals, and credibility (0–100). Only posts containing real estate keywords qualify. Signals are stored in Workers KV with a 30-day TTL.
- URL-based — the same Reddit permalink or source URL is never stored twice
- Fuzzy text — posts with similar content but different URLs are flagged as possible duplicates and linked to the original signal via the `duplicate_of` field
- Frontend dedup — the signals page deduplicates locally before inserting into history
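The two backend dedup layers can be sketched like this. The token-set Jaccard similarity and the 0.8 threshold are illustrative assumptions; the worker's actual fuzzy-matching logic may differ.

```javascript
// Sketch of URL-based and fuzzy-text dedup. Similarity metric and
// threshold are assumptions, not the real worker.js implementation.
const seenUrls = new Set();

// Layer 1: exact URL dedup — the same permalink is never stored twice
function isNewUrl(url) {
  if (seenUrls.has(url)) return false;
  seenUrls.add(url);
  return true;
}

function tokens(text) {
  return new Set(text.toLowerCase().split(/\W+/).filter(Boolean));
}

// Jaccard similarity of word sets: |A ∩ B| / |A ∪ B|
function similarity(a, b) {
  const ta = tokens(a), tb = tokens(b);
  let inter = 0;
  for (const t of ta) if (tb.has(t)) inter++;
  const union = ta.size + tb.size - inter;
  return union === 0 ? 0 : inter / union;
}

// Layer 2: flag a near-duplicate and return the original's id (duplicate_of)
function findDuplicateOf(post, existing, threshold = 0.8) {
  const match = existing.find(e => similarity(e.text, post.text) >= threshold);
  return match ? match.id : null;
}
```
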
Leads and signals don't stay "hot" forever:
- Leads — decay to `stale` after 7 days without action
- Signals — tagged `stale` after 14 days
- Categories: Hot · Warm · Cold · Stale — each with distinct visual treatment across the dashboard, call queue, and leads view
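The decay rules above reduce to a simple age check. The cutoffs (7 and 14 days) come from the rules; the field names (`lastActionAt`, `createdAt`, `temperature`) are assumptions for illustration.

```javascript
// Sketch of the 7-day lead / 14-day signal decay rule.
// Field names are assumptions; cutoffs come from the rules above.
const DAY_MS = 24 * 60 * 60 * 1000;

function leadFreshness(lead, now = Date.now()) {
  const idleDays = (now - lead.lastActionAt) / DAY_MS;
  if (idleDays > 7) return 'stale';
  return lead.temperature; // 'hot' | 'warm' | 'cold', set by scoring
}

function signalFreshness(signal, now = Date.now()) {
  return (now - signal.createdAt) / DAY_MS > 14 ? 'stale' : 'fresh';
}
```
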
`autoleads.js` generates actionable leads from listing intelligence with zero external APIs:
- Motivated sellers (45+ DOM with price drop)
- Below-market deals (high deal score in licensed area)
- Investor signals (fixer + below market)
- New listing alerts (fresh listings scoring 40+)
- Price drop alerts (recent reductions)
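The five rules above can be sketched as predicate checks over a listing. The thresholds stated in the text (45+ DOM, score 40+) are kept; the remaining cutoffs ("high" deal score as 70+, "fresh" as 3 days, "recent" drop as 7 days) are assumptions, not the real `autoleads.js` values.

```javascript
// Sketch of the five auto-lead rules. Only the 45-DOM and 40-score
// thresholds come from the text; the others are illustrative assumptions.
function autoLeadsFor(listing, inLicensedArea) {
  const leads = [];
  if (listing.daysOnMarket >= 45 && listing.priceDropPct > 0)
    leads.push('motivated_seller');
  if (inLicensedArea && listing.dealScore >= 70)       // "high score" assumed as 70+
    leads.push('below_market');
  if (listing.isFixer && listing.dealScore >= 70)
    leads.push('investor_signal');
  if (listing.daysOnMarket <= 3 && listing.dealScore >= 40) // "fresh" assumed as ≤3 days
    leads.push('new_listing');
  if (listing.priceDropPct > 0 && listing.daysSinceDrop <= 7) // "recent" assumed as ≤7 days
    leads.push('price_drop');
  return leads;
}
```
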
The Signals page includes a manual paste tool for Facebook group posts. Paste any post from a Victoria/Vancouver real estate group and the AI engine will:
- Detect buyer/seller/investor/renter intent
- Extract budget, bed/bath requirements, timeline
- Identify the neighbourhood from a 70+ entry neighbourhood dictionary (Victoria + Vancouver)
- Score 0–100 and classify as hot/warm/cold
- One-click create a CRM contact + pipeline deal from the scored signal
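The paste-tool steps above can be sketched as keyword intent detection, a budget regex, and score banding. The keyword lists, the `$650k`-style budget pattern, and the 70/40 hot/warm bands are all assumptions for illustration, not the engine's real rules.

```javascript
// Sketch of paste scoring: intent keywords, budget extraction, banding.
// Keyword lists, regex, and score bands are assumptions.
const INTENT_WORDS = {
  buyer:    ['looking to buy', 'pre-approved', 'first home'],
  seller:   ['selling', 'listing my', 'thinking of selling'],
  investor: ['rental property', 'cash flow', 'flip'],
  renter:   ['looking to rent', 'lease'],
};

function detectIntent(text) {
  const t = text.toLowerCase();
  for (const [intent, words] of Object.entries(INTENT_WORDS))
    if (words.some(w => t.includes(w))) return intent;
  return 'unknown';
}

// Pulls the first "$650k" / "$650,000" style figure as a budget in dollars
function extractBudget(text) {
  const m = text.match(/\$\s?(\d[\d,]*)(k)?/i);
  if (!m) return null;
  const n = Number(m[1].replace(/,/g, ''));
  return m[2] ? n * 1000 : n;
}

function classify(score) {
  if (score >= 70) return 'hot';
  if (score >= 40) return 'warm';
  return 'cold';
}
```
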
The backend runs on Cloudflare Workers (free tier) with KV storage. It handles:
- `GET /api/health` — Health check + last compile timestamp
- `GET /api/compile` — Scrape public sources, score, deduplicate, store signals
- `GET /api/signals` — List compiled signals with freshness tags
- `POST /api/inquiries` — Capture public site inquiry submissions
- `GET /api/inquiries` — List stored inquiries (internal)
- `GET|PUT /api/inquiries/:id` — Retrieve or update inquiry status
- `POST /api/events` — Analytics event collection
- `GET /api/stats` — Global platform statistics
- `POST /api/proxy` — CORS proxy for external data feeds
The worker includes a `scheduled()` handler triggered hourly (`0 * * * *` in `wrangler.toml`). New signals are compiled automatically even when nobody has the dashboard open. Dedup ensures the same posts aren't re-stored.
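The endpoint surface above can be sketched as a method-plus-pattern routing table. The handler names here are assumptions; `worker.js` defines the real ones.

```javascript
// Sketch of the API surface as a routing table. Handler names are
// placeholders, not the actual functions in worker.js.
const ROUTES = [
  ['GET',  /^\/api\/health$/,         'health'],
  ['GET',  /^\/api\/compile$/,        'compile'],
  ['GET',  /^\/api\/signals$/,        'listSignals'],
  ['POST', /^\/api\/inquiries$/,      'createInquiry'],
  ['GET',  /^\/api\/inquiries$/,      'listInquiries'],
  ['GET',  /^\/api\/inquiries\/\w+$/, 'getInquiry'],
  ['PUT',  /^\/api\/inquiries\/\w+$/, 'updateInquiry'],
  ['POST', /^\/api\/events$/,         'recordEvent'],
  ['GET',  /^\/api\/stats$/,          'stats'],
  ['POST', /^\/api\/proxy$/,          'proxy'],
];

function route(method, path) {
  const hit = ROUTES.find(([m, re]) => m === method && re.test(path));
  return hit ? hit[2] : 'notFound';
}
```
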
Progressive per-IP rate limits with escalating timeouts on violation:
- Inquiries: 5/min, 30/hr, 100/day
- Fetches: 60/min, 600/hr, 3000/day
- Events: 30/min, 500/hr, 5000/day
- Proxy: 10/min, 60/hr, 200/day
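A progressive limiter over the inquiry tier (5/min, 30/hr, 100/day) can be sketched with sliding windows. An in-memory `Map` stands in for Workers KV here, and the escalating-timeout behaviour on violation is omitted for brevity.

```javascript
// Sketch of a progressive per-IP limiter for the inquiry tier.
// In-memory Map stands in for Workers KV; escalating timeouts not modeled.
const WINDOWS = [
  { ms: 60_000,     max: 5 },   // per minute
  { ms: 3_600_000,  max: 30 },  // per hour
  { ms: 86_400_000, max: 100 }, // per day
];
const hits = new Map(); // ipHash -> array of request timestamps

function allowInquiry(ipHash, now = Date.now()) {
  const log = hits.get(ipHash) || [];
  // Drop timestamps older than the largest window (one day)
  const kept = log.filter(t => now - t < WINDOWS[WINDOWS.length - 1].ms);
  // The request is allowed only if every window still has headroom
  const ok = WINDOWS.every(w => kept.filter(t => now - t < w.ms).length < w.max);
  if (ok) kept.push(now);
  hits.set(ipHash, kept);
  return ok;
}
```
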
Every listing receives a composite deal score (0–100%) from six weighted components:
- Below Market (35%) — $/sqft vs area median benchmarks
- Price Drop (20%) — reduction magnitude + recency
- Days on Market (15%) — freshness and motivation signals
- Area Comps (15%) — comparable listing density
- Features (10%) — bed/bath utility score
- Data Freshness (5%) — source age and staleness
Scores are fully transparent — the listing detail modal breaks down each component with weight, percentage, and explanation.
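The composite is a straightforward weighted sum. The weights come from the list above; the assumption here is that each component function normalizes to a 0–1 value before weighting.

```javascript
// Sketch of the six-component composite deal score. Weights are from the
// list above; components are assumed pre-normalized to 0–1.
const WEIGHTS = {
  belowMarket:  0.35,
  priceDrop:    0.20,
  daysOnMarket: 0.15,
  areaComps:    0.15,
  features:     0.10,
  freshness:    0.05,
};

function dealScore(components) {
  // components: { belowMarket: 0–1, priceDrop: 0–1, ... }; missing keys count as 0
  let score = 0;
  for (const [key, weight] of Object.entries(WEIGHTS))
    score += weight * Math.min(1, Math.max(0, components[key] ?? 0));
  return Math.round(score * 100); // 0–100%
}
```

Because the weights sum to 1.0, a listing that maxes every component scores exactly 100, which keeps the per-component breakdown in the detail modal easy to explain.
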
- 4 tile providers: CartoDB Dark, OSM Street, Esri Satellite, CartoDB Voyager
- Interactive controls: zoom, layer switching, overlay toggles (marker + 200m radius)
- Touch support: pinch-to-zoom, single-finger drag
- Listing card thumbnails with GPS precision indicators
EVE is an embedded chatbot in the command center that provides dashboard guidance, data summaries, feature navigation, and pattern-matched Q&A on platform usage.
```shell
git clone https://github.com/GareBear99/RG-Command-Center.git
cd RG-Command-Center
```

Open `index.html` in any browser — the public site works immediately with the included dataset. No build step, no dependencies.
The Cloudflare Worker backend powers inquiry capture and automated signal compilation.
```shell
npm install -g wrangler
wrangler login

# Create KV namespace (first time only)
wrangler kv:namespace create "RAG_DATA"

# Update the namespace ID in wrangler.toml
wrangler deploy
```

After deploy, the worker runs at `https://rag-command-center-api.<your-subdomain>.workers.dev`. The hourly cron trigger activates automatically.
Push to main — GitHub Pages serves everything from the root:
https://<username>.github.io/RG-Command-Center/
```shell
python3 tools/populate_public_data.py --seed-mode off --no-existing-manual
python3 tools/audit_release_integrity.py
```

├── Public Pages (Canada-wide)
│ ├── index.html Homepage + lead capture
│ ├── deals.html Top deals by score
│ ├── directory.html Province-aware listing browser
│ ├── listing-detail.html Full detail + GPS map
│ ├── team.html Team profiles
│ ├── mortgage.html Canadian mortgage calculator
│ ├── buyer-resources.html Buyer's guide (BC focus)
│ ├── seller-resources.html Seller's guide (BC focus)
│ ├── blog.html Blog index
│ └── blog-post.html Blog post detail
│
├── Command Center (Victoria / Vancouver focus)
│ ├── command-center.html Dashboard + stats + call queue
│ ├── contacts.html CRM contacts
│ ├── pipeline.html Deal pipeline kanban
│ ├── signals.html AI signal paste + auto-compile
│ ├── commission.html Commission tracker
│ ├── analytics.html Market analytics
│ ├── email-templates.html Email template builder
│ ├── settings.html Config + data pipeline
│ ├── leads.html Lead operations
│ ├── listings.html Internal listing management
│ └── add.html Manual listing/lead entry
│
├── Backend
│ ├── worker.js Cloudflare Worker API
│ └── wrangler.toml Worker config + cron trigger
│
├── assets/js/
│ ├── utils.js Shared utilities
│ ├── public.js Public page renderer
│ ├── command.js Internal dashboard renderer
│ ├── autoleads.js Auto-lead engine
│ ├── eve.js EVE AI assistant
│ ├── resolver.js Source reconciliation engine
│ ├── compiler.js Release compiler
│ ├── auth.js SHA-256 authentication
│ ├── gps-fallback-map.js Tile map engine
│ └── settings.js Pipeline UI controls
│
├── data/
│ ├── bootstrap.js Compiled runtime data
│ ├── team.json Team profiles
│ ├── listings.json Listing dataset
│ ├── leads.json Lead dataset
│ ├── markets.json Market configuration
│ ├── blog/ Blog post JSON files
│ ├── raw/ Source intake files
│ ├── internal/ Reconciled pipeline state
│ └── public/ Released public artifacts
│
├── tools/
│ ├── populate_public_data.py Data pipeline runner
│ ├── validate_local_pack.py Import pack validator
│ ├── audit_release_integrity.py Release integrity checker
│ ├── import-source.html Browser-based import tool
│ └── examples/ Source data templates
│
├── manifest.json PWA manifest
├── sw.js Service worker (offline cache)
└── README.md
- Internal pages use SHA-256 hashed password authentication
- All user input sanitized against XSS via `escapeHtml` / `escapeAttr`
- Cloudflare Worker sanitizes and length-limits all input fields
- Progressive rate limiting with IP hashing (SHA-256 salted — no raw IPs stored)
- No API keys or secrets in the codebase
- Victoria, BC — primary licensed market (lead routing, signal scraping, auto-leads)
- Vancouver, BC — secondary licensed market
- Canada-wide — public listing search and browsing on the `.github.io` site
- MLS/IDX integration when credentials are available
- SMS/Twilio campaign management
- Neighbourhood-specific SEO landing pages
- CASL compliance for email/SMS opt-in tracking
- VPS upgrade path for backend (currently Cloudflare Workers free tier)
- Additional licensed markets as the team expands
- ⭐ Star this repo
- 🍴 Fork and contribute
- 📣 Share with real estate professionals
- 🐛 Report issues or suggest features
MIT — RAG Realty Group (Ricki Kohli, Amit Khatkar & Gary Doman)
Built by GareBear99 · RAG Realty Group
Victoria · Vancouver · Canada-wide