Analytix is a self-hosted website analytics and KPI dashboard platform built as a production-style full-stack portfolio project. It combines a Laravel API, React dashboard, PostgreSQL storage, Dockerized local development, and an embeddable tracking script.
Website
↓
tracker.js
↓
Laravel API
↓
PostgreSQL
↓
React Dashboard
Backend:
- Laravel 12
- Laravel Sanctum
- PostgreSQL
- Queue-ready event ingestion
- Reverb-ready broadcast event architecture
- REST API
Frontend:
- React
- Vite
- TypeScript
- React Router
- Zustand
- TanStack Query
- Axios
- Tailwind CSS
- Recharts
DevOps:
- Docker
- Docker Compose
analytix/
├── backend/
├── frontend/
├── docs/
├── docker-compose.yml
├── README.md
└── .gitignore
Start the full stack:

```shell
docker compose up --build
```

Services:
- Frontend: http://localhost:5174
- Backend API: http://localhost:8000
- PostgreSQL: localhost:5432
Seeded demo login:
Email: demo@analytix.dev
Password: password
The backend container runs:

```shell
composer install
php artisan migrate --seed
php artisan serve --host=0.0.0.0 --port=8000
```

The frontend container runs:

```shell
npm install
npm run dev -- --host 0.0.0.0
```

The frontend can also run in demo mode for Netlify, Vercel, or any static host. Demo mode is meant for a public portfolio link where recruiters can explore the product UI without requiring a hosted Laravel API, PostgreSQL database, or queue worker.
Enable it with:
VITE_DEMO_MODE=true
In demo mode:
- authentication uses a local demo user
- dashboard APIs return curated sample analytics
- charts, filters, pages, realtime, settings, dark mode, and responsive layout remain interactive
- no backend requests are required
- the Settings page explains that real tracker ingestion needs the self-hosted Docker setup
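Conceptually, demo mode is a gate in front of the dashboard's data layer. A minimal TypeScript sketch of that idea, using illustrative names (`fetchOverview`, `demoOverview`) rather than the repo's actual code:

```typescript
// Sketch of a demo-mode gate in the data layer. fetchOverview and
// demoOverview are hypothetical names; the real app would read
// import.meta.env.VITE_DEMO_MODE instead of a local constant.
interface OverviewStats {
  pageviews: number;
  visitors: number;
}

const DEMO_MODE: boolean = true; // stands in for VITE_DEMO_MODE === "true"

const demoOverview: OverviewStats = { pageviews: 1280, visitors: 342 };

async function fetchOverview(): Promise<OverviewStats> {
  if (DEMO_MODE) {
    // Curated sample analytics: no request ever leaves the browser.
    return demoOverview;
  }
  const res = await fetch("http://localhost:8000/api/dashboard/overview");
  return res.json() as Promise<OverviewStats>;
}
```

Because the gate sits at the data-fetching boundary, every chart and filter above it stays fully interactive without a backend.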
Recommended Vercel settings:
Root Directory: frontend
Build Command: npm run build
Output Directory: dist
Environment Variable: VITE_DEMO_MODE=true
Recommended Netlify settings:
Base directory: frontend
Build command: npm run build
Publish directory: dist
Environment variable: VITE_DEMO_MODE=true
The repository still contains the full self-hosted product. Anyone who wants to test real database writes, CORS behavior, and tracker ingestion can clone the repo and run Docker locally.
Add this snippet to any website you want to track:

```html
<script
  defer
  src="http://localhost:8000/tracker.js"
  data-site-id="my-website"
></script>
```

`data-site-id` is the public site key that keeps analytics separated per website. Use `demo-site` only for the seeded demo data; use a different value for a real website.
The tracker automatically captures:
- pageviews
- SPA navigation changes
- pathname
- referrer
- screen size
- browser
- device category
- visitor timezone
- timestamp
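SPA navigation capture is typically done by wrapping `history.pushState` so client-side route changes fire a pageview. A sketch of that technique (an assumption about the approach, not tracker.js's actual internals):

```typescript
// Sketch: detecting SPA navigations by wrapping a pushState-like function.
// wrapPushState and onNavigate are illustrative names.
type PushState = (data: unknown, unused: string, url?: string) => void;

function wrapPushState(
  orig: PushState,
  onNavigate: (url?: string) => void
): PushState {
  return (data, unused, url) => {
    orig(data, unused, url); // perform the real navigation first
    onNavigate(url);         // then report the new pseudo-pageview
  };
}
```

In a browser the wrapped function would be assigned back onto `history.pushState`, with `popstate` handled separately for back/forward navigation.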
It sends events to:
POST /api/track
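The event payload might look like the following sketch; the field names are assumptions for illustration, not the tracker's exact wire format:

```typescript
// Sketch of an analytics event like the one tracker.js could POST to
// /api/track. Field names are illustrative, not the exact schema.
interface TrackEvent {
  site_id: string;
  pathname: string;
  referrer: string;
  screen: string;    // "widthxheight", e.g. "1920x1080"
  timezone: string;  // IANA zone from the visitor's browser
  timestamp: string; // ISO 8601
}

// Pure builder: in the browser, pathname, referrer, width, and height
// would come from location, document.referrer, and window.screen.
function buildEvent(
  siteId: string,
  pathname: string,
  referrer: string,
  width: number,
  height: number
): TrackEvent {
  return {
    site_id: siteId,
    pathname,
    referrer,
    screen: `${width}x${height}`,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    timestamp: new Date().toISOString(),
  };
}
```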
Analytix uses two different CORS postures because the dashboard API and the tracking API have different jobs.
Dashboard and auth routes are restricted to the configured frontend origin:
FRONTEND_URL=http://localhost:5174
SANCTUM_STATEFUL_DOMAINS=localhost:5174,127.0.0.1:5174
That keeps authenticated endpoints such as /api/dashboard/overview, /api/websites, and /api/auth/me tied to the React dashboard.
Tracking is intentionally more open. The /api/track endpoint is designed to receive events from websites where the script is embedded, including local dev sites like http://localhost:3000 and public sites such as https://example.com.
For local development, this is allowed out of the box:
TRACKING_ALLOWED_ORIGINS=*
This only applies to POST /api/track and its preflight request. It does not open the authenticated dashboard API.
The tracking endpoint is safe to expose publicly, just as most analytics ingestion endpoints are:
- it does not require or send dashboard credentials
- it accepts a narrow, validated payload
- it is rate limited with `throttle:120,1`
- it stores analytics events, not private user account data
- malformed requests are rejected before ingestion
For a stricter production deployment, replace * with a comma-separated allowlist:
TRACKING_ALLOWED_ORIGINS=https://myshop.com,https://www.myshop.com
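The allowlist check itself is simple string matching. A TypeScript sketch of the logic (the real middleware lives in the Laravel backend; `originAllowed` is an illustrative name):

```typescript
// Sketch of TRACKING_ALLOWED_ORIGINS matching: "*" allows any origin,
// otherwise the request's Origin header must appear in the
// comma-separated allowlist.
function originAllowed(origin: string, allowedList: string): boolean {
  if (allowedList.trim() === "*") return true;
  return allowedList
    .split(",")
    .map((entry) => entry.trim())
    .includes(origin);
}
```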
Public:
- GET /api/health
- POST /api/auth/register
- POST /api/auth/login
- POST /api/track
Authenticated:
- GET /api/auth/me
- POST /api/auth/logout
- GET /api/websites
- GET /api/dashboard/overview
- GET /api/dashboard/pages
- GET /api/dashboard/realtime
The ingestion path is intentionally queue-ready:
1. POST /api/track validates the event payload.
2. IngestAnalyticsEvent is dispatched.
3. TrackingIngestionService resolves the website, session, event, and pageview records.
4. AnalyticsEventReceived is ready to broadcast events for a future Reverb-powered realtime UI.
Core analytics tables:
- websites
- sessions
- analytics_events
- pageviews
Local development uses the sync queue driver so tracking data appears immediately. Set QUEUE_CONNECTION=database and run php artisan queue:work when you want a separate worker process. Laravel session storage is configured as cookie-based for local development, so the analytics sessions table stays clean and domain-specific.
The dashboard app is organized around reusable product surfaces:
- components/
- layouts/
- pages/
- hooks/
- services/
- stores/
- lib/
- types/
State responsibilities:
- authStore: current user, login/register/logout, auth bootstrap
- uiStore: dark mode and responsive sidebar state
- filterStore: analytics date range filters
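As one concrete example, the date-range filter reduces to a small pure computation. A sketch of that shape (a plain function shown here for clarity; the real filterStore is a Zustand store, and `lastNDays` is a hypothetical helper):

```typescript
// Sketch of a filterStore-style date range and a helper that derives a
// "last N days" window. lastNDays is illustrative, not repo code.
interface DateRange {
  from: string; // YYYY-MM-DD
  to: string;   // YYYY-MM-DD
}

const MS_PER_DAY = 86_400_000;

function lastNDays(n: number, to: Date): DateRange {
  // Inclusive window: "last 7 days" spans today plus the 6 days before it.
  const fromDate = new Date(to.getTime() - (n - 1) * MS_PER_DAY);
  const fmt = (d: Date) => d.toISOString().slice(0, 10);
  return { from: fmt(fromDate), to: fmt(to) };
}
```

Keeping the range in the store as plain strings makes it trivial to serialize into dashboard query keys and API parameters.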
Data fetching is centralized through:
- src/lib/api.ts
- src/lib/queryClient.ts
- src/services/*
- src/hooks/useDashboardQueries.ts