FastAPI + Celery backend with Supabase, and React + Vite + Tailwind + Chart.js frontend.
- Backend (`backend/`):
  - FastAPI exposes:
    - `POST /analyze` – inserts a job in Supabase and enqueues a Celery task (sketched below this list).
    - `GET /results/{job_id}` – returns job status and, when ready, the JSON analysis from Supabase Storage.
  - Celery worker downloads the video via `yt-dlp`, detects scenes with `PySceneDetect`, computes metrics, uploads JSON to Supabase Storage, and updates job status.
  - Uses Redis for the Celery broker/backend.
- Frontend (`frontend/`):
  - Simple SPA with a URL input, progress display, and a chart of scene durations.
  - Calls the backend configured via `VITE_API_BASE_URL`.
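A minimal sketch of how `POST /analyze` can hand work to Celery. The `app.main`/`app.tasks` module layout matches the commands used elsewhere in this README, but the request model, task signature, and handler body are illustrative assumptions, not the exact implementation:

```python
# app/main.py (sketch) — accept a URL, create a job, enqueue the Celery task
from uuid import uuid4

from fastapi import FastAPI
from pydantic import BaseModel

from app.tasks import analyze_video  # Celery task registered in app.tasks

app = FastAPI()

class AnalyzeRequest(BaseModel):
    url: str  # YouTube/Reels URL

@app.post("/analyze")
def analyze(req: AnalyzeRequest):
    job_id = str(uuid4())
    # 1) insert a 'queued' row into the Supabase jobs table (see SQL below)
    # 2) enqueue the worker task that downloads and analyzes the video
    analyze_video.delay(job_id, req.url)
    return {"job_id": job_id, "status": "queued"}
```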
Prerequisites:

- Redis
- ffmpeg (the Dockerfile installs it; for local Windows, ensure ffmpeg is in PATH or set `FFMPEG_LOCATION`).
- Supabase project with:
  - Table `jobs` (SQL below)
  - Storage bucket (default: `cutiq-results`)
```sql
-- Supabase table (example)
create table if not exists public.jobs (
  id uuid primary key,
  url text not null,
  status text not null default 'queued',
  result_path text,
  num_cuts integer,
  average_duration double precision,
  error text,
  created_at timestamp with time zone default now(),
  updated_at timestamp with time zone default now()
);

-- Optional trigger to auto-update updated_at
create or replace function set_updated_at()
returns trigger as $$
begin
  new.updated_at = now();
  return new;
end;
$$ language plpgsql;

drop trigger if exists set_updated_at on public.jobs;
create trigger set_updated_at before update on public.jobs
for each row execute procedure set_updated_at();
```

Create a Storage bucket (e.g., `cutiq-results`) and allow authenticated uploads. The backend uses the Service Role key (server-side only).
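For reference, a minimal sketch of maintaining a row in this table with `supabase-py`. The column names follow the SQL above; the function names and the `'done'` terminal status are assumptions:

```python
# sketch: create and finish a jobs row via supabase-py (server-side, Service Role key)
import os

from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_ROLE_KEY"])

def create_job(job_id: str, url: str) -> None:
    # status defaults to 'queued' in the schema; set it explicitly for clarity
    supabase.table("jobs").insert({"id": job_id, "url": url, "status": "queued"}).execute()

def finish_job(job_id: str, num_cuts: int, average_duration: float, result_path: str) -> None:
    supabase.table("jobs").update({
        "status": "done",  # assumed terminal status value
        "num_cuts": num_cuts,
        "average_duration": average_duration,
        "result_path": result_path,
    }).eq("id", job_id).execute()
```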
Backend (local):

```bash
python -m venv .venv
. .venv/bin/activate  # Windows: .\.venv\Scripts\activate
pip install -r backend/requirements.txt

# Set environment variables (copy backend/env.example)
# Required: SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, REDIS_URL

# Run API
uvicorn app.main:app --app-dir backend --reload

# In another terminal, run Celery worker
celery -A app.celery_app.celery_app worker --loglevel=info --pool=solo
```

Docker (API):
```bash
docker build -t cutiq-backend -f backend/Dockerfile .
docker run --rm -p 8000:8000 --env-file backend/env.example cutiq-backend
```

Frontend:

```bash
cd frontend
npm install
# copy env.example -> .env (or set VITE_API_BASE_URL)
npm run dev
```

Deployment:

- Backend (Render/Railway):
  - Build: `pip install -r backend/requirements.txt`
  - Start command (web): `uvicorn app.main:app --host 0.0.0.0 --port 8000 --app-dir backend`
  - Start command (worker): `celery -A app.celery_app.celery_app worker --loglevel=info --workdir backend`
  - Set env vars: `SUPABASE_URL`, `SUPABASE_SERVICE_ROLE_KEY`, `REDIS_URL`, `SUPABASE_RESULTS_BUCKET`.
- Frontend (Vercel/Netlify):
  - Set `VITE_API_BASE_URL` to the deployed backend URL.
API:

- `POST /analyze`
  - Body: `{ "url": "<YouTube/Reels URL>" }`
  - Response: `{ "job_id": "<uuid>", "status": "queued" }`
- `GET /results/{job_id}`
  - Response: `{ "job_id": "<uuid>", "status": "...", "data": { ... } | null, "error": string | null }`
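A quick way to exercise both endpoints end to end; a minimal sketch using `requests`, where the terminal status values (`done`/`error`) are assumptions about the job lifecycle:

```python
# sketch: submit a URL, then poll /results/{job_id} until the job finishes
import time

import requests

API = "http://localhost:8000"  # or your deployed backend URL

job = requests.post(f"{API}/analyze", json={"url": "https://www.youtube.com/watch?v=..."}).json()
job_id = job["job_id"]

while True:
    result = requests.get(f"{API}/results/{job_id}").json()
    if result["status"] in ("done", "error"):  # assumed terminal statuses
        break
    time.sleep(2)  # poll every couple of seconds

print(result["data"] if result["status"] == "done" else result["error"])
```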
Running locally:

- Start backend (FastAPI):

  ```bash
  cd backend
  uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
  ```

- Start Celery worker:
  - On Linux/macOS:

    ```bash
    cd backend
    celery -A app.celery_app.celery_app worker --loglevel=info
    ```

  - On Windows (use the solo pool; prefork doesn't work on Windows):

    ```bash
    cd backend
    celery -A app.celery_app.celery_app worker --loglevel=info --pool=solo
    ```
- Important environment variable notes (edit `backend/.env`):
  - Set `SUPABASE_URL` and `SUPABASE_SERVICE_ROLE_KEY` (the service role key is required for server-side uploads).
  - `SUPABASE_RESULTS_BUCKET` default: `cutiq-results`.
  - `REDIS_URL` must point to the same Redis instance used by Celery.
  - `CELERY_RESULT_BACKEND` is optional; if omitted, the app falls back to `REDIS_URL`. If you set it, ensure it uses the same Redis instance and the correct protocol (`redis://` or `rediss://`), as sketched below.
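A sketch of that fallback. The module path `app/celery_app.py` matches the worker commands above, but the exact configuration code is an assumption:

```python
# app/celery_app.py (sketch) — broker/result-backend wiring with the REDIS_URL fallback
import os

from celery import Celery

broker_url = os.environ["REDIS_URL"]
# CELERY_RESULT_BACKEND is optional; it must point at the same Redis
# instance and use the correct protocol (redis:// or rediss://)
result_backend = os.getenv("CELERY_RESULT_BACKEND", broker_url)

celery_app = Celery(
    "cutiq",
    broker=broker_url,
    backend=result_backend,
    include=["app.tasks"],  # ensures tasks are imported and registered
)
```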
- Supabase Storage
  - Create a storage bucket (default `cutiq-results`) and ensure the service role can upload (see `backend/SETUP_DATABASE.md`).
  - The backend uploads the result JSON to `results/{job_id}.json` (upload sketch below).
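A minimal sketch of that upload with `supabase-py`; the `content-type` file option and the shape of `result` are assumptions:

```python
# sketch: upload the analysis JSON to Supabase Storage at results/{job_id}.json
import json
import os

from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_ROLE_KEY"])
bucket = os.getenv("SUPABASE_RESULTS_BUCKET", "cutiq-results")

def upload_result(job_id: str, result: dict) -> str:
    path = f"results/{job_id}.json"
    payload = json.dumps(result).encode("utf-8")
    supabase.storage.from_(bucket).upload(path, payload, {"content-type": "application/json"})
    return path  # stored on the jobs row as result_path
```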
- Windows-specific issues we handled
  - Temporary-file locking: tasks now explicitly close files and manually remove temp dirs (small sleep before cleanup) to avoid `[WinError 32]` when uploading/cleaning temp files (see the cleanup sketch below).
  - The Celery worker must be started with `--pool=solo` on Windows.
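A sketch of that cleanup pattern; the delay length and directory layout are assumptions:

```python
# sketch: Windows-safe temp-dir cleanup to avoid [WinError 32] (file in use)
import shutil
import tempfile
import time
from pathlib import Path

tmp_dir = Path(tempfile.mkdtemp(prefix="cutiq_"))
try:
    video_path = tmp_dir / "video.mp4"
    # ... download, analyze, and upload, closing every file handle explicitly ...
finally:
    time.sleep(0.5)  # small sleep so Windows releases any lingering locks
    shutil.rmtree(tmp_dir, ignore_errors=True)
```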
- Dependency & runtime notes
  - `yt-dlp` pinned to `yt-dlp==2024.9.27` to avoid websockets conflicts.
  - `opencv-python` added (required by `scenedetect`).
  - If you change task code, restart both the FastAPI server and the Celery worker so tasks are re-registered.
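For orientation, a sketch of the scene-detection step these dependencies support, using PySceneDetect's `detect` API. How `num_cuts` and `average_duration` are derived here, and the `scene_durations` field, are assumptions based on the `jobs` columns and the frontend chart:

```python
# sketch: detect scenes and derive the metrics stored on the jobs row
from scenedetect import ContentDetector, detect

def analyze_scenes(video_path: str) -> dict:
    scenes = detect(video_path, ContentDetector())  # list of (start, end) timecodes
    durations = [(end - start).get_seconds() for start, end in scenes]
    return {
        "num_cuts": max(len(scenes) - 1, 0),  # cuts = boundaries between scenes
        "average_duration": sum(durations) / len(durations) if durations else 0.0,
        "scene_durations": durations,  # assumed input for the frontend chart
    }
```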
- Debugging tips
  - Check FastAPI logs for incoming requests and Supabase calls.
  - Check Celery worker logs for the task lifecycle and Redis connection lines.
  - If tasks show `Received unregistered task`, ensure the worker process was started after code changes and imports `app.tasks`.