CutIQ

CutIQ analyzes any short video and visualizes its edit structure — cut timestamps, transition types, shot lengths, and a pacing heatmap — so creators and editors can replicate what works.

FastAPI + Celery backend with Supabase, plus a React + Vite + Tailwind + Chart.js frontend.

Architecture

  • Backend (backend/):
    • FastAPI exposes:
      • POST /analyze – inserts a job in Supabase and enqueues a Celery task.
      • GET /results/{job_id} – returns job status and, when ready, JSON analysis from Supabase Storage.
    • Celery worker downloads the video via yt-dlp, detects scenes with PySceneDetect, computes metrics, uploads JSON to Supabase Storage, and updates job status (a sketch of this pipeline follows the list).
    • Uses Redis for Celery broker/backend.
  • Frontend (frontend/):
    • Simple SPA with a URL input, progress display, and chart of scene durations.
    • Calls the backend at the base URL configured via VITE_API_BASE_URL.
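
A minimal sketch of the worker pipeline, assuming hypothetical names (the real task lives in app.tasks and handles more detail, such as yt-dlp format selection and error states):

# illustrative sketch only; not the repo's actual task code
import subprocess, tempfile
from scenedetect import detect, ContentDetector

def analyze(url: str) -> dict:
    with tempfile.TemporaryDirectory() as tmp:
        out = f"{tmp}/video.mp4"
        # Download the clip with yt-dlp (format/merge handling elided)
        subprocess.run(["yt-dlp", "-o", out, url], check=True)
        # Detect cuts with PySceneDetect's content detector
        scenes = detect(out, ContentDetector())
        durations = [(end - start).get_seconds() for start, end in scenes]
        return {
            "num_cuts": max(len(scenes) - 1, 0),  # one plausible definition
            "average_duration": sum(durations) / len(durations) if durations else 0.0,
            "scene_durations": durations,
        }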

Prerequisites

  • Redis
  • ffmpeg (the Dockerfile installs it; for local Windows runs, ensure ffmpeg is on PATH or set FFMPEG_LOCATION).
  • Supabase project with:
    • Table jobs (SQL below)
    • Storage bucket (default: cutiq-results)
-- Supabase table (example)
create table if not exists public.jobs (
  id uuid primary key,
  url text not null,
  status text not null default 'queued',
  result_path text,
  num_cuts integer,
  average_duration double precision,
  error text,
  created_at timestamp with time zone default now(),
  updated_at timestamp with time zone default now()
);

-- Optional trigger to auto-update updated_at
create or replace function set_updated_at()
returns trigger as $$
begin
  new.updated_at = now();
  return new;
end;
$$ language plpgsql;

drop trigger if exists set_updated_at on public.jobs;
create trigger set_updated_at before update on public.jobs
for each row execute procedure set_updated_at();

Create a Storage bucket (e.g., cutiq-results) and allow authenticated uploads. The backend uses the Service Role key (server-side only).
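
A minimal upload sketch with supabase-py, assuming the defaults described in this README (bucket cutiq-results, result path results/{job_id}.json):

import json, os
from supabase import create_client

# Service Role key: server-side only, never shipped to the frontend
supabase = create_client(os.environ["SUPABASE_URL"],
                         os.environ["SUPABASE_SERVICE_ROLE_KEY"])
bucket = os.environ.get("SUPABASE_RESULTS_BUCKET", "cutiq-results")

def upload_result(job_id: str, data: dict) -> str:
    path = f"results/{job_id}.json"
    supabase.storage.from_(bucket).upload(
        path,
        json.dumps(data).encode("utf-8"),
        {"content-type": "application/json"},
    )
    return path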

Backend: Run locally

python -m venv .venv
. .venv/bin/activate  # Windows: .\.venv\Scripts\activate
pip install -r backend/requirements.txt

# Set environment variables (copy backend/env.example)
# Required: SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, REDIS_URL

# Run API
uvicorn app.main:app --app-dir backend --reload

# In another terminal, run Celery worker
celery -A app.celery_app.celery_app worker --loglevel=info --pool=solo
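
For reference, a hypothetical backend/.env using only the variables named in this README (all values are placeholders):

SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
SUPABASE_RESULTS_BUCKET=cutiq-results
REDIS_URL=redis://localhost:6379/0
# Optional; the app falls back to REDIS_URL if unset
CELERY_RESULT_BACKEND=redis://localhost:6379/0
# Windows only, if ffmpeg is not on PATH
FFMPEG_LOCATION=C:\ffmpeg\bin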

Docker (API):

docker build -t cutiq-backend -f backend/Dockerfile .
docker run --rm -p 8000:8000 --env-file backend/env.example cutiq-backend

Frontend: Run locally

cd frontend
npm install
# copy env.example -> .env (or set VITE_API_BASE_URL)
npm run dev

Deployment Notes

  • Backend (Render/Railway):
    • Build: pip install -r backend/requirements.txt
    • Start command (web): uvicorn app.main:app --host 0.0.0.0 --port 8000 --app-dir backend
    • Start command (worker): celery -A app.celery_app.celery_app worker --loglevel=info --workdir backend
    • Set env vars: SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, REDIS_URL, SUPABASE_RESULTS_BUCKET.
  • Frontend (Vercel/Netlify):
    • Set VITE_API_BASE_URL to the deployed backend URL.

API

  • POST /analyze

    • Body: { "url": "<YouTube/Reels URL>" }
    • Response: { "job_id": "<uuid>", "status": "queued" }
  • GET /results/{job_id}

    • Response: { "job_id": "<uuid>", "status": "...", "data": { ... } | null, "error": string | null }
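
For example, a small Python client exercising both endpoints (the base URL and the terminal status names are assumptions for a local run):

import time
import requests

BASE = "http://localhost:8000"  # assumed local API address

job = requests.post(f"{BASE}/analyze",
                    json={"url": "https://www.youtube.com/watch?v=..."}).json()
print(job)  # e.g. {"job_id": "...", "status": "queued"}

while True:
    res = requests.get(f"{BASE}/results/{job['job_id']}").json()
    if res["status"] in ("done", "error"):  # status values are assumptions
        break
    time.sleep(2)

print(res["data"] if res["data"] else res["error"])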

Notes & Troubleshooting

  • Start backend (FastAPI):
      cd backend
      uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
  • Start Celery worker:
    • On Linux/macOS:
      cd backend
      celery -A app.celery_app.celery_app worker --loglevel=info
    • On Windows (use solo pool; prefork doesn't work on Windows):
      cd backend
      celery -A app.celery_app.celery_app worker --loglevel=info --pool=solo
  • Important environment variable notes (edit backend/.env):

    • Set SUPABASE_URL and SUPABASE_SERVICE_ROLE_KEY (service role key required for server uploads).
    • SUPABASE_RESULTS_BUCKET default: cutiq-results.
    • REDIS_URL must point to the same Redis instance used by Celery.
    • CELERY_RESULT_BACKEND is optional; if omitted the app will use REDIS_URL. If you set it, ensure it uses the same Redis instance and correct protocol (redis:// or rediss://).
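    That fallback behaves like this minimal sketch (the actual app.celery_app module may differ):
      import os
      from celery import Celery

      redis_url = os.environ["REDIS_URL"]
      celery_app = Celery(
          "cutiq",
          broker=redis_url,
          # Use the broker's Redis when CELERY_RESULT_BACKEND is unset
          backend=os.environ.get("CELERY_RESULT_BACKEND", redis_url),
          # Registers app.tasks so the worker knows the tasks (see Debugging tips)
          include=["app.tasks"],
      )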
  • Supabase Storage

    • Create a storage bucket (default cutiq-results) and ensure the service role can upload (see backend/SETUP_DATABASE.md).
    • The backend uploads result JSON to results/{job_id}.json.
  • Windows-specific issues we handled

    • Temporary-file locking: tasks now explicitly close file handles and remove temp directories manually, with a small sleep before cleanup, to avoid [WinError 32] when uploading or cleaning temporary files (see the sketch after this item).
    • Celery worker must be started with --pool=solo on Windows.
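    The cleanup workaround looks roughly like this (sketch only; the real task code differs):
      import shutil, tempfile, time

      tmp_dir = tempfile.mkdtemp()
      try:
          ...  # download, analyze, upload; close every file handle explicitly
      finally:
          time.sleep(0.5)  # give Windows a moment to release file locks
          shutil.rmtree(tmp_dir, ignore_errors=True)  # avoids [WinError 32]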
  • Dependency & runtime notes

    • yt-dlp pinned to yt-dlp==2024.9.27 to avoid websockets conflicts.
    • opencv-python added (required by scenedetect).
    • If you change task code, restart both the FastAPI server and the Celery worker so tasks are re-registered.
  • Debugging tips

    • Check FastAPI logs for incoming requests and Supabase calls.
    • Check Celery worker logs for task lifecycle and Redis connection lines.
    • If tasks show Received unregistered task, restart the worker after code changes and make sure it imports app.tasks.
