CutRoom automates the video production workflow that typically requires a team of scriptwriters, designers, video editors, and sound engineers. One person can produce professional marketing videos end-to-end using AI.
Brief → Script → Shot Planning → Image/Video Generation → Review → Voiceover → Music → Montage → 4K Render
- 🎬 Script Generation — AI writes video scripts from a brief
- 🎨 Shot Planning — Automatic shot breakdown with scene descriptions
- 🖼 AI Image/Video Gen — fal.ai, Replicate, OpenRouter integration
- 👁 Director Review — Human-in-the-loop approval for each shot
- 🎙 Voiceover — ElevenLabs TTS with script-to-speech pipeline
- 🎵 Music — LLM-generated prompts for Suno + manual upload
- 🎞 Auto-Montage — Semantic anchor-first timeline assembly with weak-match review
- 📐 4K Render — Remotion-powered deterministic video rendering
- 🔄 LLM Refinement — Refine montage plan with natural language feedback
| Layer | Technology |
|---|---|
| Frontend | React 19 + TypeScript + Zustand + Tailwind CSS + Vite |
| Backend | Express 5 + file-based storage |
| AI/ML | OpenRouter, fal.ai, Replicate, ElevenLabs |
| Video | Remotion + ffmpeg (normalize, Ken Burns, encode) |
| Testing | Vitest (unit) + Playwright (E2E) |
```bash
npm install
npm run dev:all
```

- Frontend: http://localhost:5173
- API: http://localhost:3001
The repository now includes a single-tenant self-hosted profile:
- docs/self-hosted.md
- Dockerfile
- docker-compose.self-hosted.yml
- .env.self-hosted.example
Quick start:
```bash
cp .env.self-hosted.example .env.self-hosted
docker compose -f docker-compose.self-hosted.yml up -d --build
```

This profile runs:

- one `app` container serving both the frontend and `/api`
- one `worker` container for background jobs
- one `postgres` container
PostgreSQL support is scaffolded for future backend work without changing the current file-based routes yet.
```powershell
$env:DATABASE_URL = "postgres://postgres:postgres@localhost:5432/cut_room"
npm run db:migrate
npm run db:check
```

- `server/db/index.ts` creates a shared `pg` pool and exposes a simple healthcheck.
- `server/db/migrations/0001_initial.sql` is the first tracked migration.
- `npm run db:migrate` uses a PostgreSQL advisory lock so concurrent migration processes cannot both execute the same migration body.
- `npm run db:check` now behaves like a real verification command: it fails if `DATABASE_URL` is missing or if tracked migrations are still pending.
- `tests/integration/setup.ts` stays as the shared helper module for app/test bootstrap.
- `tests/integration/setup.test.ts` smoke-tests `createDb(...)` with an explicit connection string and closes the pool without requiring a live PostgreSQL server.
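The advisory-lock behavior described above can be sketched roughly as follows. This is a hypothetical runner, not the project's actual code: the `DbClient` interface, the `MIGRATION_LOCK_KEY` constant, and the `pendingMigrations` helper are all illustrative assumptions modeled on a `pg`-compatible client.

```typescript
// Minimal client shape compatible with pg's PoolClient (assumed for illustration).
interface DbClient {
  query(sql: string, params?: unknown[]): Promise<{ rows: any[] }>;
}

// Pure helper: which tracked migrations have not been applied yet?
// A db:check-style command can fail whenever this list is non-empty.
function pendingMigrations(tracked: string[], applied: string[]): string[] {
  const done = new Set(applied);
  return tracked.filter((name) => !done.has(name));
}

// Hypothetical lock key; any constant shared by all migration processes works.
const MIGRATION_LOCK_KEY = 727001;

async function migrate(client: DbClient, migrations: Map<string, string>): Promise<void> {
  // pg_advisory_lock blocks until the lock is free, so two concurrent
  // migration runs serialize instead of executing the same migration body twice.
  await client.query("SELECT pg_advisory_lock($1)", [MIGRATION_LOCK_KEY]);
  try {
    await client.query(
      "CREATE TABLE IF NOT EXISTS schema_migrations (name text PRIMARY KEY)"
    );
    const { rows } = await client.query("SELECT name FROM schema_migrations");
    const applied = rows.map((r) => r.name as string);
    for (const name of pendingMigrations([...migrations.keys()], applied)) {
      await client.query("BEGIN");
      await client.query(migrations.get(name)!);
      await client.query("INSERT INTO schema_migrations (name) VALUES ($1)", [name]);
      await client.query("COMMIT");
    }
  } finally {
    await client.query("SELECT pg_advisory_unlock($1)", [MIGRATION_LOCK_KEY]);
  }
}
```

The lock-then-check pattern matters: reading `schema_migrations` only after the lock is held is what prevents two processes from both seeing a migration as pending.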
```
server/
├── routes/montage.ts     # Montage pipeline endpoints
├── lib/storage.ts        # Project types & file-based storage
├── lib/montage-plan.ts   # Heuristic plan generation
├── lib/normalize.ts      # ffmpeg clip normalization
├── lib/config.ts         # Global settings
└── lib/openrouter.ts     # LLM integration

src/
├── components/           # React UI components
├── lib/api.ts            # API client
├── stores/               # Zustand state
└── types/                # TypeScript interfaces
```
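As a rough illustration of the kind of work `lib/normalize.ts` does, clip normalization can be expressed as a pure ffmpeg argument builder. The function name, option shape, and the specific scale/pad/fps/codec flags below are assumptions for the sketch, not the project's actual implementation:

```typescript
// Illustrative only: the real flags live in server/lib/normalize.ts.
interface NormalizeOpts {
  width: number;
  height: number;
  fps: number;
}

function normalizeArgs(input: string, output: string, opts: NormalizeOpts): string[] {
  return [
    "-i", input,
    // Fit into the target frame, pad to the exact size, and force a constant
    // frame rate so every clip entering the montage has identical geometry.
    "-vf",
    `scale=${opts.width}:${opts.height}:force_original_aspect_ratio=decrease,` +
      `pad=${opts.width}:${opts.height}:(ow-iw)/2:(oh-ih)/2,fps=${opts.fps}`,
    "-pix_fmt", "yuv420p",
    "-c:v", "libx264",
    output,
  ];
}
```

Keeping the argument list pure makes it trivial to unit-test without spawning ffmpeg.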
The montage pipeline now supports a video-description-first semantic planning pass:
voiceoverScript -> narration anchors -> video descriptions -> anchor matches -> draft montage plan
What this adds:
- approved shot videos can be described before planning
- narrator text can be split into ordered visual anchors
- anchors are matched against described videos with `matched` / `weak_match` / `unmatched` statuses
- weak matches can be reviewed and overridden in the montage UI before draft generation
- OpenReel handoff now preserves semantic metadata and draft trims for future editor-side tooling
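The `matched` / `weak_match` / `unmatched` statuses above can be sketched with a naive token-overlap matcher. The type names, scoring, and thresholds here are illustrative assumptions; the actual pipeline derives matches from its semantic descriptions, not from this heuristic:

```typescript
type MatchStatus = "matched" | "weak_match" | "unmatched";

interface Anchor { id: string; text: string; }
interface VideoDescription { shotId: string; description: string; }
interface AnchorMatch { anchorId: string; shotId?: string; status: MatchStatus; score: number; }

// Lowercased word tokens; Cyrillic included since UI text may be Russian.
const tokenize = (s: string): Set<string> =>
  new Set(s.toLowerCase().match(/[a-zа-я0-9]+/g) ?? []);

// Fraction of anchor tokens that appear in the candidate description.
function overlap(anchor: Set<string>, video: Set<string>): number {
  let hits = 0;
  for (const t of anchor) if (video.has(t)) hits++;
  return anchor.size ? hits / anchor.size : 0;
}

function matchAnchors(anchors: Anchor[], videos: VideoDescription[]): AnchorMatch[] {
  return anchors.map((anchor) => {
    const anchorTokens = tokenize(anchor.text);
    let best: { shotId?: string; score: number } = { score: 0 };
    for (const v of videos) {
      const score = overlap(anchorTokens, tokenize(v.description));
      if (score > best.score) best = { shotId: v.shotId, score };
    }
    // Thresholds are arbitrary for this sketch; weak matches go to review.
    const status: MatchStatus =
      best.score >= 0.5 ? "matched" : best.score > 0 ? "weak_match" : "unmatched";
    return { anchorId: anchor.id, shotId: best.shotId, status, score: best.score };
  });
}
```

Whatever the scoring backend, the three-way status split is what drives the review flow: only `weak_match` anchors need a human decision before the draft plan is generated.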
Operator guidance:
- use "Описать видео" (Describe videos) before "Извлечь якоря" (Extract anchors), so matching has stronger visual evidence
- review "Требует проверки" (Needs review) / "Нет совпадения" (No match) anchors in the montage plan step
- save manual shot overrides before generating the draft plan when semantic confidence is low
AGPL-3.0 License — see LICENSE for details.