# OpenCut

Open-source, AI-powered video production engine built on Remotion.
Define a timeline in TypeScript, point it at your footage and transcript, render a polished MP4.
Built for product demos, LinkedIn content, YouTube videos, and social media clips.
- What is OpenCut?
- Quick Start
- Why OpenCut?
- Installation
- Workflow
- CLI Commands
- Features
- Engine Components
- Animation Utilities
- Music Sync
- Background Effects
- Plugin API
- API Documentation
- Project Structure
- Creating a New Video
- Server Rendering
- Examples
- Contributing
- Changelog
- License
## What is OpenCut?

OpenCut is a programmatic video production engine that turns raw footage, screen recordings, and transcripts into professional MP4 videos using code. It combines:
- AI transcription via OpenAI Whisper for word-level captions and automated subtitles
- React + Remotion for frame-accurate, programmatic video rendering
- TypeScript-driven timelines for declarative, version-controlled video editing
- A plugin system for custom segments, background effects, and timeline transforms
Unlike traditional video editors that lock you into GUIs and proprietary formats, OpenCut treats video as code — fully version-controlled, reproducible, and automatable.

OpenCut is built for:
- Founders & marketers creating product demo videos at scale
- Content creators batch-producing LinkedIn and YouTube videos
- Educators building automated course content with subtitles
- Developers who want programmatic video generation in their apps
- Agencies rendering client videos from data-driven templates
## Quick Start

Get a rendered MP4 in under 5 minutes:

```sh
# 1. Clone and install
git clone https://github.com/floomhq/opencut.git
cd opencut
npm install

# 2. Scaffold a new video project
npx opencut-init my-video

# 3. Drop your facecam.mp4 into public/

# 4. Edit the timeline
#    src/examples/my-video/timeline.ts

# 5. Validate assets and timings
npx opencut-validate src/examples/my-video/timeline.ts

# 6. Render your video
npx opencut-render my-video
```

Find your finished video at `out/my-video.mp4`.
For a detailed walkthrough, see QUICKSTART.md.
## Why OpenCut?

| Problem | Traditional Editors | OpenCut |
|---|---|---|
| Batch rendering | Manual export per video | `npx opencut-render project-name` |
| Subtitles | Manual typing or expensive services | Whisper AI → auto-generated word-level captions |
| Version control | Binary project files | Plain-text TypeScript — diffable, reviewable |
| Automation | No API | Full CLI + programmable timeline |
| Custom effects | Limited presets | React components + plugin system |
| Reproducibility | "It works on my machine" | Deterministic rendering from code |
## Installation

- Node.js 20+
- npm or pnpm
- (Optional) Whisper for AI transcription
- (Optional) Chrome/Chromium for Remotion rendering
From source:

```sh
git clone https://github.com/floomhq/opencut.git
cd opencut
npm install
```

Or install globally:

```sh
npm install -g opencut
```

Or install locally in your project:

```sh
npm install opencut
```

## Workflow

```
Record ──► Transcribe ──► Configure ──► Validate ──► Render
  │            │              │             │           │
  │            │              │             │           ▼
facecam    Whisper CLI    timeline.ts   validate.ts  out/*.mp4
+ screen   --word_        + config.ts   checks assets
           timestamps
```
- Record raw facecam footage, screen recordings, and screenshots.
- Transcribe with Whisper (`--word_timestamps True`) for word-level captions.
- Configure your timeline and video settings in TypeScript.
- Validate that all assets exist and timings line up.
- Render with Remotion.
## CLI Commands

| Command | Description |
|---|---|
| `npx opencut-init <name>` | Scaffold a new project with config, timeline, subtitles, Root.tsx, and index.ts |
| `npx opencut-transcribe <video>` | Run Whisper AI transcription or generate a subtitle template |
| `npx opencut-validate <timeline.ts>` | Check asset paths, segment durations, and timeline contiguity |
| `npx opencut-render <name>` | Render the full video for a project |
| `npx opencut-render <name> --preview` | Open the Remotion Studio preview UI |
| `npx opencut-render <name> --watch` | Watch files and re-render automatically on changes |
| `npx opencut-render <name> --frames 0-149` | Render a specific frame range |
| `npm run typecheck` | Run TypeScript type checking across the entire codebase |
| `npm run test` | Run the full test suite (125 tests) |
| `npm run test:coverage` | Run tests with code coverage reporting |
| `npm run build` | Compile TypeScript to JavaScript with declaration files |
| `npx remotion studio src/examples/<name>/index.ts` | Open the Remotion preview UI in the browser |
## Features

- Plugin system — Register custom segment renderers, background effects, and timeline transforms. See Plugin API.
- Watch mode — `opencut-render --watch` re-renders automatically when you save timeline.ts, config.ts, or Root.tsx.
- 125 tests — Full test coverage, including integration tests that render actual frames via Remotion.
- Generative background effects — Layer animated orbs, particles, grid, waves, dots, or vignette behind any segment without extra assets.
- TypingText component — Character-by-character typewriter animation with blinking cursor, configurable speed, and accent-color glow.
- CrossfadeScene wrapper — Reusable asymmetric fade-in / fade-out scene transition with cubic easing.
- Music sync / beat alignment — Lock scene transitions to musical tempo with `beatsToFrames`, `barsToFrames`, and synced volume curves.
- Animation utilities — Pure helper library for particles, wave motion, breathing effects, stagger reveals, lerp colors, Ken Burns presets, and audio-reactivity simulation.
- AI-powered transcription — Word-level captions and automated subtitles with active-word highlighting, powered by OpenAI Whisper
- Social media ready — Render horizontal (1920×1080), vertical (1080×1920), square, and 4:5 MP4s optimized for LinkedIn, YouTube, Instagram, and TikTok
- Timeline-driven editing — Declarative TypeScript timeline segments; no drag-and-drop, no UI bloat
- Face-cam picture-in-picture — Circular face bubble with configurable position, size, and filters
- Keyword overlays — Large animated text callouts to emphasize key moments in your video
- Notification banners — macOS-style slide-in popups (WhatsApp, iMessage, generic)
- Title and end cards — Full-screen branded cards with customizable CTAs
- Background effects — Orbs, particles, grid, waves, dots, vignette — no extra assets needed
- Music beat synchronization — Sync scene cuts and volume to musical tempo
- Open source — MIT license, fully hackable TypeScript/React codebase
## Engine Components

| Component | Description |
|---|---|
| `VideoComposition` | Top-level component. Takes timeline, config, and subtitles; sequences everything into the final video. |
| `Segment` | Renders one timeline segment: background (facecam/screenshot/video/slides) + overlays. |
| `FaceBubble` | Circular face-cam picture-in-picture. Configurable position, size, and filters. |
| `SubtitleOverlay` | Groups words into 3-7 word phrases and highlights the active word. AI-transcribed captions from Whisper. |
| `KeywordOverlay` | Large uppercase text with scale-in animation, positioned at the top. |
| `TitleCard` | Full-screen title overlay with configurable text and colors. |
| `EndCard` | Full-screen end card with CTA button and URL pill. |
| `NotificationBanner` | macOS-style slide-in notifications. Presets: WhatsApp (green), iMessage (blue), generic (gray). |
| `BackgroundEffects` | Generative SVG backgrounds: orbs, particles, grid, waves, dots, vignette. |
| `TypingText` | Typewriter animation with optional blinking cursor, speed control, and accent glow. |
| `CrossfadeScene` | Wraps a scene in `<AbsoluteFill>` with asymmetric fade-in / fade-out opacity. |
| `AnimatedDiagram` | Step-by-step animated diagrams with reveal timing. |
| `ComparisonTable` | Side-by-side feature comparison with animated rows. |
| `AppMockup` | Device frame mockups for app demos. |
| `AudioWaveform` | Real-time audio waveform visualization. |
| `VideoBackground` | Full-motion video backgrounds. |

All style props (`SubtitleStyle`, `KeywordStyle`, `CardStyle`, `EndCardStyle`) are optional overrides on `VideoComposition`.
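The `SubtitleOverlay` behavior described above (grouping words into 3-7 word phrases and highlighting the active word) can be sketched from plain word timestamps. The `Word` shape and both helpers below are hypothetical illustrations, not OpenCut's actual API:

```typescript
// Hypothetical sketch of word-level caption grouping; not OpenCut's real API.
interface Word {
  text: string;
  startSec: number;
  endSec: number;
}

// Group a flat word list into phrases of at most `maxWords` words,
// breaking early when a gap between words suggests a pause.
function groupWords(words: Word[], maxWords = 7, gapSec = 0.6): Word[][] {
  const phrases: Word[][] = [];
  let current: Word[] = [];
  for (const w of words) {
    const prev = current[current.length - 1];
    if (current.length >= maxWords || (prev && w.startSec - prev.endSec > gapSec)) {
      phrases.push(current);
      current = [];
    }
    current.push(w);
  }
  if (current.length > 0) phrases.push(current);
  return phrases;
}

// Index of the word to highlight at a given frame, or -1 if none is active.
function activeWordIndex(phrase: Word[], frame: number, fps: number): number {
  const t = frame / fps;
  return phrase.findIndex((w) => t >= w.startSec && t < w.endSec);
}
```

A long pause between words starts a new phrase even when the word cap has not been reached, which keeps captions aligned with natural speech pauses.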
## Animation Utilities

Import pure helpers from `src/engine/animation`:

- `generateParticles(count, seed)` — seeded, deterministic particle array
- `updateParticlePosition(particle, frame, bounds)` — frame-based position with wrapping
- `waveMotion(frame, frequency, amplitude, phase)` — sine-wave motion
- `breathe(frame, minScale, maxScale, speed)` — pulsing scale between two values
- `glitchOffset(frame, intensity)` — randomized offset for glitch effects
- `lerpColor(color1, color2, t)` — linear interpolation between hex colors
- `stagger(index, totalItems, totalDurationSeconds, frame, fps)` — 0-1 reveal progress for list items
- `typewriter(text, frame, fps, charsPerSecond)` — substring for frame-accurate typing
- `animatedCounter(from, to, progress, decimals)` — number interpolation with `easeOutExpo`
- `getKenBurnsTransform(preset, progress)` — cinematic scale/translate for background footage
- `simulateAudioReactivity(frame, intensity)` — frame-pulsed 0-1 value for bass-hit simulation
- `springPresets` — pre-configured spring physics presets for natural motion
- `easing` — curated easing functions: `easeInOutCubic`, `easeOutExpo`, `easeInBack`, etc.
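To illustrate the frame-driven, pure-function style these helpers share, here are toy reimplementations of `typewriter` and `lerpColor`; the real `src/engine/animation` code may differ:

```typescript
// Toy reimplementations for illustration; the real helpers may differ.

// Frame-accurate typewriter: the visible substring at a given frame.
function typewriter(
  text: string,
  frame: number,
  fps: number,
  charsPerSecond: number
): string {
  const chars = Math.floor((frame / fps) * charsPerSecond);
  return text.slice(0, Math.max(0, Math.min(text.length, chars)));
}

// Linear interpolation between two "#rrggbb" colors, t in [0, 1].
function lerpColor(color1: string, color2: string, t: number): string {
  const a = parseInt(color1.slice(1), 16);
  const b = parseInt(color2.slice(1), 16);
  const channel = (shift: number) => {
    const x = (a >> shift) & 0xff;
    const y = (b >> shift) & 0xff;
    return Math.round(x + (y - x) * t);
  };
  const hex = (channel(16) << 16) | (channel(8) << 8) | channel(0);
  return `#${hex.toString(16).padStart(6, "0")}`;
}
```

Because both functions are pure in `frame`, Remotion can render frames in any order (or in parallel) and always get the same pixels, which is what makes the deterministic rendering claim hold.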
## Music Sync

Import beat-aware helpers from `src/engine/MusicSync`:

- `getGridBPM(fps, framesPerBeat)` — derive target BPM from grid settings
- `getBeatSyncPlaybackRate(sourceBPM, targetBPM)` — playback rate to lock music to tempo
- `beatsToFrames(beats, bpm, fps)` — exact frame count for N beats
- `barsToFrames(bars, bpm, fps)` — exact frame count for N bars (4/4)
- `buildSceneStarts(durations, overlapFrames)` — compute crossfade start frames
- `getSyncedMusicVolume(frame, totalFrames, baseVolume, fadeInFrames, fadeOutFrames)` — volume curve that fades in at the start and out at the end
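The underlying beat math reduces to frames = beats × (60 / BPM) × fps. A sketch that mirrors the helper names above, though it is my own arithmetic and not the engine source:

```typescript
// Beat/bar-to-frame arithmetic; illustrative, not the engine source.

// Frame count for N beats: each beat lasts 60/bpm seconds.
function beatsToFrames(beats: number, bpm: number, fps: number): number {
  return Math.round(beats * (60 / bpm) * fps);
}

// Frame count for N bars, assuming 4/4 time (four beats per bar).
function barsToFrames(bars: number, bpm: number, fps: number): number {
  return beatsToFrames(bars * 4, bpm, fps);
}

// Playback-rate multiplier that retimes a source track to a target tempo.
function getBeatSyncPlaybackRate(sourceBPM: number, targetBPM: number): number {
  return targetBPM / sourceBPM;
}
```

For example, at 120 BPM and 30 fps one beat is 15 frames and one 4/4 bar is 60 frames, so an 8-bar intro lasts exactly 480 frames (16 seconds); cutting scenes on those boundaries is what keeps transitions locked to the music.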
## Background Effects

Add a `backgroundEffect` field to any `TimelineSegment`:

```ts
backgroundEffect: {
  type: "orbs", // or "particles" | "grid" | "waves" | "dots" | "vignette"
  accentColor: "#3b82f6",
  intensity: 0.5,
  orbCount: 2,       // only for "orbs"
  particleCount: 40, // only for "particles"
}
```

Effects are rendered as SVG or CSS layers behind the segment content and animate automatically based on the current frame.
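Since some options only apply to certain effect types (`orbCount` to orbs, `particleCount` to particles), the config is naturally modeled as a discriminated union. The typing below is a hypothetical sketch, not the engine's real `BackgroundEffect` type:

```typescript
// Hypothetical discriminated-union typing for backgroundEffect configs;
// the engine's actual BackgroundEffect type may differ.
type BackgroundEffect =
  | { type: "orbs"; accentColor: string; intensity: number; orbCount?: number }
  | { type: "particles"; accentColor: string; intensity: number; particleCount?: number }
  | { type: "grid" | "waves" | "dots" | "vignette"; accentColor: string; intensity: number };

// Narrowing on `type` lets a renderer read only the options that exist
// for that variant; the fallback counts here are invented for the example.
function describeEffect(effect: BackgroundEffect): string {
  switch (effect.type) {
    case "orbs":
      return `orbs x${effect.orbCount ?? 3}`;
    case "particles":
      return `particles x${effect.particleCount ?? 40}`;
    default:
      return effect.type;
  }
}
```

With this shape, passing `orbCount` on a `"grid"` effect becomes a compile-time error rather than a silently ignored field.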
## Plugin API

OpenCut supports plugins for custom segment types, background effects, and timeline transformations.

```ts
import { registerPlugin } from "./engine";

registerPlugin({
  name: "my-plugin",
  segmentRenderers: {
    "custom-scene": CustomSceneComponent,
  },
  backgroundEffectRenderers: {
    "stars": StarsEffectComponent,
  },
  transformTimeline: (timeline) => {
    // Modify or augment timeline before rendering
    return timeline;
  },
});
```

| Hook | Description |
|---|---|
| `segmentRenderers` | Map segment type strings to React components, rendered instead of built-in segments. |
| `backgroundEffectRenderers` | Map `backgroundEffect.type` strings to React components. |
| `transformTimeline` | Receives the full timeline array and returns a modified timeline. Applied in registration order. |

- `registerPlugin(plugin)` — Register a plugin globally.
- `unregisterPlugin(name)` — Remove a plugin by name.
- `clearPlugins()` — Remove all plugins.
- `getSegmentRenderer(type)` — Look up a custom segment renderer.
- `getBackgroundEffectRenderer(type)` — Look up a custom background effect.
- `applyTimelineTransforms(timeline)` — Apply all registered timeline transforms.
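The registration-order semantics for timeline transforms can be captured in a few lines. This is a toy registry sketch, not the real `src/engine/plugin.ts`:

```typescript
// Toy plugin registry illustrating registration-order transform application;
// not the real src/engine/plugin.ts.
type Timeline = { id: string }[];

interface Plugin {
  name: string;
  transformTimeline?: (timeline: Timeline) => Timeline;
}

const plugins: Plugin[] = [];

function registerPlugin(plugin: Plugin): void {
  plugins.push(plugin);
}

function unregisterPlugin(name: string): void {
  const i = plugins.findIndex((p) => p.name === name);
  if (i !== -1) plugins.splice(i, 1);
}

// Each plugin's transform receives the output of the previous one,
// in the order the plugins were registered.
function applyTimelineTransforms(timeline: Timeline): Timeline {
  return plugins.reduce(
    (t, p) => (p.transformTimeline ? p.transformTimeline(t) : t),
    timeline
  );
}
```

Because transforms chain in registration order, a plugin that appends an outro segment should be registered after one that reorders segments, not before.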
## API Documentation

Browse the live API documentation or generate locally:

```sh
# Generate docs (outputs to docs/api/)
npm run docs:generate

# Serve docs locally on http://localhost:3000
npm run docs:serve
```

The generated documentation covers all engine components, TypeScript types, interfaces, and utilities exported from `src/engine/index.ts`.
## Project Structure

```
src/
  engine/                    # Reusable video components and utilities
    types.ts                 # All TypeScript interfaces
    Composition.tsx          # Sequences timeline segments + audio
    Segment.tsx              # Renders one segment (background + overlays)
    FaceBubble.tsx           # Circular face-cam PiP
    SubtitleOverlay.tsx      # Word-level captions with active word highlight
    KeywordOverlay.tsx       # Large keyword text overlays
    TitleCard.tsx            # Full-screen title card
    EndCard.tsx              # Full-screen end card with CTA
    NotificationBanner.tsx   # Slide-in notification popups
    BackgroundEffects.tsx    # Generative background effects
    CrossfadeScene.tsx       # Reusable fade-in/fade-out scene wrapper
    TypingText.tsx           # Typewriter text animation with cursor
    animations.ts            # Pure animation utilities
    MusicSync.ts             # Beat-to-frame math and synced volume curves
    plugin.ts                # Plugin registry for custom renderers
    index.ts                 # Public API exports
  cli/
    init.ts                  # Scaffold a new video project
    transcribe.ts            # Whisper wrapper / subtitle template generator
    validate.ts              # Timeline and asset validator
    render.ts                # Render helper with --watch, --preview, --frames
  workflow/
    loader.ts                # JSON/YAML project config loader
    validator.ts             # Project config validation
    generator.ts             # Remotion file generator from project config
    types.ts                 # Workflow types
  examples/
    quickstart/              # Minimal 2-segment example
    openslides/              # Product demo video
    format-demo/             # Background effects and multi-format demo
    ai-engineer-basics/      # Long-form educational video
    floom-launch/            # SaaS product launch with beat-synced crossfades
    hyperniche-launch/       # Competitive comparison + animated diagram
    opendraft-research/      # Educational research with kinetic typography
```
## Creating a New Video

Create a folder under `src/examples/your-project/` with three files:

`config.ts`:

```ts
import type { VideoConfig } from "../../engine";

export const MY_CONFIG: VideoConfig = {
  playbackRate: 1.2,
  fps: 30,
  width: 1920,
  height: 1080,
  crossfadeFrames: 10,
  facecamAsset: "my-facecam.mov",
  bgMusicAsset: "music.mp3",
  bgMusicVolume: 0.06,
};
```

`timeline.ts`:

```ts
import type { TimelineSegment } from "../../engine";

export const TIMELINE: TimelineSegment[] = [
  {
    id: "intro",
    type: "facecam-full",
    facecamStartSec: 0,
    durationSec: 10,
    faceBubble: "hidden",
    showSubtitles: true,
    showTitleCard: true,
    backgroundEffect: {
      type: "orbs",
      accentColor: "#3b82f6",
      intensity: 0.5,
      orbCount: 2,
    },
  },
  {
    id: "demo",
    type: "screen-static",
    facecamStartSec: 10,
    durationSec: 20,
    faceBubble: "bottom-left",
    showSubtitles: true,
    screenImage: "my-screenshot.png",
    keywords: [
      { text: "Key Feature", startSec: 12, endSec: 16 },
    ],
    backgroundEffect: {
      type: "grid",
      accentColor: "#a78bfa",
      intensity: 0.4,
    },
  },
];
```

`index.ts`:

```tsx
import React from "react";
import { Composition } from "remotion";
import { VideoComposition, computeTotalFrames } from "../../engine";
import { MY_CONFIG } from "./config";
import { TIMELINE } from "./timeline";
import { SUBTITLE_SEGMENTS } from "./subtitles";

const totalFrames = computeTotalFrames(TIMELINE, MY_CONFIG);

const MyVideo: React.FC = () => (
  <VideoComposition
    timeline={TIMELINE}
    videoConfig={MY_CONFIG}
    subtitleSegments={SUBTITLE_SEGMENTS}
    titleCardTitle="My Product"
    titleCardSubtitle="The tagline"
    endCardTitle="My Product"
    endCardCtaText="Try it now"
    endCardUrl="myproduct.com"
  />
);

export const RemotionRoot: React.FC = () => (
  <Composition
    id="MyVideo"
    component={MyVideo}
    durationInFrames={totalFrames}
    fps={MY_CONFIG.fps}
    width={MY_CONFIG.width}
    height={MY_CONFIG.height}
  />
);
```

Place assets in public/ (create the directory if it doesn't exist — it is gitignored), then render:
```sh
npx remotion render src/examples/your-project/index.ts MyVideo out/my-video.mp4
```

## Server Rendering

```sh
timeout 10m npx remotion render src/examples/openslides/index.ts OpenSlidesDemo out/video.mp4
```

Use `timeout` to prevent runaway renders on CI or server environments.
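`computeTotalFrames`, used in `index.ts` when creating a new video, has to combine per-segment durations with crossfade overlap. Below is a plausible sketch of that math (the engine's real implementation may differ), assuming each of the n-1 crossfades shortens the total by `crossfadeFrames`:

```typescript
// Plausible sketch of total-frame math with crossfade overlap;
// the engine's real computeTotalFrames may differ.
interface SegmentLike {
  durationSec: number;
}

interface ConfigLike {
  fps: number;
  crossfadeFrames: number;
}

function computeTotalFramesSketch(
  timeline: SegmentLike[],
  config: ConfigLike
): number {
  // Sum each segment's duration converted to whole frames.
  const raw = timeline.reduce(
    (sum, seg) => sum + Math.round(seg.durationSec * config.fps),
    0
  );
  // Each of the (n - 1) crossfades overlaps two segments,
  // shortening the total by crossfadeFrames.
  const overlaps = Math.max(0, timeline.length - 1) * config.crossfadeFrames;
  return raw - overlaps;
}
```

For the two-segment example above (10 s + 20 s at 30 fps with `crossfadeFrames: 10`), this yields 300 + 600 − 10 = 890 frames, which is why `durationInFrames` must come from the helper rather than a naive duration sum.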
## Examples

| Example | Description | Render Command |
|---|---|---|
| quickstart | Minimal title + end card | `npm run test-render` |
| openslides | Product demo with facecam + slides | `npm run render` |
| format-demo | Background effects + typing text + multi-format | `npx remotion render src/examples/format-demo/index.ts FormatDemo out/format-demo.mp4` |
| ai-engineer-basics | Long-form educational video with inline panels | `npm run ai-basics:render` |
| floom-launch | SaaS launch with beat-synced crossfades | `npm run floom:render` |
| hyperniche-launch | Competitive comparison + animated diagram | `npm run hyperniche:render` |
| opendraft-research | Research video with kinetic typography | `npm run opendraft:render` |
## Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines on reporting issues, submitting pull requests, and coding standards.

## Changelog

See CHANGELOG.md for a detailed history of releases and changes.

## License

OpenCut is released under the MIT License.
Built with ❤️ by Federico De Ponte