- Overview
- Quick Start
- Website
- Architecture
- End-to-End Type Safety
- Development Workflow
- Database & Migrations
- Supabase Development Rules
- Environment Configuration
- Project Structure
- Development Commands
- Contributing
## Overview

Echo helps users automatically generate metadata for YouTube videos (titles, subtitles, chapters, descriptions) using Google Gemini AI.
- 🎥 Video Upload & Processing - Upload videos and get AI-generated metadata
- 🤖 AI-Powered Analysis - Uses Google Gemini for transcription and content analysis
- 🔐 Secure Authentication - Supabase auth with user data isolation
- ☁️ Cloud Storage - Google Cloud Storage for video files
- 📱 Modern UI - React frontend built with Next.js
- 🔒 End-to-End Type Safety - Database-driven type generation across the entire stack
## Quick Start

### Prerequisites

- Bun 1.0+
- Supabase CLI
- Docker (for local Supabase)

### Setup

- Clone the repository and set up the environment:

  ```bash
  git clone <repository-url>
  cd echo
  cp .env.example .env
  # Edit .env with your configuration
  ```

- Install dependencies:

  ```bash
  bun install
  ```

- Configure Google OAuth (required for authentication):
For local development, you need to configure Google OAuth with specific redirect URIs:
Google Cloud Console setup:

- Go to Google Cloud Console → APIs & Services → Credentials
- Edit your OAuth 2.0 Client ID
- Add these Authorized redirect URIs:

  ```
  http://127.0.0.1:54321/auth/v1/callback
  http://localhost:3000/auth/callback
  ```

- Save changes and wait 2-3 minutes for propagation

Why both URIs are needed:

- `http://127.0.0.1:54321/auth/v1/callback` - lets Supabase handle the OAuth PKCE flow
- `http://localhost:3000/auth/callback` - handles your app's final redirect
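To make the split concrete, here is a small TypeScript sketch of the app-side half of the flow. The helper name and validation logic are hypothetical (the real handler lives in the frontend's auth callback route), but the URLs are the ones configured above:

```typescript
// Sketch: the app's /auth/callback receives a session code from Supabase
// after Supabase has completed the PKCE exchange at its own /auth/v1/callback.
// Hypothetical helper -- the real route handler lives in the website app.
function extractSessionCode(callbackUrl: string): string | null {
  const url = new URL(callbackUrl);
  if (url.origin !== "http://localhost:3000" || url.pathname !== "/auth/callback") {
    return null; // not the app callback we configured
  }
  return url.searchParams.get("code");
}
```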
- Start the development environment:

  ```bash
  bun dev
  ```

That's it! This single command starts:

- Local Supabase database
- TypeScript Hono/tRPC backend (port 8000)
- TypeScript frontend (port 3000)

Visit http://localhost:3000 to access the application.
### Troubleshooting

If you encounter OAuth authentication issues:

"App doesn't comply with Google's OAuth 2.0 policy" error:

- Ensure you've added `http://127.0.0.1:54321/auth/v1/callback` to Google Cloud Console
- Use `127.0.0.1`, not `localhost`, in the redirect URI
- Wait 2-3 minutes after saving changes in Google Cloud Console

"Invalid flow state, no valid flow state found" error:

- This indicates the redirect URI configuration is incorrect
- Verify the Supabase config uses `http://127.0.0.1:54321/auth/v1/callback`
- Restart Supabase after config changes: `bun db:stop && bun db:start`

OAuth works but the session does not persist:

- Check that both redirect URIs are configured in Google Cloud Console
- Verify your environment variables are correctly set
## Website

The Echo web application is built with Next.js and React, providing a fast and intuitive user experience for AI-powered video metadata generation.
- Framework: Next.js 14 with App Router for server-side rendering
- Styling: Tailwind CSS with shadcn/ui components
- State Management: tRPC for server state, React hooks for local state
- Authentication: Supabase Auth with protected routes and user session management
- API Integration: Type-safe tRPC client for backend communication
- Error Handling: Comprehensive error boundaries and user-friendly error messages
- 🔐 Authentication Flow: Seamless sign-in/sign-up with email and password
- 📁 File Upload: Drag-and-drop video upload with progress tracking
- 🤖 AI Processing: Real-time status updates for video analysis jobs
- 📝 Metadata Editing: Interactive forms for editing AI-generated content
- 📱 Responsive Design: Mobile-first design that works on all devices
- ⚡ Performance: Optimized with code splitting and lazy loading
- Landing Page - Clear value proposition and call-to-action
- Authentication - Simple email/password auth with error handling
- Dashboard - Overview of uploaded videos and processing status
- Upload Flow - Intuitive video upload with progress feedback
- Results View - Clean interface for viewing and editing AI-generated metadata
- Profile Management - User settings and account management
- Type Safety: End-to-end type safety from database to UI components
- Error Boundaries: Graceful error handling with fallback components
- Route Protection: Automatic redirects for unauthenticated users
- Form Validation: Client-side validation with server-side verification
- Real-time Updates: Live status updates for long-running AI processing jobs
The website follows modern Next.js and tRPC patterns:
- Server Components: Leverage Next.js server components for optimal performance
- tRPC Integration: Type-safe API calls with automatic type inference
- Component Architecture: Reusable UI components from shared packages
- Authentication Flow: Seamless integration with Supabase Auth
- Error Handling: Comprehensive error boundaries with user-friendly messages
## Architecture

```mermaid
graph TD
    A[Next.js Frontend] -->|tRPC calls| B[Hono + tRPC Backend]
    A -->|auth| C(Supabase Auth)
    B -->|queries| D[Supabase DB]
    B -->|generates| E[Google Cloud Storage Signed URLs]
    B -->|processes with| F[Gemini AI]
    B -->|writes metadata to| D
    A -->|fetches metadata from| B
    G[Drizzle Schema] -->|generates| H[TypeScript Types]
    H -->|used by| A
    H -->|used by| B
    D -->|migrations| G
```
- Backend: Hono + tRPC with TypeScript and Bun
- Frontend: Next.js with React and Tailwind CSS
- Database: PostgreSQL via Supabase with Drizzle ORM
- Storage: Google Cloud Storage
- AI: Google Gemini
- Auth: Supabase Auth
- Type Safety: End-to-end type safety with tRPC and Drizzle
- Build System: Turbo monorepo with Bun
- Upload Video - User uploads video file through the web interface
- Processing - Hono backend extracts audio and sends to Gemini for analysis
- AI Generation - Gemini generates title, description, transcript, and chapters
- Results - User views and can edit the generated metadata
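The AI generation step produces structured metadata. As an illustration, here is a TypeScript sketch of turning generated chapters into the timestamp format YouTube descriptions expect; the `Chapter` shape is hypothetical, not the repo's actual schema (the real types come from the generated database types):

```typescript
// Hypothetical shape of a generated chapter; illustrative only.
interface Chapter {
  startSeconds: number;
  label: string;
}

// Format chapters the way YouTube descriptions expect ("MM:SS Label").
function formatChapters(chapters: Chapter[]): string {
  return chapters
    .map(({ startSeconds, label }) => {
      const m = Math.floor(startSeconds / 60);
      const s = startSeconds % 60;
      return `${String(m).padStart(2, "0")}:${String(s).padStart(2, "0")} ${label}`;
    })
    .join("\n");
}
```

For example, a chapter starting at 95 seconds labeled "Setup" formats as `01:35 Setup`.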
## End-to-End Type Safety

Echo implements a database-first type system where the PostgreSQL schema is the single source of truth for all types across the entire stack.
```mermaid
graph LR
    A[Database Schema] -->|supabase gen types| B[TypeScript Types]
    B -->|imported by| C[Frontend Code]
    B -->|imported by| D[Backend Code]
    C -->|tRPC calls| D
    D -->|Drizzle ORM| A
```
When you want to add new functionality:
- Think Database First - Design the tables and columns you need
- Write Migration - Create a Supabase migration file
- Apply Migration - Run the migration to update your database schema
- Generate Types - Run type generation to update TypeScript types
- Use Types - Import and use the generated types in your code
Let's say you want to add video categories:
```bash
# 1. Create migration
cd packages/supabase
supabase migration new add_video_categories
```

```sql
-- 2. Write SQL in the generated file:
-- packages/supabase/migrations/[timestamp]_add_video_categories.sql
BEGIN;

CREATE TABLE public.video_categories (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,
    description TEXT,
    created_at TIMESTAMPTZ DEFAULT timezone('utc'::text, now()) NOT NULL
);

ALTER TABLE public.videos
    ADD COLUMN category_id INTEGER REFERENCES public.video_categories(id);

COMMIT;
```

```bash
# 3. Apply migration
bun db:push

# 4. Generate types
bun gen:types:db

# 5. Use in your code - types are automatically available!
```

Frontend (TypeScript):

```typescript
import { VideoCategory } from "@echo/types";

// Types are automatically generated and available
const categories: VideoCategory[] = await supabase
  .from("video_categories")
  .select("*");
```

Backend (TypeScript):

```typescript
import { VideoCategory, videoCategories } from "./db/schema";
import { db } from "./db/client";

// Types are automatically generated and available
export const getCategories = async (): Promise<VideoCategory[]> => {
  return await db.select().from(videoCategories);
};
```

| Command | Purpose |
|---|---|
| `bun gen:types:db` | Generate TypeScript types from database schema |
| `bun typecheck` | Type check all TypeScript code |
- ✅ Single Source of Truth - Database schema drives all types
- ✅ Automatic Consistency - No type mismatches between frontend/backend
- ✅ Catch Errors Early - Type errors caught at compile time
- ✅ Better DX - IntelliSense and autocomplete everywhere
- ✅ Refactoring Safety - Schema changes propagate automatically
- ✅ Documentation - Types serve as living documentation
## Development Workflow

The recommended workflow for adding new functionality:
- Design Database Schema - Think about what tables/columns you need
- Create Migration - Write SQL migration file
- Apply Migration - Update local database
- Generate Types - Update TypeScript types
- Implement Backend - Add API endpoints using generated types
- Implement Frontend - Add UI using generated types
- Test - Verify everything works end-to-end
```bash
# Start development environment
bun dev

# After making database changes
bun db:push && bun gen:types:db

# Type check everything
bun typecheck

# Run tests
bun test
```

- Type Safety: All code must pass TypeScript type checking
- Linting: Use the provided ESLint and Ruff configurations
- Testing: Write tests for new functionality
- Documentation: Update docs when adding new features
- Supabase Rules: Follow the guidelines in `.cursor/rules/sb-*.mdc` for database development
## Database & Migrations

Echo uses PostgreSQL via Supabase with the following core tables:
| Table | Purpose |
|---|---|
| `videos` | Stores uploaded video file information |
| `video_jobs` | Tracks video processing job status |
| `video_metadata` | Stores AI-generated metadata (titles, descriptions, etc.) |
- Create a migration file:

  ```bash
  cd packages/supabase
  supabase migration new descriptive_name
  ```

- Write SQL in the generated file:

  ```sql
  -- Migration: Add new feature
  -- Description: What this migration does
  BEGIN;

  -- Your SQL changes here
  CREATE TABLE public.new_table (
      id SERIAL PRIMARY KEY,
      name TEXT NOT NULL,
      created_at TIMESTAMPTZ DEFAULT timezone('utc'::text, now()) NOT NULL
  );

  COMMIT;
  ```

- Apply the migration:

  ```bash
  bun db:push
  ```

- Generate types:

  ```bash
  bun gen:types:db
  ```

Migration best practices:

- Always use transactions - wrap changes in `BEGIN;` and `COMMIT;`
- Include rollback instructions - comment how to undo changes
- Test locally first - apply to the local database before production
- Use descriptive names - make the migration's purpose clear
- Handle existing data - consider data migration for schema changes
Echo follows strict Supabase development guidelines documented in `.cursor/rules/sb-*.mdc`. See the Supabase Development Rules section for details.
All tables use RLS to ensure users can only access their own data:

```sql
-- Example: Users can only see their own videos
CREATE POLICY "Users can select their own videos" ON public.videos
    FOR SELECT TO authenticated
    USING ((SELECT auth.uid()) = uploader_user_id);
```

| Command | Purpose |
|---|---|
| `bun db:start` | Start local Supabase |
| `bun db:stop` | Stop local Supabase |
| `bun db:push` | Apply migrations to database |
| `bun db:reset` | Reset database to clean state |
| `bun gen:types:db` | Generate types from schema |
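For intuition, the SELECT policy shown earlier in this section can be mirrored as a plain predicate. This TypeScript sketch is purely illustrative, with an assumed row shape -- Postgres enforces the real policy server-side:

```typescript
// Toy mirror of: USING ((SELECT auth.uid()) = uploader_user_id)
// Illustrative only; RLS is enforced by Postgres, not by client code.
interface VideoRow {
  id: number;
  uploader_user_id: string;
}

function policyAllowsSelect(row: VideoRow, authUid: string | null): boolean {
  // Unauthenticated requests have no uid and therefore match nothing
  return authUid !== null && row.uploader_user_id === authUid;
}
```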
## Supabase Development Rules

Echo follows strict guidelines for Supabase development to ensure security, performance, and consistency. These rules are enforced through our development tooling in `.cursor/rules/`.

Migration file naming:

- Use the format `YYYYMMDDHHmmss_short_description.sql`
- Example: `20240906123045_create_profiles.sql`
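`supabase migration new` generates this filename for you. As a sketch of the convention, here is a hypothetical TypeScript helper that builds one from a UTC timestamp and a description:

```typescript
// Sketch: build a migration filename in the required
// YYYYMMDDHHmmss_short_description.sql format. Hypothetical helper --
// in practice `supabase migration new` does this for you.
function migrationFileName(date: Date, description: string): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const stamp =
    `${date.getUTCFullYear()}${pad(date.getUTCMonth() + 1)}${pad(date.getUTCDate())}` +
    `${pad(date.getUTCHours())}${pad(date.getUTCMinutes())}${pad(date.getUTCSeconds())}`;
  // lowercase, words separated by underscores
  const slug = description.toLowerCase().replace(/[^a-z0-9]+/g, "_");
  return `${stamp}_${slug}.sql`;
}

console.log(migrationFileName(new Date(Date.UTC(2024, 8, 6, 12, 30, 45)), "create profiles"));
// prints: 20240906123045_create_profiles.sql
```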
- Write all SQL in lowercase
- Include thorough comments explaining purpose and behavior
- Add copious comments for destructive operations
- Include header comment with metadata about the migration
- Always enable RLS on new tables (even for public access)
- Create granular policies (separate for SELECT, INSERT, UPDATE, DELETE)
- Specify roles explicitly (`authenticated`, `anon`) for each policy
- Include comments explaining the rationale for each security policy
- Separate policies for each operation (SELECT, INSERT, UPDATE, DELETE)
- Never use `FOR ALL` - create individual policies instead
- Role specification - always use `TO authenticated` or `TO anon`
- Descriptive names - use clear, detailed policy names in double quotes
- Use the `(select auth.uid())` pattern instead of `auth.uid()` directly
- Add indexes on columns used in policy conditions
- Minimize joins in policy expressions
- Prefer `PERMISSIVE` over `RESTRICTIVE` policies
- SELECT policies: use the `USING` clause only
- INSERT policies: use the `WITH CHECK` clause only
- UPDATE policies: use both `USING` and `WITH CHECK`
- DELETE policies: use the `USING` clause only
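As a mental model, the clause-to-operation mapping above can be sketched in TypeScript. This is a toy illustration with assumed row and policy types -- Postgres evaluates the real clauses server-side:

```typescript
// Toy model: USING filters rows an operation may touch; WITH CHECK validates
// rows an operation would write. UPDATE consults both. Illustrative only.
type Row = Record<string, unknown>;
type Clause = (row: Row, uid: string) => boolean;

interface Policy {
  using?: Clause;     // SELECT, DELETE, and the old row of an UPDATE
  withCheck?: Clause; // INSERT, and the new row of an UPDATE
}

const ownsRow: Clause = (row, uid) => row.uploader_user_id === uid;
const updatePolicy: Policy = { using: ownsRow, withCheck: ownsRow };

function canUpdate(p: Policy, oldRow: Row, newRow: Row, uid: string): boolean {
  // UPDATE must pass USING on the old row and WITH CHECK on the new row
  return (p.using?.(oldRow, uid) ?? true) && (p.withCheck?.(newRow, uid) ?? true);
}
```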
- Default to `SECURITY INVOKER` for safer access control
- Use `SECURITY DEFINER` only when explicitly required
- Always set `search_path = ''` to avoid security risks
- Use fully qualified names (e.g., `public.table_name`)
- Use explicit input/output types
- Prefer `IMMUTABLE` or `STABLE` over `VOLATILE` when possible
- Include proper error handling with meaningful exceptions
- Minimize side effects - prefer functions that return results
```sql
-- Migration: Add video categories
-- Description: Creates categories table with proper RLS policies and indexes
-- Affected: videos table (adds category_id foreign key)
-- Rollback: DROP TABLE public.video_categories; ALTER TABLE public.videos DROP COLUMN category_id;
BEGIN;

-- create categories table
CREATE TABLE public.video_categories (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE,
    description TEXT,
    created_at TIMESTAMPTZ DEFAULT timezone('utc'::text, now()) NOT NULL
);

-- enable rls (required for all tables)
ALTER TABLE public.video_categories ENABLE ROW LEVEL SECURITY;

-- separate policies for each operation and role
CREATE POLICY "Categories are viewable by everyone" ON public.video_categories
    FOR SELECT TO authenticated, anon
    USING (true);

CREATE POLICY "Categories can be created by authenticated users" ON public.video_categories
    FOR INSERT TO authenticated
    WITH CHECK (true);

CREATE POLICY "Categories can be updated by authenticated users" ON public.video_categories
    FOR UPDATE TO authenticated
    USING (true)
    WITH CHECK (true);

CREATE POLICY "Categories can be deleted by authenticated users" ON public.video_categories
    FOR DELETE TO authenticated
    USING (true);

-- add foreign key to videos
ALTER TABLE public.videos
    ADD COLUMN category_id INTEGER REFERENCES public.video_categories(id);

-- add index for performance (required for foreign keys used in policies)
CREATE INDEX idx_videos_category_id ON public.videos(category_id);

-- insert default categories
INSERT INTO public.video_categories (name, description) VALUES
    ('educational', 'educational content'),
    ('entertainment', 'entertainment content'),
    ('tutorial', 'how-to and tutorial content');

COMMIT;
```

These rules are enforced through:
- Cursor IDE rules - Provide guidance during development
- Code review - Manual verification of adherence to guidelines
- Migration validation - Check structure and security before deployment
## Environment Configuration

| Variable | Description | Default |
|---|---|---|
| `SUPABASE_URL` | Supabase project URL | |
| `SUPABASE_ANON_KEY` | Supabase anonymous key | |
| `SUPABASE_SERVICE_ROLE_KEY` | Supabase service role key | |
| `GEMINI_API_KEY` | Google Gemini AI API key | |
| Variable | Description | Default |
|---|---|---|
| `ENVIRONMENT` | Development/production environment | `development` |
| `STORAGE_BACKEND` | Storage backend (`local` or `gcs`) | `local` |
| `LOCAL_STORAGE_PATH` | Path for local file storage | `./output_files` |
| `GCS_BUCKET_NAME` | Google Cloud Storage bucket name | |
| `GOOGLE_APPLICATION_CREDENTIALS` | Path to GCP credentials JSON | |
| `REDIS_URL` | Redis connection URL for caching | |
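At startup it can help to fail fast when a required variable is missing. Here is a minimal sketch with no external libraries -- the variable names come from the tables above, but this helper is hypothetical, not the repo's actual validation approach:

```typescript
// Check the required variables from the table above before the app starts.
// Hypothetical helper; the repo may validate configuration differently.
const REQUIRED_VARS = [
  "SUPABASE_URL",
  "SUPABASE_ANON_KEY",
  "SUPABASE_SERVICE_ROLE_KEY",
  "GEMINI_API_KEY",
] as const;

function missingEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Usage idea: call missingEnv(process.env) at startup and throw a clear
// error listing the missing names instead of failing later at runtime.
```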
- Copy the example environment file:

  ```bash
  cp .env.example .env
  ```

- Fill in the required variables:

  ```bash
  # Supabase (required)
  SUPABASE_URL=your_supabase_url
  SUPABASE_ANON_KEY=your_anon_key
  SUPABASE_SERVICE_ROLE_KEY=your_service_role_key

  # AI (required)
  GEMINI_API_KEY=your_gemini_api_key

  # Storage (optional - defaults to local)
  STORAGE_BACKEND=local
  LOCAL_STORAGE_PATH=./output_files
  ```

## Project Structure

```
echo/
├── apps/
│   ├── api/                  # Hono/tRPC Backend (TypeScript)
│   │   ├── src/
│   │   │   ├── routers/      # tRPC routers
│   │   │   ├── services/     # Business logic
│   │   │   ├── lib/          # Utilities and adapters
│   │   │   ├── db/           # Database schema and client
│   │   │   └── middleware/   # Hono middleware
│   │   └── tests/            # API tests
│   └── website/              # Next.js Frontend
│       ├── src/
│       │   ├── app/          # Next.js app router
│       │   ├── components/   # React components
│       │   └── lib/          # Frontend utilities
│       └── public/           # Static assets
├── packages/
│   ├── supabase/             # Database configuration
│   │   ├── migrations/       # SQL migration files
│   │   └── types/            # Generated database types
│   ├── ui/                   # Shared UI components
│   ├── utils/                # Shared utilities
│   └── tsconfig/             # Shared TypeScript configs
├── scripts/                  # Build and utility scripts
└── README.md                 # Main documentation (this file)
```
## Development Commands

Core commands:

| Command | Purpose |
|---|---|
| `bun dev` | Start entire development environment |
| `bun build` | Build all applications |
| `bun test` | Run all tests and quality checks |

Database commands:

| Command | Purpose |
|---|---|
| `bun db:start` | Start local Supabase |
| `bun db:stop` | Stop local Supabase |
| `bun db:push` | Apply migrations to database |
| `bun db:reset` | Reset database to clean state |

Type generation:

| Command | Purpose |
|---|---|
| `bun gen:types:db` | Generate types from database schema |
| `bun typecheck` | Type check all TypeScript code |

Code quality:

| Command | Purpose |
|---|---|
| `bun lint` | Lint all applications |
| `bun format` | Format all applications |
| `bun check` | Run all quality checks |

Individual services:

| Command | Purpose |
|---|---|
| `bun dev:web` | Frontend only |
| `bun dev:core` | Backend only |
## Contributing

Development principles:

- Database First - Always start with database schema design
- Type Safety - All code must pass type checking
- Supabase Rules - Follow the guidelines in `.cursor/rules/sb-*.mdc` for all database work
- Testing - Write tests for new functionality
- Documentation - Update docs when adding features

Pull request process:

- Create a feature branch from `main`
- Make your changes following the database-first workflow
- Run the quality checks: `bun typecheck && bun lint && bun test`
- Update documentation if needed
- Submit a pull request with a clear description

Quality gates:

- TypeScript: Must pass `bun typecheck`
- Linting: Must pass `bun lint`
- Testing: New features must include tests

## License

[Add your license information here]