
Shot Supervisor

A professional-grade, mobile-first AI production companion for planning, generating, editing, and exporting shots with deterministic, studio-level control.

Shot Supervisor bridges the gap between creative intent and technical execution by combining AI planning, JSON-native image generation, and professional post-production tools—directly on a phone.


Inspiration

Shot Supervisor was inspired by a real gap in professional visual production workflows: the disconnect between creative intent and technical execution.

Creators today jump between prompts, tools, exports, and manual tweaks just to keep shots consistent across a project. Prompt-based generation is powerful, but unpredictable—and it quickly breaks down when precision, repeatability, or professional standards like HDR, bit depth, and color space matter.

Bria FIBO’s JSON-native, deterministic control felt like the missing piece. Instead of fighting prompts, Shot Supervisor treats AI like a camera department—planned, controlled, repeatable.

The goal: a director-style mobile tool where shots are planned, generated, edited, and exported with production-grade fidelity—anywhere.


What It Does

Shot Supervisor is a mobile production workflow app designed for professional creative use.

The app is organized into three core tabs:

Projects

  • Create projects with a name and description
  • Optionally auto-generate up to 5 AI shots using an LLM-driven breakdown
  • Keep stylistic and technical choices consistent across a project

Shots

  • Each project contains multiple shots
  • Create shots with title, description, and generation prompt
  • Generate images using Bria FIBO
  • Attach reference images from the camera roll or capture photos in-app
  • Regenerate shots while preserving intent and structure

Editor

  • Professional-grade editing tools:
    • Background removal
    • Background replacement via prompt
    • Background blur
    • Quality enhancement
  • Advanced controls:
    • HDR: exposure, contrast, highlights, shadows (custom endpoint)
    • LUTs: None, Cinema (Filmic), Teal & Orange, Vibrant
    • Color Spaces: sRGB, AdobeRGB, DCI-P3, Rec2020
    • Bit Depth: 8, 10, 12, 16-bit
    • Product editing via Bria Product API
  • Export options:
    • Save to device or email delivery
    • Formats: JPEG, PNG, TIFF, EXR

This turns a phone into a portable shot planning and post-production workstation, not just an image generator.
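The HDR controls above (exposure, contrast, highlights, shadows) map naturally onto a per-pixel tone curve. The sketch below is illustrative only, assuming normalized [0, 1] pixel values and parameter names that mirror the UI sliders; it is not the app's actual `hdr-tone-map` implementation.

```typescript
// Illustrative per-pixel tone adjustment for the HDR controls.
// Assumption: pixel values normalized to [0, 1]; parameter names
// mirror the editor sliders, not the real endpoint's schema.
interface ToneParams {
  exposure: number;   // in stops: +1 doubles brightness
  contrast: number;   // 1 = neutral, >1 increases contrast
  highlights: number; // negative pulls bright regions down
  shadows: number;    // positive lifts dark regions
}

function applyTone(v: number, p: ToneParams): number {
  // Exposure: linear scale by 2^stops.
  let out = v * Math.pow(2, p.exposure);
  // Contrast: pivot around middle gray (0.5).
  out = (out - 0.5) * p.contrast + 0.5;
  // Highlights: quadratic weight, so bright pixels move most.
  out += p.highlights * out * out;
  // Shadows: mirrored quadratic weight, so dark pixels move most.
  out += p.shadows * (1 - out) * (1 - out);
  // Clamp back into the displayable range.
  return Math.min(1, Math.max(0, out));
}
```

With neutral parameters (exposure 0, contrast 1, highlights 0, shadows 0) the curve is the identity, which is the usual sanity check for tone operators.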


Bria FIBO Integration (Judges)

All calls to Bria APIs are routed through custom server endpoints, which post-process the responses into the format the app expects.

Example: LLM-generated shot parameters for Bria

{
  "prompt": "Studio-lit product shot of a glass bottle",
  "camera": {
    "focal_length": 85,
    "aperture": 2.8
  },
  "lighting": {
    "type": "softbox",
    "angle": "45deg"
  },
  "color": {
    "space": "DCI-P3",
    "bit_depth": 16,
    "hdr": true
  }
}

Example: Background removal API response

{
  "result": {
    "image_url": "https://d1ei2xrl63k822.cloudfront.net/api/res/88c64c12dfdb4fe9ac0e35e3eea29ed2.png?Expires=1766444665&Signature=..."
  },
  "request_id": "88c64c12dfdb4fe9ac0e35e3eea29ed2"
}

Example: Bria Product API response

{
  "result_url": "https://d1ei2xrl63k822.cloudfront.net/api/res/8a3710a5-b5a5-4467-8ac2-46158a53b171.png?Expires=1766444886&Signature=..."
}

These examples demonstrate deterministic JSON-native control, custom server-side processing, and real outputs from Bria APIs.
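Note that the two response examples above use different shapes: the background-removal API nests the URL under `result.image_url`, while the Product API returns a flat `result_url`. A thin server layer can collapse both into one shape for the app. This is a sketch of that normalization, not the repo's actual code; any field names beyond the two shown above are assumptions.

```typescript
// Sketch: normalize the two Bria response shapes shown above into one
// structure for the client. Field names beyond image_url / result_url /
// request_id are assumptions for illustration.
interface NormalizedResult {
  url: string;
  requestId?: string;
}

function normalizeBriaResponse(body: any): NormalizedResult {
  // Editing APIs (e.g. background removal): { result: { image_url }, request_id }
  if (body?.result?.image_url) {
    return { url: body.result.image_url, requestId: body.request_id };
  }
  // Product API: { result_url }
  if (body?.result_url) {
    return { url: body.result_url };
  }
  throw new Error("Unrecognized Bria response shape");
}
```

The client then only ever deals with `NormalizedResult`, regardless of which Bria API served the request.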

API integration code is located in:

  • /shot-supervisor-endpoints/bria-generate.ts
  • /shot-supervisor-endpoints/bria-edit.ts
  • /shot-supervisor-endpoints/bria-product.ts

How It’s Built

Core Technologies

  • Expo / React Native (mobile-first UX)
  • Bria FIBO APIs
    • Image generation
    • Editing APIs
    • Product APIs
  • Supabase
    • Authentication
    • Projects & shots database

Backend Architecture

Custom Vercel-hosted server endpoints (for security, flexibility, and extensibility):

  • bria-generate
  • bria-edit
  • bria-product
  • export-hdr
  • hdr-tone-map
  • email
  • llm-breakdown (uses DeepSeek-V3 to convert project descriptions into structured shot plans for Bria)

This modular approach keeps generation, editing, HDR processing, and orchestration scalable and production-ready.
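An endpoint like llm-breakdown has to turn free-form model output into structured shot plans the app can trust. The sketch below shows the kind of validation and capping such an endpoint might apply; the `ShotPlan` shape is an assumption based on the shot fields described above (title, description, prompt), and the cap of 5 matches the project auto-generation limit.

```typescript
// Sketch: post-process raw LLM JSON into validated shot plans.
// Assumptions: ShotPlan fields mirror the app's shot fields
// (title, description, prompt); the 5-shot cap matches the
// auto-generation limit described in the Projects section.
interface ShotPlan {
  title: string;
  description: string;
  prompt: string;
}

function parseShotPlans(raw: string): ShotPlan[] {
  const parsed = JSON.parse(raw);
  // Accept either a bare array or a { shots: [...] } wrapper.
  const shots = Array.isArray(parsed) ? parsed : parsed.shots ?? [];
  return shots
    // Drop entries missing the required string fields.
    .filter((s: any) => s && typeof s.title === "string" && typeof s.prompt === "string")
    // Enforce the 5-shot auto-generation cap.
    .slice(0, 5)
    .map((s: any) => ({
      title: s.title,
      description: typeof s.description === "string" ? s.description : "",
      prompt: s.prompt,
    }));
}
```

Validating on the server keeps malformed or over-long model output from ever reaching the mobile client.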


Challenges

  • Designing a professional editing workflow on mobile without overwhelming users
  • Balancing creative freedom with structured, deterministic control
  • Integrating HDR and high bit-depth workflows in an intuitive way
  • Building custom HDR endpoints beyond default API constraints
  • Hackathon time limits—some components are present but not fully polished

Accomplishments

  • Built a true professional-grade mobile tool, not a demo
  • Showcased Bria FIBO’s strengths:
    • JSON-native control
    • Deterministic generation
    • High bit depth & HDR workflows
  • Combined LLM-based planning with structured image generation
  • Created a scalable, secure backend architecture
  • Delivered a full pipeline: concept → generation → edit → export

What I Learned

  • Professionals don’t want better prompts—they want better controls
  • Structured generation unlocks workflows prompt-only models can’t support
  • HDR, color space, and bit depth are critical in real production
  • Mobile can be a serious creative platform
  • JSON-native generation pairs extremely well with agentic pipelines

What’s Next

Planned improvements:

  • Real-time HDR preview
  • Shot consistency auditing
  • Histogram-based editing
  • MCP-powered Bria conversational edits
  • Video support
  • Refactor env handling (move all secrets server-side)
  • Particle-based splash screen polish

Some features are visible in the codebase but unfinished due to hackathon constraints.


Running the Project Locally (Judges)

Prerequisites

  • Node.js (18+ recommended)
  • Expo CLI
  • Android Studio (for emulator) or a physical Android device

1. Install Dependencies

npm install

2. Environment Variables (Client)

Create a .env file at the root of the project:

EXPO_PUBLIC_SUPABASE_URL=your_supabase_url
EXPO_PUBLIC_SUPABASE_ANON_KEY=your_supabase_anon_key
EXPO_PUBLIC_VERCEL_BASE_URL=https://shot-supervisor.vercel.app

EXPO_PUBLIC_VERCEL_BASE_URL points to the hosted server endpoints used by the app.
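For context on how the base URL is consumed: Expo inlines `EXPO_PUBLIC_*` variables at build time, and the client can derive endpoint URLs from them. The helper below is a sketch, not the repo's code; in particular the `/api/` path prefix is an assumption based on the usual Vercel serverless-function layout.

```typescript
// Sketch: build an endpoint URL from EXPO_PUBLIC_VERCEL_BASE_URL.
// Assumption: endpoints are exposed under /api/, the conventional
// Vercel serverless-function route; the real routing may differ.
function endpointUrl(base: string, name: string): string {
  // Strip any trailing slashes so we never emit "//api/...".
  return `${base.replace(/\/+$/, "")}/api/${name}`;
}
```

In the app this would be called with endpoint names from the backend list, e.g. `endpointUrl(process.env.EXPO_PUBLIC_VERCEL_BASE_URL!, "bria-generate")`.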

3. Run a Development Build (Recommended)

eas build --profile development --platform android

Once installed on an emulator or device:

npx expo start --dev-client -c

4. Server Endpoints (Optional – Self Hosting)

If you want to run the backend locally, navigate to:

/shot-supervisor-endpoints

Create a .env file with:

SUPABASE_URL=your_supabase_url
SUPABASE_ANON_KEY=your_supabase_anon_key
BRIA_API_TOKEN=your_bria_api_token
BRIA_API_BASE=your_bria_api_base
GITHUB_TOKEN=optional_for_repo_features
SUPABASE_SMTP_PASS=optional_for_email

Deploy these endpoints to Vercel or run locally depending on your setup.

Notes for Judges

  • Some API keys (notably Bria) may expire due to free trial limits
  • If you encounter any setup or runtime issues, please contact me—I’m happy to provide:
    • Temporary API keys
    • A live demo walkthrough
    • Clarification on any part of the architecture

This project is best evaluated as a production workflow system, not just an AI demo.

License

MIT
