RADcam is an open-source Expo + React Native iOS project that demonstrates synchronized rear-camera capture with AVCaptureMultiCamSession, real-time Metal compute processing, and RadSplat-style radial preview rendering.
No binary artifacts are included. All camera, rendering, and bridge logic is in readable Swift, Metal, and TypeScript source.
⚠️ Expo Go is not enough for this app, because this project ships custom native Swift/Metal modules. Use a custom development build (local Xcode build or EAS cloud build).
- Rear camera capability detection (single / dual / multicam fallback)
- `AVCaptureMultiCamSession` capture graph for synchronized frame ingestion
- Zero-copy `CVPixelBuffer` ➜ Metal texture conversion using `CVMetalTextureCache`
- Metal compute shaders for radial splatting + compositing (`.metal` source)
- Live preview streamed into an Expo Module native view
- Recording pipeline via `AVAssetWriter` (HEVC/H.264)
- Expo-compatible React Native UI in TypeScript
```
.
├── App.tsx
├── app.json
├── modules/
│   └── radcam-multicam/
│       ├── expo-module.config.json
│       ├── ios/
│       │   ├── CameraCapabilities.swift
│       │   ├── MultiCamManager.swift
│       │   ├── RadcamMulticamModule.swift
│       │   ├── RadcamPreviewView.swift
│       │   ├── RadSplatRenderer.swift
│       │   ├── RadSplatShaders.metal
│       │   └── VideoEncoder.swift
│       ├── package.json
│       ├── radcam-multicam.podspec
│       └── src/
│           ├── RadcamMulticamModule.ts
│           ├── RadcamMulticam.types.ts
│           └── index.ts
├── scripts/
│   ├── bootstrap.sh
│   └── run-ios.sh
└── src/
    ├── components/CameraScreen.tsx
    ├── hooks/useRadcam.ts
    └── native/RadcamMulticamModule.ts
```
`MultiCamManager` builds a multi-input/multi-output session with one video output per rear lens (wide, ultra-wide, and telephoto where available). The manager automatically selects the best mode:

- `multi`: up to three rear streams when `AVCaptureMultiCamSession.isMultiCamSupported`
- `dual`: first two rear streams
- `single`: first rear stream
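The fallback logic can be sketched in TypeScript. This is a hypothetical mirror of the native selection in `MultiCamManager`, not its actual Swift code; the lens names and the capability flag come from the description above:

```typescript
// Hypothetical sketch of the rear-stream selection done natively by
// MultiCamManager. Lens names and the multicam flag mirror the README's
// description; the real Swift implementation may differ.
type RearLens = "wide" | "ultraWide" | "telephoto";
type CameraMode = "multi" | "dual" | "single";

interface Selection {
  mode: CameraMode;
  lenses: RearLens[];
}

function selectMode(available: RearLens[], isMultiCamSupported: boolean): Selection {
  if (available.length === 0) {
    throw new Error("No rear camera available");
  }
  // Without multicam support, only one stream can run at a time.
  if (!isMultiCamSupported) {
    return { mode: "single", lenses: [available[0]] };
  }
  if (available.length >= 3) {
    return { mode: "multi", lenses: available.slice(0, 3) };
  }
  if (available.length === 2) {
    return { mode: "dual", lenses: available.slice(0, 2) };
  }
  return { mode: "single", lenses: [available[0]] };
}
```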
For each incoming sample buffer:

- Extract the `CVPixelBuffer`
- Convert it to an `MTLTexture` through `CVMetalTextureCache`
- Send the textures to the compute kernels without CPU pixel copies
`RadSplatShaders.metal` applies a radial warp + glow accumulation pass (`radialSplatKernel`) and a post-composite pass (`compositeKernel`) to produce a single stylized preview texture.
`VideoEncoder` creates `.mov` output with a real-time `AVAssetWriterInputPixelBufferAdaptor` and supports:

- `hevc` (default)
- `h264`
`RadcamMulticamModule.swift` exposes async APIs to JS:

- `getCapabilitiesAsync`
- `startSessionAsync` / `stopSessionAsync`
- `startRecordingAsync` / `stopRecordingAsync`
`RadcamPreviewView` renders the Metal output inside a native view and emits frame metrics (`fps`, `activeCameras`) to React Native.
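On the JS side, the metrics event can be typed and validated before use. This shape is an assumption inferred only from the field names mentioned above (`fps`, `activeCameras`), not the module's actual type definitions:

```typescript
// Hypothetical payload emitted by RadcamPreviewView, inferred from the
// field names in the README; the real bindings may include more fields.
export interface FrameMetrics {
  fps: number;
  activeCameras: number;
}

// Runtime guard for events that cross the native bridge as untyped objects.
export function isFrameMetrics(value: unknown): value is FrameMetrics {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.fps === "number" && typeof v.activeCameras === "number";
}
```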
- macOS with Xcode 15+
- Node.js 18+
- CocoaPods (installed with Xcode toolchain)
- A physical iOS device that supports the requested camera mode (recommended)
```sh
npm install
npx expo prebuild --clean
npx expo run:ios --device
```

or run the helper scripts:

```sh
npm run bootstrap
npm run run:ios
```

You can ship and test this on an iPhone without owning a Mac by using EAS Build (cloud iOS builds):
- Install dependencies and sign in to Expo:

  ```sh
  npm install
  npx eas login
  ```

- Trigger a cloud iOS development build:

  ```sh
  npm run cloud:ios
  # or: npx eas build --platform ios --profile development
  ```

- Install the generated development build on your iPhone (via EAS link / QR / TestFlight, depending on your account setup).

- Start Metro locally and show the dev-client QR code:

  ```sh
  npx expo start --dev-client --tunnel
  ```

- Scan the QR code with your iPhone's Camera app and open the installed RADcam development build.
This gives you full native module support (AVFoundation + Metal) without local Xcode compilation.
Because RADcam includes native Swift/Metal code, it cannot run in Expo Go. Use an Expo development build and then launch through a QR code:
- Build/install the dev client once:

  ```sh
  npx expo prebuild --clean
  npx expo run:ios --device
  ```

- Start the Metro server and generate a QR code:

  ```sh
  npx expo start --dev-client --tunnel
  ```

- On your iPhone, scan the QR code with the Camera app and tap the banner to open RADcam in the installed dev client.
- Install dependencies.
- Prebuild the Expo iOS project (generates `/ios`).
- Build a development client and deploy it to a connected device.
- Grant camera permission on first launch.
- Press Start Recording to capture RadSplat-stylized session output.
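The runtime steps map to a simple async sequence on the JS side. A minimal sketch, assuming the module methods listed earlier resolve with plain values (the return shapes and the mock below are assumptions; only the method names come from the module):

```typescript
// Minimal interface over the async APIs the native module exposes.
// Return shapes are assumptions; only the method names come from the README.
interface RadcamModule {
  startSessionAsync(): Promise<void>;
  startRecordingAsync(): Promise<void>;
  stopRecordingAsync(): Promise<{ uri: string }>;
  stopSessionAsync(): Promise<void>;
}

// Start the session, record for durationMs, and always tear the session down.
async function captureClip(module: RadcamModule, durationMs: number): Promise<string> {
  await module.startSessionAsync();
  try {
    await module.startRecordingAsync();
    await new Promise((resolve) => setTimeout(resolve, durationMs));
    const { uri } = await module.stopRecordingAsync();
    return uri;
  } finally {
    await module.stopSessionAsync();
  }
}

// Stand-in mock so the flow can run off-device.
const mockModule: RadcamModule = {
  startSessionAsync: async () => {},
  startRecordingAsync: async () => {},
  stopRecordingAsync: async () => ({ uri: "file:///tmp/clip.mov" }),
  stopSessionAsync: async () => {},
};
```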
- Add new Metal kernels in `modules/radcam-multicam/ios/RadSplatShaders.metal`
- Register additional compute pipelines in `RadSplatRenderer.swift`
- Wire parameters/events through `RadcamMulticamModule.swift` and the TS bindings
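When wiring new kernel parameters through the TS bindings, it helps to clamp and normalize values before they cross the bridge. A hypothetical sketch only; the parameter names below are illustrative, not the module's actual uniforms:

```typescript
// Hypothetical splat parameters; the names are illustrative only and do not
// correspond to uniforms that currently exist in RadSplatShaders.metal.
export interface SplatParams {
  radius: number; // warp radius in normalized [0, 1] coordinates
  glow: number;   // glow accumulation strength in [0, 1]
}

const clamp01 = (x: number) => Math.min(1, Math.max(0, x));

// Sanitize values before sending them to the native side, assuming the
// compute kernels expect normalized floats.
export function sanitizeSplatParams(params: Partial<SplatParams>): SplatParams {
  return {
    radius: clamp01(params.radius ?? 0.5),
    glow: clamp01(params.glow ?? 0.25),
  };
}
```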
Potential additions:
- per-lens calibration and distortion model uniforms
- temporal denoising kernel
- depth-aware compositing (LiDAR-capable devices)
- RTMP/WebRTC live stream output path
- Ensure you started Metro in dev-client mode:

  ```sh
  npx expo start --dev-client --tunnel
  ```

- In the Expo terminal UI, confirm it is targeting a development build (not Expo Go).
- If your network is restricted, keep `--tunnel` enabled.
Common reset sequence:
```sh
rm -rf ios Pods Podfile.lock
npx expo prebuild --clean
cd ios && pod install && cd ..
```

Then rebuild the development client. If you are not on macOS, use cloud builds (`npm run cloud:ios`).
No. Expo Go cannot load arbitrary custom native Swift/Metal modules from this repository. Use a custom dev build (local or EAS cloud).
- iOS multi-camera performance depends on thermal and hardware limits.
- The sample currently writes one composited stream to disk for simplicity.
- If your device cannot run full multicam, the app automatically falls back to dual/single mode.
MIT