SimulatorCamera

Plug a real camera, a video file, or your screen into the iOS Simulator. Finally.

Swift 5.9+ · SwiftPM · MIT

The iOS Simulator has never supported a real camera. AVCaptureDevice discovery comes back empty. Every app that touches the camera (QR scanners, barcode readers, document capture, ML pipelines, AR prototypes) either stubs out the camera path, runs only on device, or ships a brittle "use a photo instead" fallback.

SimulatorCamera is a tiny two-piece developer tool that fixes it:

  • a macOS companion app that streams video frames over localhost:9876 using a compact binary protocol (SCMF, the Simulator Camera Message Format), and
  • an iOS Swift Package with an AVCaptureSession-shaped API. On device it compiles to a no-op.

Frames show up in your app. Vision, VisionKit, Core ML, barcode detection, custom pipelines: the SDK is designed to drive them in the Simulator at 25–30 FPS over localhost, with no device, no cables, and no private APIs.

Status: v0.2.0 is a preview cut. A recorded demo and independent benchmarks will land with the first tagged release; for now, the protocol and shim are best-effort and we're actively looking for early testers.


Why

Every camera-using app today has one of these:

#if targetEnvironment(simulator)
// TODO: fake it somehow
#else
let session = AVCaptureSession()
// ...real code
#endif

This project deletes that TODO. Same API shape in the Simulator and on device.

Features

  • 🎥 Live video into the Simulator at 25–30 FPS via localhost TCP
  • 🧩 Drop-in SDK: FrameSource mirrors AVCaptureSession semantics (start(), stop(), delegate, CVPixelBuffer callbacks)
  • 🔌 Sources on the Mac: test pattern (built-in), webcam, video file, screen region (roadmap)
  • 📦 One-line install via Swift Package Manager
  • 🛑 No private APIs: Network.framework + CoreVideo + ImageIO
  • 📵 Zero overhead on device: #if targetEnvironment(simulator)-guarded
  • 🔒 Localhost-only by default
  • 🧪 Vision / Core ML ready: frames land as CVPixelBuffer

Installation

iOS SDK β€” Swift Package Manager

dependencies: [
    .package(url: "https://github.com/dautovri/SimulatorCamera.git", from: "0.2.0"),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(name: "SimulatorCameraClient", package: "SimulatorCamera"),
        ]
    ),
]

Or in Xcode: File → Add Package Dependencies… → paste the repo URL.

macOS companion app

Homebrew (recommended):

brew install --cask dautovri/tap/simulatorcamera
open -a SimulatorCameraServer

Or grab the signed & notarized .dmg from Releases. Or build from source:

git clone https://github.com/dautovri/SimulatorCamera.git
cd SimulatorCamera/apps/MacServer
open SimulatorCameraServer.xcodeproj

Usage

  1. Launch SimulatorCameraServer.app on your Mac. Pick a source and click Start.
  2. In your iOS code:
import SimulatorCameraClient

final class CameraController: NSObject, FrameSourceDelegate {
    private let source: FrameSource

    override init() {
        #if targetEnvironment(simulator)
        source = SimulatorCameraSession(host: "127.0.0.1", port: 9876)
        #else
        source = AVCaptureFrameSource() // your existing AVCapture wrapper
        #endif
        super.init()
        source.delegate = self
        source.start()
    }

    func frameSource(_ source: FrameSource, didOutput pixelBuffer: CVPixelBuffer, at time: CMTime) {
        // Feed to Vision, Core ML, preview layer, whatever.
    }
}
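
Because frames arrive as CVPixelBuffer, feeding them to Vision needs no conversion. A minimal sketch of filling in the delegate callback above with barcode detection (the Vision calls are standard; error handling is elided):

```swift
import Vision
import CoreVideo
import CoreMedia

// Sketch: run Vision barcode detection on every simulator frame.
// Drop this body into the frameSource(_:didOutput:at:) callback above.
func frameSource(_ source: FrameSource, didOutput pixelBuffer: CVPixelBuffer, at time: CMTime) {
    let request = VNDetectBarcodesRequest { request, error in
        guard error == nil else { return }
        for barcode in request.results as? [VNBarcodeObservation] ?? [] {
            print("Barcode:", barcode.payloadStringValue ?? "<binary payload>")
        }
    }
    // VNImageRequestHandler accepts the CVPixelBuffer directly.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

The same shape works for any VNRequest (text recognition, Core ML classification via VNCoreMLRequest, and so on).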

Full AVCaptureSession drop-in

The shim now mirrors the whole AVCaptureSession → addInput → addOutput → startRunning dance. Your existing camera-setup code ports over by swapping each type's AV prefix for Simulator:

import SimulatorCameraClient

SimulatorCamera.configure(host: "127.0.0.1", port: 9876)

let session = SimulatorCaptureSession()
session.sessionPreset = .hd1280x720

guard let device = SimulatorCaptureDevice.default(for: .video) else { return }
let input = try SimulatorCaptureDeviceInput(device: device)
session.addInput(input)

let output = SimulatorCameraOutput()          // AVCaptureVideoDataOutput-shaped
output.setSampleBufferDelegate(self, queue: frameQueue)
session.addOutput(output)

session.startRunning()                         // kicks off the network session

Your existing captureOutput(_:didOutput:from:) delegate fires with a valid CMSampleBuffer wrapping a CVPixelBuffer, on the same code path as the real device.
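
For reference, the delegate side is exactly what device code already looks like. A sketch (the class name ScannerDelegate is illustrative, not part of the SDK):

```swift
import AVFoundation
import CoreVideo

// Illustrative delegate: identical whether the buffer came from the
// device camera or from the SimulatorCamera network stream.
final class ScannerDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        print("frame: \(width)x\(height)")
    }
}
```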

Zero-change AVFoundation path (recommended)

If you already have an AVCaptureVideoDataOutputSampleBufferDelegate, swap the output for SimulatorCameraOutput inside a simulator guard and keep your delegate code unchanged. The standard captureOutput(_:didOutput:from:) method fires with a real CMSampleBuffer; because SimulatorCameraOutput is an AVCaptureVideoDataOutput subclass, the first argument is a genuine AV output, not a stand-in:

#if targetEnvironment(simulator)
let output = SimulatorCameraOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
SimulatorCamera.start()
#else
let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(self, queue: myQueue)
session.addOutput(output)
#endif

Or use the drop-in SwiftUI view:

import SwiftUI
import SimulatorCameraClient

struct ContentView: View {
    var body: some View {
        SimulatorCameraPreviewView()
    }
}

Protocol (SCMF)

+--------------+---------------+----------------+--------+--------+----------+
| magic        | payloadLength | timestamp      | width  | height | jpegData |
| 4 B ("SCMF") | 4 B uint32 LE | 8 B Float64 LE | 4 B LE | 4 B LE | N bytes  |
+--------------+---------------+----------------+--------+--------+----------+

Full spec: docs/PROTOCOL.md · architecture: docs/ARCHITECTURE.md · roadmap: docs/ROADMAP.md.

Repo layout

SimulatorCamera/
├── Package.swift                        # SwiftPM manifest (exposes SimulatorCameraClient)
├── Sources/SimulatorCameraClient/       # the iOS SDK
├── Tests/SimulatorCameraClientTests/    # unit tests for the SCMF codec
├── apps/
│   ├── MacServer/                       # SwiftUI macOS companion app
│   └── iOSDemo/                         # sample iOS app using the SDK
├── docs/
│   ├── PROTOCOL.md                      # wire format
│   ├── ARCHITECTURE.md                  # threading, transport, failure modes
│   └── ROADMAP.md
├── Casks/simulatorcamera.rb             # Homebrew cask formula
├── scripts/
│   ├── bootstrap.sh                     # swift build + test
│   └── build-release.sh                 # archive + codesign + notarize + .dmg/.zip
├── .github/
│   ├── FUNDING.yml                      # GitHub Sponsors / BMC
│   └── workflows/
│       ├── ci.yml                       # SwiftPM CI on macos-14
│       └── release.yml                  # tag-driven signed release
└── RELEASING.md                         # release runbook

Development

./scripts/bootstrap.sh   # swift build && swift test

Status

v0.2.0, "Use my real camera": a preview cut with a drop-in AVCaptureSession shim and a live Mac webcam source. See CHANGELOG.md and docs/RELEASE_NOTES_v0.2.0.md.

Contributing

See CONTRIBUTING.md. Good first issues are labelled on the tracker. For release mechanics, see RELEASING.md.

Sponsor

SimulatorCamera is fully MIT-licensed and maintained on donations. If it saves you a device-build loop, consider sponsoring or buying a coffee. No paid tier, no license keys, no telemetry: just a tip jar.

License

MIT. See LICENSE.
