MarbleKit (iOS/iPadOS & macOS)

Process & manipulate audio/video in realtime using Swift/SwiftUI.

Preview

The preview and this README use my open-source application Bitsy for demonstration.

Table of Contents

Requirements

  • iOS 14+ Build passing 🟢
  • macOS 12.4+ Build passing 🟢

Installation

Build locally using Xcode 14.2 or download the latest notarized build here.

Swift Packages

Guide Engine

The MarblePlayer, specifically MarbleRemote, is itself a good guide to adding an instance of MarbleEngine for direct effect application.

public var metalContext: MetalContext = .init()
private let marble: MarbleEngine = .init()

The engine needs MarbleKit's MetalContext when compiling effects onto an input texture.

Basic Usage

public var fx: [MarbleEffect] = [.godRay, .ink]

let context = self.metalContext
    
//Create layers
let layers: [MarbleLayer] = fx.map {
    .init($0.getLayer(audioSample.amplitude, threshold: 1.0))
}
    
//Create the composite from an inputTexture (MTLTexture, bgra8Unorm)
//size is assumed to describe the input texture's dimensions
let composite: MarbleComposite = .init(
    resource: .init(
        textures: .init(main: inputTexture),
        size: .init(width: inputTexture.width, height: inputTexture.height)),
    layers: layers)

//Compile
let compiled = marble.compile(
    fromContext: context,
    forComposite: composite)

//Return for display
if let filteredTexture = compiled.resource?.textures?.main {
    return filteredTexture
} else {
    return texture
}

Basic effect compiling flow

Tips

  • All input textures should use the .bgra8Unorm pixel format prior to compilation (see the sketch below).
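Textures created elsewhere (for example via Core Video) may already be in this format. If you are allocating your own render targets, a minimal sketch using standard Metal APIs follows; in practice you would reuse the MTLDevice held by your MetalContext rather than creating a new one, and the 1920x1080 size is just a placeholder.

import Metal

//In practice, reuse the MTLDevice held by your MetalContext.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

//Describe a bgra8Unorm texture sized to your source frame (placeholder size).
let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .bgra8Unorm,
    width: 1920,
    height: 1080,
    mipmapped: false)
descriptor.usage = [.shaderRead, .shaderWrite]

let inputTexture = device.makeTexture(descriptor: descriptor)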

MTLDrawable

If you have your own MTKView for rendering Metal textures, an easy way to render filtered textures onto your drawable, so the front-end updates accordingly, is to use the downsample kernel that is part of MetalContext.

metalContext
    .kernels
    .downsample
    .encode(
        commandBuffer: commandBuffer,
        inputTexture: filteredTexture,
        outputTexture: drawable.texture)

MTKViewDelegate draw callback usage with a filtered texture.
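A hedged sketch of how that encode call might sit inside an MTKViewDelegate; the Renderer class, its command queue, and latestFilteredTexture are assumptions about your own rendering setup, not part of MarbleKit.

import MetalKit

final class Renderer: NSObject, MTKViewDelegate {
    //Assumed setup: MarbleKit's MetalContext, a command queue made from its
    //device, and the most recent texture returned by the Basic Usage flow.
    let metalContext: MetalContext = .init()
    lazy var commandQueue = metalContext.device.makeCommandQueue()!
    var latestFilteredTexture: MTLTexture?

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) { }

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let filteredTexture = latestFilteredTexture else { return }

        //Downsample the filtered texture onto the drawable's texture.
        metalContext
            .kernels
            .downsample
            .encode(
                commandBuffer: commandBuffer,
                inputTexture: filteredTexture,
                outputTexture: drawable.texture)

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}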

Extensions

MTLTexture -> CVPixelBuffer

texture.pixelBuffer //returns an optional CVPixelBuffer

CVPixelBuffer -> MTLTexture

//context is a reference to a local MetalContext, if one is available
//context.device refers to its MTLDevice
pixelBuffer.texture(context.device, pixelFormat: .bgra8Unorm) //returns an optional MTLTexture

MetalContext, as seen in Basic Usage, has an MTLDevice that can be reused for the above logic.
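Putting the two extensions together, a hedged sketch of a round trip; the function and parameter names are illustrative, and the call shapes mirror the fragments above.

//Hedged sketch: filteredTexture is an MTLTexture from the Basic Usage flow,
//context is a MetalContext instance.
func roundTrip(_ filteredTexture: MTLTexture, context: MetalContext) -> MTLTexture? {
    //MTLTexture -> CVPixelBuffer (hand this to AVFoundation, Vision, etc.)
    guard let pixelBuffer = filteredTexture.pixelBuffer else { return nil }

    //CVPixelBuffer -> MTLTexture, reusing the context's device
    return pixelBuffer.texture(context.device, pixelFormat: .bgra8Unorm)
}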

Guide Player

MarblePlayer was initially inspired heavily by an amazing open-source player, KSPlayer by @kingslay, which helped provide a path when dealing with HLS livestreams.

The MarblePlayerView requires a MarbleRemoteConfig to initialize playback settings. It uses a MetalView as backing to render the video and audio output data automatically. Playback controls are exposed as static functions and/or variables that can be triggered from your front-end.

Player Initialization

For now, only HLS streams with .m3u playlist file links have been tested. Local video data support will come in a future update.

  1. Create a MarbleRemoteConfig
public struct MarbleRemoteConfig: Equatable, Codable, Identifiable, Hashable {
    public var id: String {
        "\(date.timeIntervalSince1970)"
    }
    
    public var date: Date = .init()
    public var name: String
    public var kind: MarbleRemoteConfig.StreamConfig.Kind
    public var streams: [StreamConfig]
    
    public init(name: String,
                kind: MarbleRemoteConfig.StreamConfig.Kind,
                streams: [StreamConfig]) {
        self.name = name
        self.kind = kind
        self.streams = streams
    }
    
    public var description: String {
        name + "'s Stream on " + kind.rawValue.capitalized
    }
    
    public func hash(into hasher: inout Hasher) {
        hasher.combine(id)
    }
    
    ...
}
  2. Set the config within a MarblePlayerView.
extension Canvas {
    public var view: some View {
        ZStack {
            MarblePlayerView(config)
        }
    }
}

Example options that can be adjusted to modify the HLS stream properties prior to initializing the MarblePlayerView:

MarbleRemote.enableFX = true     
MarblePlayerOptions.isAutoPlay = false
MarblePlayerOptions.isSeekedAutoPlay = false
MarblePlayerOptions.preferredForwardBufferDuration = 4
MarblePlayerOptions.maxBufferDuration = 48
MarblePlayerOptions.dropVideoFrame = false
MarblePlayerOptions.forcePreferredFPS = false
MarblePlayerOptions.preferredFramesPerSecond = 60
MarblePlayerOptions.isVideoClippingEnabled = false

Changing Marble Effects

Here is a list of the currently supported FX. Simply adjust this static variable anywhere to change the effects programmatically.

MarbleRemote.fx = [.ink]
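For example, a hedged sketch of toggling an effect from a SwiftUI control; this assumes MarbleEffect cases can be compared for equality, which is not stated above.

//Flip the ink effect on and off from the front-end.
//Assumes MarbleEffect is Equatable so contains(_:) works.
Button("Toggle Ink") {
    if MarbleRemote.fx.contains(.ink) {
        MarbleRemote.fx.removeAll { $0 == .ink }
    } else {
        MarbleRemote.fx.append(.ink)
    }
}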

Guide Effects

This will be updated heavily to adhere to proper protocol inheritance and re-usability.

Adding a New Marble Effect

  1. All effects start as an enum case of EffectType. Each case can carry two associated values: the first is the loudness (passed in as the sound sample's decibel value), and the second is the threshold, which lets a user increase or decrease the intensity applied to the effect on top of the passed-in loudness. The threshold value should be a Float in the [0-1] range (see the sketch after this list).

  2. The Effects directory gives an inside look at how each effect is structured and prepared prior to pipeline creation.

  3. Marble's MetalContext is the core behind filter access and initialization. After adding the kernel to the Effects directory, a reference should be made in similar fashion here.

  4. The MarbleEngine will then apply the filter accordingly here, which is another location you will have to modify for your new filter.
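A hedged sketch of the shape described in step 1; this is illustrative only, since the real EffectType lives in the Effects directory and the case names here are examples.

//Illustrative only; not MarbleKit's actual declaration.
public enum EffectType {
    //Each case carries (loudness, threshold): loudness is the audio sample's
    //decibel value, threshold is an intensity in the 0...1 range.
    case godRay(Float, Float)
    case ink(Float, Float)

    //A new effect would add a case here, plus a kernel in the Effects
    //directory, a reference in MetalContext, and a branch in MarbleEngine.
    case ripple(Float, Float)
}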

Guide CoreML

A basic speech-to-text CoreML model will be added for real-time closed captioning. Proper pipelines will be put into place to allow for custom model insertion and output retrieval, agnostic to a specific set of input and output types. Since MarbleKit is a renderer in itself, the input type will be a controlled variable, while the output will be customizable (Image to Text, Image to Image, etc.).

WIP

TODO

  • Vague audio channel layouts not being compatible with MarblePlayer
  • Memory Leaks (Packet/Frame decoding primarily)
