HaishinKit (formerly lf)


  • A camera and microphone streaming library via RTMP and HLS, for iOS, macOS, and tvOS.
  • Please write Issues in English or Japanese!

Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Handling (see also #126)
    • Automatic drop frames
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension (Technical Preview)

HLS

  • HTTPService
  • HLS Publish

Rendering

| -            | HKView                     | GLHKView  | MTHKView |
|:-------------|:---------------------------|:----------|:---------|
| Engine       | AVCaptureVideoPreviewLayer | OpenGL ES | Metal    |
| Publish      | ○                          | ○         | ○        |
| Playback     | ×                          | ○         | ○        |
| VisualEffect | ×                          | ○         | ○        |
| Condition    | Stable                     | Stable    | Beta     |

Others

  • Support tvOS 10.2+ (Technical Preview)
    • tvOS can't publish the camera and microphone; only the playback feature is available.
  • Hardware acceleration for H264 video encoding, AAC audio encoding
  • Support "Allow app extension API only" option
  • Support GPUImage framework (~> 0.5.12)
  • Objective-C Bridging

Requirements

| -       | iOS  | OSX    | tvOS  | Xcode | Swift | CocoaPods | Carthage |
|:--------|:-----|:-------|:------|:------|:------|:----------|:---------|
| 0.11.0+ | 8.0+ | 10.11+ | 10.2+ | 10.0+ | 5.0   | 1.5.0+    | 0.29.0+  |
| 0.10.0+ | 8.0+ | 10.11+ | 10.2+ | 10.0+ | 4.2   | 1.5.0+    | 0.29.0+  |

Cocoa Keys

Please add the following keys to your Info.plist (see also the sketch after these lists).

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
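
These keys only supply the text shown in the system permission prompts; if they are missing, the app terminates when capture starts. As an illustrative sketch (standard AVFoundation, not part of HaishinKit; the helper name is hypothetical), you can also request camera and microphone access up front before attaching devices to a stream:

import AVFoundation

// Hypothetical helper: ask for camera and microphone access before calling
// attachCamera/attachAudio, so the first capture attempt is not denied silently.
func requestCaptureAuthorization(_ completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { videoGranted in
        AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
            completion(videoGranted && audioGranted)
        }
    }
}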

Installation

Please set up your project with Swift 5.0.

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.0.0'
end

target 'Your Target'  do
    platform :ios, '8.0'
    import_pods
end
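
After updating the Podfile, run pod install and open the generated .xcworkspace rather than the .xcodeproj.

pod install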

Carthage

github "shogo4405/HaishinKit.swift" ~> 1.0.0

License

BSD-3-Clause

Donation

PayPal

Bitcoin

1LP7Jo4VwAFdEisJSykBAtUyAusZjozSpw

Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [AVAudioSession.CategoryOptions.allowBluetooth])
    }
    try session.setMode(.default)
    try session.setActive(true)
} catch {
    print(error)
}

RTMP Usage

Real Time Messaging Protocol (RTMP).

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add ViewController#view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
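
Note that connect is asynchronous, so publishing immediately after it may happen before the connection is ready. Below is a minimal sketch of waiting for the connection status event before publishing, assuming this release's event API (Event.Name.rtmpStatus, Event.from(_:), RTMPConnection.Code.connectSuccess); see the bundled Examples for the exact pattern:

rtmpConnection.addEventListener(.rtmpStatus, selector: #selector(rtmpStatusHandler), observer: self)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")

@objc
private func rtmpStatusHandler(_ notification: Notification) {
    let e = Event.from(notification)
    guard let data = e.data as? ASObject, let code = data["code"] as? String else {
        return
    }
    switch code {
    case RTMPConnection.Code.connectSuccess.rawValue:
        // The connection is established; it is now safe to publish.
        rtmpStream.publish("streamName")
    default:
        break
    }
}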

Settings

let sampleRate: Double = 44_100

// see: #58
#if os(iOS)
let session = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [AVAudioSession.CategoryOptions.allowBluetooth])
    }
    try session.setActive(true)
} catch {
    print(error)
}
#endif

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    .fps: 30, // FPS
    .sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
    .continuousAutofocus: false, // enable/disable continuous autofocus
    .continuousExposure: false, // enable/disable continuous auto exposure
    // .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
    .muted: false, // mute audio
    .bitrate: 32 * 1000,
    .sampleRate: sampleRate, 
]
rtmpStream.videoSettings = [
    .width: 640, // video output width
    .height: 360, // video output height
    .bitrate: 160 * 1000, // video output bitrate
    .profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H.264 profile level (requires "import VideoToolbox")
    .maxKeyFrameIntervalDuration: 2, // keyframe interval in seconds
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// Setting the second argument to false stops HaishinKit from configuring the application's AVAudioSession automatically.
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)

Authentication

var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet follows; you can then watch the stream at http://ip.address:8080/hello/playlist.m3u8 (see also the playback sketch after the snippet).

var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add ViewController#view
view.addSubview(hkView)
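
To verify the stream from another device on the same network, any HLS-capable player works. An illustrative sketch using plain AVFoundation (not part of HaishinKit; replace the IP address with the publisher's address):

import AVFoundation

// Play back the published HLS stream; 192.168.1.10 is a placeholder address.
let url = URL(string: "http://192.168.1.10:8080/hello/playlist.m3u8")!
let player = AVPlayer(url: url)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = view.bounds
view.layer.addSublayer(playerLayer)
player.play()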

FAQ

How can I run the example project?

Please run the carthage update command. HaishinKit needs the Logboard module via Carthage.

carthage update

Do you provide support via email?

Yes. The consulting fee is $50 per incident. I don't recommend it; please consider using Issues instead.
