HaishinKit for iOS, macOS, tvOS, and Android.
- Camera and microphone streaming library via RTMP and HLS for iOS, macOS, and tvOS.
- API Documentation
Sponsored with 💖 by
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬
- If you need help making live streaming requests with HaishinKit, use GitHub Discussions with the Q&A category.
- If you'd like to discuss a feature request, use GitHub Discussions with the Ideas category.
- If you've found a bug 🐛 in HaishinKit, open a GitHub Issue with the Bug report template.
- The trace-level log is very useful. Please set LBLogger.with(HaishinKitIdentifier).level = .trace (see the sketch after this list).
- If you don't use an issue template, your issue will be closed immediately without comment.
- If you want to contribute, submit a pull request!
- If you want email-based communication without GitHub, consulting is available for a fee of $50 per incident. Responses may take a few days.
- Discord chatroom.
- If you understand Japanese, please communicate in Japanese!
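To enable the trace-level log mentioned above, a minimal sketch (assuming the Logboard module that HaishinKit depends on is importable in your app):
import HaishinKit
import Logboard

// Raise the HaishinKit logger to trace level before reproducing the problem,
// then attach the output to your bug report.
LBLogger.with(HaishinKitIdentifier).level = .trace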
- Authentication
- Publish and Recording (H264/AAC)
- Playback (Beta)
- Adaptive bitrate streaming
  - Handling (see also #126)
  - Automatic frame dropping
- Action Message Format
  - AMF0
  - AMF3
- SharedObject
- RTMPS
  - Native (RTMP over SSL/TLS)
  - Tunneled (RTMPT over SSL/TLS) (Technical Preview)
- RTMPT (Technical Preview)
- ReplayKit Live as a Broadcast Upload Extension
- HTTPService
- HLS Publish
| | HKView | PiPHKView | MTHKView |
|---|---|---|---|
| Engine | AVCaptureVideoPreviewLayer | AVSampleBufferDisplayLayer | Metal |
| Publish | ○ | ○ | ○ |
| Playback | × | ○ | ○ |
| VisualEffect | × | ○ | ○ |
- Support tvOS 11.0+ (Technical Preview)
  - tvOS can't publish the camera and microphone; only playback is available.
- Hardware acceleration for H264 video encoding and AAC audio encoding
- Support "Allow app extension API only" option
- Support GPUImage framework (~> 0.5.12)
- Objective-C Bridging
| Version | iOS | OSX | tvOS | Xcode | Swift |
|---|---|---|---|---|---|
| 1.3.0+ | 11.0+ | 10.13+ | 10.2+ | 14.0+ | 5.7+ |
| 1.2.0+ | 9.0+ | 10.11+ | 10.2+ | 13.0+ | 5.5+ |
Example projects are available for iOS with UIKit, iOS with SwiftUI, macOS, and tvOS.
- Camera and microphone publish.
- RTMP Playback
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --use-xcframeworks
open HaishinKit.xcodeproj
Please add the following keys to your Info.plist (a runtime permission-check sketch follows the key lists).
iOS 10.0+
- NSMicrophoneUsageDescription
- NSCameraUsageDescription
macOS 10.14+
- NSMicrophoneUsageDescription
- NSCameraUsageDescription
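The keys above only supply the prompt text; here is a minimal sketch of the runtime permission request that still has to happen (plain AVFoundation API, not HaishinKit-specific):
import AVFoundation

// Request camera and microphone access before attaching devices to a stream.
AVCaptureDevice.requestAccess(for: .video) { granted in
    print("Camera access granted:", granted)
}
AVCaptureDevice.requestAccess(for: .audio) { granted in
    print("Microphone access granted:", granted)
}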
source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!
def import_pods
pod 'HaishinKit', '~> 1.3.0'
end
target 'Your Target' do
platform :ios, '11.0'
import_pods
end
github "shogo4405/HaishinKit.swift" ~> 1.3.0
https://github.com/shogo4405/HaishinKit.swift
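A minimal Package.swift sketch for the Swift Package Manager dependency (the MyLiveApp package and target names are placeholders; version 1.3.0 assumed):
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MyLiveApp",
    platforms: [.iOS(.v11), .macOS(.v10_13), .tvOS(.v11)],
    dependencies: [
        // Pull HaishinKit from the repository above.
        .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.3.0")
    ],
    targets: [
        .target(
            name: "MyLiveApp",
            // The package identifier may need to match the dependency identity (HaishinKit.swift).
            dependencies: [.product(name: "HaishinKit", package: "HaishinKit.swift")]
        )
    ]
)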
- GitHub Sponsors
- Paypal
Make sure you set up and activate your AVAudioSession.
import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
try session.setActive(true)
} catch {
print(error)
}
Real Time Messaging Protocol (RTMP).
let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
// print(error)
}
rtmpStream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
// print(error)
}
let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)
// add ViewController#view
view.addSubview(hkView)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
- rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
- Brackets [ ] denote optional parts.
rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
- rtmp://localhost/live/streamName
rtmpConnection.connect("rtmp://localhost/live")
rtmpStream.publish("streamName")
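Playback (beta) mirrors publishing; a minimal sketch using MTHKView (the URL and stream name are placeholders):
let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)

let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.play("streamName")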
var rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.captureSettings = [
.fps: 30, // FPS
.sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
// .isVideoMirrored: false,
// .continuousAutofocus: false, // use camera autofocus mode
// .continuousExposure: false, // use camera exposure mode
// .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
.muted: false, // mute audio
.bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
.width: 640, // video output width
.height: 360, // video output height
.bitrate: 160 * 1000, // video output bitrate
.profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 profile level; requires import VideoToolbox
.maxKeyFrameIntervalDuration: 2, // keyframe interval in seconds
]
// "0" means the same as the input
rtmpStream.recorderSettings = [
AVMediaType.audio: [
AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
AVSampleRateKey: 0,
AVNumberOfChannelsKey: 0,
// AVEncoderBitRateKey: 128000,
],
AVMediaType.video: [
AVVideoCodecKey: AVVideoCodecH264,
AVVideoHeightKey: 0,
AVVideoWidthKey: 0,
/*
AVVideoCompressionPropertiesKey: [
AVVideoMaxKeyFrameIntervalDurationKey: 2,
AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
AVVideoAverageBitRateKey: 512000
]
*/
],
]
// set the 2nd argument to false
rtmpStream.attachAudio(AVCaptureDevice.default(for: .audio), automaticallyConfiguresApplicationAudioSession: false)
var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")
// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))
HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet; you can watch the stream at http://ip.address:8080/hello/playlist.m3u8
var httpStream = HTTPStream()
httpStream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: .audio))
httpStream.publish("hello")
var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)
var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)
// add ViewController#view
view.addSubview(hkView)
- Adobe’s Real Time Messaging Protocol
- Action Message Format -- AMF 0
- Action Message Format -- AMF 3
- Video File Format Specification Version 10
- Adobe Flash Video File Format Specification Version 10.1
BSD-3-Clause