HaishinKit for iOS, macOS, tvOS, and Android.


  • Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
  • API Documentation

Sponsored with 💖 by
Stream Chat
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬

💬 Communication

  • If you need help making live streaming requests with HaishinKit, use GitHub Discussions with the Q&A category.
  • If you'd like to discuss a feature request, use GitHub Discussions with the Idea category.
  • If you've found a bug 🐛 in HaishinKit, open a GitHub Issue with the Bug report template.
    • A trace-level log is very useful. Please set LBLogger.with(HaishinKitIdentifier).level = .trace.
    • Issues that don't use a template will be closed immediately without comment.
  • If you want to contribute, submit a pull request!
  • If you want e-mail based support outside GitHub:
    • Consulting is available at $50 per incident; expect a response within a few days.
  • Discord chatroom.
  • If you understand Japanese, please communicate in Japanese!

💖 Sponsors

Streamlabs

🎨 Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Handling (see also #126)
    • Automatic frame dropping
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS); see the connection sketch after this list
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension
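Native RTMPS does not require any extra setup beyond connecting with an rtmps:// URL. A minimal sketch, with a hypothetical endpoint:

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
// The TLS endpoint below is a placeholder; use your ingest server's rtmps URL.
rtmpConnection.connect("rtmps://example.com/appName/instanceName")
rtmpStream.publish("streamName")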

HLS

  • HTTPService
  • HLS Publish

Rendering

|              | HKView                     | PiPHKView                  | MTHKView |
| ------------ | -------------------------- | -------------------------- | -------- |
| Engine       | AVCaptureVideoPreviewLayer | AVSampleBufferDisplayLayer | Metal    |
| Publish      | ○                          | ○                          | ○        |
| Playback     | ×                          | ○                          | ○        |
| VisualEffect | ×                          | ○                          | ○        |

Others

🌏 Requirements

| HaishinKit | iOS   | OSX    | tvOS  | Xcode | Swift |
| ---------- | ----- | ------ | ----- | ----- | ----- |
| 1.3.0+     | 11.0+ | 10.13+ | 10.2+ | 14.0+ | 5.7+  |
| 1.2.0+     | 9.0+  | 10.11+ | 10.2+ | 13.0+ | 5.5+  |

🐾 Examples

Example projects are available for iOS with UIKit, iOS with SwiftUI, macOS, and tvOS.

  • Camera and microphone publish.
  • RTMP Playback (see the playback sketch after the build steps below)
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --use-xcframeworks
open HaishinKit.xcodeproj
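RTMP playback (still beta) uses the same connection and stream objects, with play in place of publish. A minimal sketch, assuming MTHKView for rendering and a hypothetical server URL:

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)

// HKView cannot render playback; use MTHKView (or PiPHKView) instead.
let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.play("streamName")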

☕ Cocoa Keys

Please add the following keys to your Info.plist.

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
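For reference, a minimal Info.plist sketch; the description strings are placeholders and should explain your app's actual usage:

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to publish live video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to publish live audio.</string>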

🔧 Installation

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.3.0'
end

target 'Your Target' do
    platform :ios, '11.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 1.3.0

Swift Package Manager

https://github.com/shogo4405/HaishinKit.swift
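Add the URL above through Xcode's package dialog, or declare the dependency in Package.swift. A minimal sketch (the version requirement and target name are assumptions; adjust them for your project):

// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "YourApp",
    dependencies: [
        // Pin to the release line you need.
        .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.3.0")
    ],
    targets: [
        .target(name: "YourApp", dependencies: [
            .product(name: "HaishinKit", package: "HaishinKit.swift")
        ])
    ]
)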

💠 Donation

🔧 Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
} catch {
    print(error)
}

📓 RTMP Usage

Real Time Messaging Protocol (RTMP).

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add to ViewController#view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
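In practice you usually wait for the NetConnection.Connect.Success status before calling publish. A sketch using the event-listener API (names as of the 1.x series):

// Register for RTMP status events before connecting (e.g. in your ViewController).
rtmpConnection.addEventListener(.rtmpStatus, selector: #selector(rtmpStatusHandler), observer: self)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")

// Elsewhere in the same class:
@objc
private func rtmpStatusHandler(_ notification: Notification) {
    let e = Event.from(notification)
    guard let data = e.data as? ASObject, let code = data["code"] as? String else {
        return
    }
    switch code {
    case RTMPConnection.Code.connectSuccess.rawValue:
        rtmpStream.publish("streamName")
    default:
        break
    }
}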

RTMP URL Format

  • rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
    • Items in [] are optional.
    rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
    rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")

  • rtmp://localhost/live/streamName
    rtmpConnection.connect("rtmp://localhost/live")
    rtmpStream.publish("streamName")
    

Settings

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    .fps: 30, // FPS
    .sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
    // .isVideoMirrored: false,
    // .continuousAutofocus: false, // use camera autofocus mode
    // .continuousExposure: false, //  use camera exposure mode
    // .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
    .muted: false, // mute audio
    .bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
    .width: 640, // video output width
    .height: 360, // video output height
    .bitrate: 160 * 1000, // video output bitrate
    .profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 profile; requires "import VideoToolbox"
    .maxKeyFrameIntervalDuration: 2, // keyframe interval, in seconds
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// Set the 2nd argument to false to disable automatic audio session configuration.
rtmpStream.attachAudio(AVCaptureDevice.default(for: .audio), automaticallyConfiguresApplicationAudioSession: false)

Authentication

var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

📓 HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet follows; you can view the stream at http://ip.address:8080/hello/playlist.m3u8

var httpStream = HTTPStream()
httpStream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: .audio))
httpStream.publish("hello")

var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add to ViewController#view
view.addSubview(hkView)

📖 Reference

📜 License

BSD-3-Clause
