How to use ScreenCaptureSession to live broadcast screen capture? #28

Closed · pzs7602 opened this issue Apr 20, 2016 · 5 comments

@pzs7602 commented Apr 20, 2016

Please give me a code snippet, thanks!

@shogo4405 (Owner)

https://github.com/shogo4405/lf.swift/blob/master/Application/Application/LiveViewController.swift#L99

// 1st step: comment out the camera attachment
// rtmpStream.attachCamera(AVMixer.deviceWithPosition(.Back))
// 2nd step: uncomment the screen capture attachment
rtmpStream.attachScreen(ScreenCaptureSession())
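
For context, a fuller minimal sketch (assuming the 2016-era lf.swift API from the linked LiveViewController; the URL and stream name are placeholders):

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(rtmpConnection: rtmpConnection)

// capture the app's screen instead of a camera
rtmpStream.attachScreen(ScreenCaptureSession())

rtmpConnection.connect("rtmp://localhost/appName/instanceName") // placeholder URL
rtmpStream.publish("streamName")                                // placeholder stream name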

@pzs7602 (Author) commented Apr 21, 2016

Thanks! It works. Can rtmpStream attach both the camera and screen capture?

@shogo4405 (Owner)

The library doesn't have this feature built in. I think you can create it with a custom VisualEffect plugin:

// example: custom VisualEffect that composites the camera image over the screen capture
final class CameraMixEffect: VisualEffect, AVCaptureVideoDataOutputSampleBufferDelegate {
    let filter:CIFilter? = CIFilter(name: "CISourceOverCompositing")
    var camera:CIImage?
    let lockQueue:dispatch_queue_t = dispatch_queue_create(
        "CameraMixEffect.lock", DISPATCH_QUEUE_SERIAL
    )

    override func execute(image: CIImage) -> CIImage {
        guard let filter:CIFilter = filter else {
            return image
        }
        // Read the latest camera frame on the same serial queue the delegate writes on.
        var latest:CIImage?
        dispatch_sync(lockQueue) {
            latest = self.camera
        }
        guard let camera:CIImage = latest else {
            return image
        }
        filter.setValue(camera, forKey: "inputImage")
        filter.setValue(image, forKey: "inputBackgroundImage")
        return filter.outputImage ?? image
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        // Runs on lockQueue because the delegate is installed with that queue below.
        guard let imageBuffer:CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }
        camera = CIImage(CVPixelBuffer: imageBuffer)
    }
}

// next: wire the camera output to the effect and register the effect on the stream
let session = AVCaptureSession()
let videoOutput = AVCaptureVideoDataOutput()
session.addInput(try! AVCaptureDeviceInput(device: AVMixer.deviceWithPosition(.Back)))
session.addOutput(videoOutput)

let effect = CameraMixEffect()
// deliver camera frames on the effect's serial lock queue
videoOutput.setSampleBufferDelegate(effect, queue: effect.lockQueue)
session.startRunning()

let stream:RTMPStream = RTMPStream()
stream.registerEffect(effect)
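
Note that because the sample-buffer delegate is installed with effect.lockQueue, the writes to camera in captureOutput and the dispatch_sync read in execute land on the same serial queue, so the composite never reads a half-updated frame.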

@pzs7602 (Author) commented Apr 26, 2016

I tried this, but the camera image isn't shown over the device screen. Perhaps the camera image's position and size must be set properly?
I can get the image data from the didOutputSampleBuffer method and display it on the screen, which is also what I need.
Anyway, thank you very much!
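
For the positioning question above, one possible fix is to transform the camera image before handing it to the compositing filter. A sketch of execute (the 30% scale and 16pt margin are illustrative values, and the dispatch_sync read from the earlier snippet is omitted for brevity):

override func execute(image: CIImage) -> CIImage {
    guard let filter:CIFilter = filter, camera:CIImage = camera else {
        return image
    }
    // Shrink the camera frame and pin it to the top-left corner of the screen image.
    let scaled:CIImage = camera.imageByApplyingTransform(
        CGAffineTransformMakeScale(0.3, 0.3)
    )
    let positioned:CIImage = scaled.imageByApplyingTransform(
        CGAffineTransformMakeTranslation(16, image.extent.height - scaled.extent.height - 16)
    )
    filter.setValue(positioned, forKey: "inputImage")
    filter.setValue(image, forKey: "inputBackgroundImage")
    return filter.outputImage ?? image
}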

@tatuanfpt

Does this solution still work?
Consider adding some UIViews from the screen as a layer over the device camera feed, for better quality compared with recording the whole screen (which results in high CPU usage and low FPS).
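
That direction could reuse the VisualEffect approach above: snapshot the UIView and composite it over the camera frames instead of capturing the whole screen. A rough sketch, in the same Swift 2 style as the thread (call it on the main thread):

// Render a UIView into a CIImage so it can be fed to CISourceOverCompositing.
func snapshot(view: UIView) -> CIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context:CGContext = UIGraphicsGetCurrentContext() else {
        return nil
    }
    view.layer.renderInContext(context)
    guard let cgImage:CGImage = UIGraphicsGetImageFromCurrentImageContext()?.CGImage else {
        return nil
    }
    return CIImage(CGImage: cgImage)
}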
