Timer Label Overlay #43

Closed

MelnykovDenys opened this issue Apr 19, 2022 · 10 comments

MelnykovDenys commented Apr 19, 2022

Hi!
Thanks for the awesome utilities!

Could you please clarify how I can add a timer label (e.g. a UILabel with a custom color and font) on top of the camera view, update its text every second, add a UIImage, and eventually record a video that includes these overlays?

MelnykovDenys changed the title from "Timer Label" to "Timer Label Overlay" on Apr 19, 2022

YuAo (Member) commented Apr 20, 2022

Can you be more specific? Are you asking how to add UI on top of the recorder view, or how to record some UI elements into the video?

MelnykovDenys (Author) commented Apr 20, 2022

Yes: add an overlay view (a UI element) on top of the recorder view, and at the end get a recording that contains both the camera video and the overlay elements on top.

YuAo (Member) commented Apr 20, 2022

If the UI/video clock sync is not your concern, you can use the utility below to snapshot your view's content into a CVPixelBuffer.

You can then use this CVPixelBuffer to create an MTIImage and use that MTIImage as an overlay on the video image. You can refer to the "CameraFilterView" and "MultilayerCompositingFilterView" examples here: https://github.com/MetalPetal/MetalPetal

import UIKit
import CoreVideo
import MetalPetal

/// Snapshot a view using a CGContext that is backed by a CVPixelBuffer from a CVPixelBufferPool.
class ViewSnapshoter {
    private var pixelBufferPool: MTICVPixelBufferPool?
    
    enum Error: String, LocalizedError {
        case cannotCreateCGContext
    }
    
    func snapshot(_ view: UIView, afterScreenUpdates: Bool, renderScale: CGFloat = 2) throws -> CVPixelBuffer {
        let renderWidth = Int(view.frame.width * renderScale)
        let renderHeight = Int(view.frame.height * renderScale)
        // Reuse the cached pool when its dimensions still match; otherwise create a new one.
        let pool: MTICVPixelBufferPool
        if let existingPool = pixelBufferPool, existingPool.pixelBufferWidth == renderWidth, existingPool.pixelBufferHeight == renderHeight {
            pool = existingPool
        } else {
            pool = try MTICVPixelBufferPool(pixelBufferWidth: renderWidth, pixelBufferHeight: renderHeight, pixelFormatType: kCVPixelFormatType_32BGRA, minimumBufferCount: 16)
            self.pixelBufferPool = pool
        }
        let pixelBuffer = try pool.makePixelBuffer(allocationThreshold: 16)
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer {
            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        }
        guard let cgContext = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer), width: renderWidth, height: renderHeight, bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer), space: CGColorSpaceCreateDeviceRGB(), bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue /* 32-bit BGRA */) else {
            throw Error.cannotCreateCGContext
        }
        
        // Apply transform for UIKit coordinate system.
        cgContext.concatenate(CGAffineTransform(translationX: 0, y: CGFloat(renderHeight)))
        cgContext.concatenate(CGAffineTransform(scaleX: renderScale, y: -renderScale))
        
        UIGraphicsPushContext(cgContext)
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: afterScreenUpdates)
        UIGraphicsPopContext()
        
        return pixelBuffer
    }
}
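
Roughly, the composition step could then look like this. A minimal sketch, assuming `snapshoter` is an instance of the ViewSnapshoter above, `overlayView` is the view being captured, and `cameraImage` is an MTIImage created from the camera's pixel buffer:

// A minimal sketch: compose the snapshot over the camera frame.
// `snapshoter`, `overlayView` and `cameraImage` are assumed to exist.
let overlayBuffer = try snapshoter.snapshot(overlayView, afterScreenUpdates: false)
// The snapshot CGContext draws premultiplied BGRA, so mark the image as
// such and convert to straight alpha for the compositing filter.
let overlayImage = MTIImage(cvPixelBuffer: overlayBuffer, alphaType: .premultiplied).unpremultiplyingAlpha()

let compositingFilter = MultilayerCompositingFilter()
compositingFilter.inputBackgroundImage = cameraImage
compositingFilter.layers = [
    MultilayerCompositingFilter.Layer(content: overlayImage)
        .frame(CGRect(origin: .zero, size: cameraImage.size), layoutUnit: .pixel)
        .opacity(1)
        .blendMode(.normal)
]
let composedImage = compositingFilter.outputImage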

MelnykovDenys (Author) commented Apr 20, 2022

Thanks for the answer!
I've tried this approach, but when I try to save I'm getting the error "Unexpected asset record status".

I will try your method.

import UIKit
import VideoIO
import MetalPetal
import AVFoundation
import SnapKit
import RxSwift
import RxCocoa

class MetalPetalViewController: UIViewController {
    
    private let filter = MTIBlendFilter(blendMode: .overlay)
    private var recorder: MovieRecorder?
    private var camera: Camera!

    private let queue: DispatchQueue = DispatchQueue(label: "org.metalpetal.capture")
    private var isRecording = BehaviorRelay<Bool>(value: false)
    private let disposeBag = DisposeBag()
    
    private var cameraOutputView = MTIImageView()
    private let recordButton = UIButton()
    private let overlayView = UIView()
    private let timerLabel = UILabel()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        setupLayout()
        setupCamera()
        
        recordButton.rx.tap
            .bind {
                self.isRecording.value ? self.stopRecording() : self.startRecording()
            }.disposed(by: disposeBag)
        
        isRecording.map { $0 ? UIColor.blue : UIColor.red }
            .bind(to: recordButton.rx.backgroundColor)
            .disposed(by: disposeBag)
        
        let countDown = 100
        Observable<Int>.timer(.seconds(0), period: .seconds(1), scheduler: MainScheduler.instance)
            .take(countDown + 1)
            .subscribe(onNext: { timePassed in
                self.timerLabel.text = "\(timePassed)"
            }, onCompleted: {
                print("count down complete")
            }).disposed(by: disposeBag)
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startRunningCaptureSession()
    }
    
    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        stopRunningCaptureSession()
    }
    
    private func setupLayout() {
        view.addSubview(cameraOutputView)
        cameraOutputView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }
        
        overlayView.backgroundColor = .clear
        view.addSubview(overlayView)
        overlayView.snp.makeConstraints {
            $0.edges.equalToSuperview()
        }
        
        timerLabel.textColor = .white
        timerLabel.font = .systemFont(ofSize: 20)
        overlayView.addSubview(timerLabel)
        timerLabel.snp.makeConstraints {
            $0.center.equalToSuperview()
        }
        
        recordButton.layer.cornerRadius = 25
        view.addSubview(recordButton)
        recordButton.snp.makeConstraints {
            $0.size.equalTo(50)
            $0.bottom.equalToSuperview().inset(60)
            $0.centerX.equalToSuperview()
        }
    }
    
    private func setupCamera() {
        self.camera = Camera(captureSessionPreset: .hd1920x1080, configurator: .portraitFrontMirroredVideoOutput)
        try? camera.enableVideoDataOutput(on: queue, delegate: self)
        try? camera.enableAudioDataOutput(on: queue, delegate: self)
        
        camera.videoDataOutput?.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
    }
    
    private func startRunningCaptureSession() {
        queue.async {
            self.camera.startRunningCaptureSession()
        }
    }
    
    private func stopRunningCaptureSession() {
        queue.async {
            self.camera.stopRunningCaptureSession()
        }
    }
    
    private func startRecording() {
        let sessionID = UUID()
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("\(sessionID.uuidString).mp4")
        let hasAudio = self.camera.audioDataOutput != nil
        do {
            let recorder = try MovieRecorder(url: url, configuration: MovieRecorder.Configuration(hasAudio: hasAudio))
            self.isRecording.accept(true)
            queue.async {
                self.recorder = recorder
            }
        } catch {
            handleError(error)
        }
    }
    
    private func stopRecording() {
        if let recorder = recorder {
            recorder.stopRecording(completion: { error in
                self.isRecording.accept(false)
                if let error = error {
                    self.handleError(error)
                } else {
                    self.handleFinishRecording(videoURL: recorder.url)
                }
            })
            queue.async {
                self.recorder = nil
            }
        }
    }
    
    private func handleFinishRecording(videoURL: URL) {
        navigationController?.pushViewController(VideoPlayerVC(url: videoURL), animated: true)
    }
    
    private func handleError(_ error: Error) {
        let alert = UIAlertController(title: "Error",
                                      message: error.localizedDescription,
                                      preferredStyle: .alert)
        alert.addAction(.init(title: "OK", style: .cancel))
        present(alert, animated: true)
    }
}

extension MetalPetalViewController: AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let formatDescription = sampleBuffer.formatDescription else {
            return
        }
        switch formatDescription.mediaType {
        case .audio:
            do {
                try self.recorder?.appendSampleBuffer(sampleBuffer)
            } catch {
                print(error)
            }
        case .video:
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            filter.inputBackgroundImage = MTIImage(cvPixelBuffer: pixelBuffer,
                                                   alphaType: .nonPremultiplied)
            let inputImage = overlayView.asImage()
            filter.inputImage = .init(image: inputImage)
            DispatchQueue.main.async {
                self.cameraOutputView.image = self.filter.outputImage
            }
        default:
            break
        }
    }
}

extension UIView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            layer.render(in: rendererContext.cgContext)
        }
    }
}

YuAo (Member) commented Apr 20, 2022

I can't see you appending any video buffers to the recorder. The internal writer cannot start without receiving video buffers, hence the "Unexpected asset record status" error.

Also, I don't think you want to use the Overlay blend mode.
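
For the first issue, a minimal sketch of the missing step, reusing the `recorder` property from the code above; the `.video` case has to append the sample buffer as well:

case .video:
    // Without video buffers the internal AVAssetWriter never starts a
    // session, which surfaces as "Unexpected asset record status" on save.
    do {
        try self.recorder?.appendSampleBuffer(sampleBuffer)
    } catch {
        print(error)
    }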

MelnykovDenys (Author) commented

Yeah, you're right, I missed appending the video buffers to the recorder, my bad.
I'm sorry to bother you, but I'm rewriting the project from OpenGL to Metal and I'm a beginner in this area.
I tried writing it without the filter and realized that I need to merge the two buffers, but I couldn't figure out how to do it:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let formatDescription = sampleBuffer.formatDescription else {
        return
    }
    switch formatDescription.mediaType {
    case .audio:
        do {
            try self.recorder?.appendSampleBuffer(sampleBuffer)
        } catch {
            handleError(error)
        }
    case .video:
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        
        DispatchQueue.main.async {
            self.cameraOutputView.image = MTIImage(cvPixelBuffer: pixelBuffer,
                                                   alphaType: .alphaIsOne)
            
            let overlayPixelBuffer = try! self.viewSnapshoter.snapshot(
                self.overlayView,
                afterScreenUpdates: true,
                renderScale: 1
            )
            
            // need to merge?
            
            try? self.recorder?.appendSampleBuffer(
                SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: pixelBuffer)!
            )
        }
    default:
        break
    }
}

YuAo (Member) commented Apr 22, 2022

You need to use a filter to compose the two images together; you can use a MultilayerCompositingFilter or a normal blend filter. Make sure you read the "CameraFilterView" example.
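
A minimal sketch of what a composed `.video` case could look like. It assumes the ViewSnapshoter from above stored as `viewSnapshoter`, a shared MTIContext created once and stored as `renderContext` (e.g. `try MTIContext(device: MTLCreateSystemDefaultDevice()!)`), and a 32BGRA MTICVPixelBufferPool at the camera's dimensions stored as `outputPixelBufferPool`. Note that in a real app the UIKit snapshot has to happen on the main thread:

case .video:
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let cameraImage = MTIImage(cvPixelBuffer: pixelBuffer, alphaType: .alphaIsOne)
    do {
        // Snapshot the overlay view into a premultiplied BGRA pixel buffer,
        // then convert to straight alpha for the compositing filter.
        let overlayBuffer = try viewSnapshoter.snapshot(overlayView, afterScreenUpdates: false, renderScale: 1)
        let overlayImage = MTIImage(cvPixelBuffer: overlayBuffer, alphaType: .premultiplied).unpremultiplyingAlpha()
        
        // Compose the overlay on top of the camera frame.
        let compositingFilter = MultilayerCompositingFilter()
        compositingFilter.inputBackgroundImage = cameraImage
        compositingFilter.layers = [
            MultilayerCompositingFilter.Layer(content: overlayImage)
                .frame(CGRect(origin: .zero, size: cameraImage.size), layoutUnit: .pixel)
                .opacity(1)
                .blendMode(.normal)
        ]
        guard let composedImage = compositingFilter.outputImage else { return }
        
        // Render into a fresh buffer rather than the camera buffer being
        // sampled from, then append it with the original timing information.
        let outputBuffer = try outputPixelBufferPool.makePixelBuffer(allocationThreshold: 16)
        try renderContext.render(composedImage, to: outputBuffer)
        if let outputSampleBuffer = SampleBufferUtilities.makeSampleBufferByReplacingImageBuffer(of: sampleBuffer, with: outputBuffer) {
            try recorder?.appendSampleBuffer(outputSampleBuffer)
        }
        
        // Show the same composed image in the preview.
        DispatchQueue.main.async {
            self.cameraOutputView.image = composedImage
        }
    } catch {
        print(error)
    }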

wesselpeder commented

@MelnykovDenys are you able to share how you got it to work?

MelnykovDenys (Author) commented May 17, 2022

> @MelnykovDenys are you able to share how you got it to work?

Unfortunately not, because it doesn't work.

YuAo closed this as completed May 19, 2022