ReplayKit

Record or stream video from the screen and audio from the app and microphone using ReplayKit.

Posts under ReplayKit tag

42 Posts
Post not yet marked as solved
1 Reply
1.1k Views
There are some problems with the screen recording app I made. The main function is to play audio in a web view, then record my app's video and audio with ReplayKit's RPScreenRecorder.startCapture. With UIWebView everything works and both video and audio are recorded, but after switching to WKWebView the video is still recorded while the recorded audio is always silent.

ReplayKit API:

    open func startCapture(handler captureHandler: ((CMSampleBuffer, RPSampleBufferType, Error?) -> Void)?,
                           completionHandler: ((Error?) -> Void)? = nil)

Demo code:

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let url = Bundle.main.url(forResource: "Media123", withExtension: "mp3")
        self.playerView.loadHTMLString("<!DOCTYPE html><html><body><audio controls autoplay><source src=\"Media123.ogg\" type=\"audio/ogg\" /><source src=\"\(String(describing: url!.absoluteString))\" type=\"audio/mpeg\"></audio></body></html>", baseURL: url)
    }

    // MARK: - Start screen recording
    func startScreenRecording() {
        let sharedRecorder = RPScreenRecorder.shared()
        if sharedRecorder.isAvailable {
            if !sharedRecorder.isRecording {
                if let pathString = self.getScreenRecordDirectoryPath() {
                    sharedRecorder.startCapture(handler: { (sampleBuffer, sampleBufferType, error) in
                        switch sampleBufferType {
                        case .video:
                            NSLog("video")
                            self.append(sampleBuffer: sampleBuffer, sampleBufferType: sampleBufferType)
                        case .audioApp:
                            NSLog("audioApp = \(sampleBuffer)")
                            let data = self.converBufferToData(sampleBuffer: sampleBuffer)
                            NSLog("audioApp data = \(data)")
                            NSLog("audioApp data = \([UInt8](data))")
                            self.append(sampleBuffer: sampleBuffer, sampleBufferType: sampleBufferType)
                        case .audioMic:
                            NSLog("audioMic")
                        @unknown default:
                            break
                        }
                    }, completionHandler: { (error) in
                        if let error = error {
                            NSLog("startCapture completionHandler error = \(error)")
                        } else {
                            do {
                                try self.startWriting(outputFilePathString: pathString)
                            } catch let error {
                                NSLog("startWriting error = \(error)")
                            }
                        }
                    })
                } else {
                    self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: "Path Error", buttonTitle: NSLocalizedString("OK", comment: ""))
                }
            } else {
                self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: NSLocalizedString("Already Recording", comment: ""), buttonTitle: NSLocalizedString("OK", comment: ""))
            }
        } else {
            self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: NSLocalizedString("Unavailable Record", comment: ""), buttonTitle: NSLocalizedString("OK", comment: ""))
        }
    }

Logged output:

    audioApp data = 4096 bytes
    audioApp data = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,....
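One thing worth checking in a setup like the one above is how the WKWebView and the app's audio session are configured before capture starts: ReplayKit's .audioApp buffers carry audio the app itself renders, so playback that never starts (blocked autoplay) or an inactive/oddly routed audio session can produce exactly this all-zero output. The sketch below is an assumption-laden illustration, not a confirmed fix; the function names are hypothetical, and the .mixWithOthers option in particular is something to experiment with rather than settled guidance.

    import AVFoundation
    import WebKit

    // Hypothetical helper: build a WKWebView whose configuration allows
    // <audio autoplay> to actually start playing without a user gesture.
    func makeConfiguredWebView() -> WKWebView {
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true
        // An empty set means no media type requires a user gesture to play.
        config.mediaTypesRequiringUserActionForPlayback = []
        return WKWebView(frame: .zero, configuration: config)
    }

    // Hypothetical helper: activate a playback audio session before calling
    // RPScreenRecorder.shared().startCapture, so app audio is routed through
    // an active session while recording.
    func activateAudioSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, options: [.mixWithOthers])
        try session.setActive(true)
    }

If the web view's audio is audible through the speaker during capture but the .audioApp buffers are still silent, that points at capture-side routing rather than playback, which narrows the search considerably.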