ReplayKit


Record or stream video from the screen and audio from the app and microphone using ReplayKit.

ReplayKit Documentation

Posts under ReplayKit tag

41 Posts
Post marked as solved
5 Replies
2k Views
Hi, I am trying to record the screen and microphone inside my application using ReplayKit. It records everything without problems, and at the end I present the preview through this code:

recorder.stopRecording { [weak self] (preview, error) in
    guard let preview = preview else {
        print("Preview not available")
        return
    }
    preview.previewControllerDelegate = self
    self?.present(preview, animated: true, completion: nil)
}

The problem is that the RPPreviewViewController obtained in the callback does not respect the safe area on iPhone X. The buttons are too close to the edges and the title is hidden below the notch. I also tried to embed the RPPreviewViewController in another UIViewController (a simple wrapper), but the height seems to be fixed, so if I lower the top part, the bottom one goes off the screen. Any clue what I am doing wrong? Any help is appreciated. Thanks!
Posted
by
Post not yet marked as solved
3 Replies
2.9k Views
Has anyone figured this out? It seems ReplayKit 2 always delivers frames at a fixed resolution of 886 x 1918. I've tried various solutions to rotate the screen, but so far none work. Using the code below crashes in the rotate step, vImageRotate90_ARGB8888(&srcBuffer, &destBuffer, factor, &color, vImage_Flags(0)):

let flags = CVPixelBufferLockFlags(rawValue: 0)
guard kCVReturnSuccess == CVPixelBufferLockBaseAddress(srcPixelBuffer, flags) else { return nil }
defer { CVPixelBufferUnlockBaseAddress(srcPixelBuffer, flags) }
guard let srcData = CVPixelBufferGetBaseAddress(srcPixelBuffer) else {
    print("Error: could not get pixel buffer base address")
    return nil
}
let sourceWidth = CVPixelBufferGetWidth(srcPixelBuffer)
let sourceHeight = CVPixelBufferGetHeight(srcPixelBuffer)
var destWidth = sourceHeight
var destHeight = sourceWidth
var color = UInt8(0)
if factor % 2 == 0 {
    destWidth = sourceWidth
    destHeight = sourceHeight
}
let srcBytesPerRow = CVPixelBufferGetBytesPerRow(srcPixelBuffer)
var srcBuffer = vImage_Buffer(data: srcData,
                              height: vImagePixelCount(sourceHeight),
                              width: vImagePixelCount(sourceWidth),
                              rowBytes: srcBytesPerRow)
let destBytesPerRow = destWidth * 4
guard let destData = malloc(destHeight * destBytesPerRow) else {
    print("Error: out of memory")
    return nil
}
var destBuffer = vImage_Buffer(data: destData,
                               height: vImagePixelCount(destHeight),
                               width: vImagePixelCount(destWidth),
                               rowBytes: destBytesPerRow)
let error = vImageRotate90_ARGB8888(&srcBuffer, &destBuffer, factor, &color, vImage_Flags(0))
if error != kvImageNoError {
    print("Error:", error)
    free(destData)
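One thing worth checking in the code above (an assumption on my part, not a confirmed diagnosis): vImageRotate90_ARGB8888 expects its rotationConstant in the range 0...3, and the destination buffer must have width and height swapped for odd constants. If `factor` is derived from something like a device-orientation raw value, it can fall outside that range and the buffer geometry no longer matches. A minimal sketch of validating the geometry before the vImage call; `destinationSize` is a hypothetical helper, not part of Accelerate:

```swift
// Hypothetical pre-flight check for vImageRotate90_ARGB8888.
// rotationConstant must be 0...3 (each step is a 90-degree rotation);
// odd constants swap width and height. Returns the size the destination
// vImage_Buffer must be allocated with, or nil for an invalid constant.
func destinationSize(srcWidth: Int, srcHeight: Int,
                     rotationConstant: Int) -> (width: Int, height: Int)? {
    guard (0...3).contains(rotationConstant) else { return nil }
    return rotationConstant % 2 == 0
        ? (width: srcWidth, height: srcHeight)   // 0 or 180 degrees
        : (width: srcHeight, height: srcWidth)   // 90 or 270 degrees
}
```

Allocating destBuffer with exactly this size (and destBytesPerRow = width * 4) should at least rule out a geometry mismatch as the cause of the crash.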
Posted
by
Post not yet marked as solved
4 Replies
1.7k Views
Hello, our use case is screen sharing in a live video call. We use a broadcast extension to capture the screen and send frames. The broadcast extension has a hard limit of 50MB. Screen sharing works great on iPhones, but on iPad ReplayKit delivers larger screens, and as a result the extension's memory usage goes beyond 50MB. While using the profiler we noticed that the memory used by our code is <25MB, but on iPad ReplayKit has memory spikes that push usage beyond the 50MB limit. How should I achieve the screen sharing use case on iPads? What is the guideline? Any suggestion/help is appreciated. Best, Piyush
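There is no official guideline quoted here, but a common mitigation (an assumption, not Apple guidance) is to downscale iPad frames before encoding or sending, since the extension pays for every full-resolution buffer it holds. The sizing arithmetic can be sketched as below; the 1280-pixel cap is an arbitrary choice and `downscaledSize` is a hypothetical helper — the actual resize would be done with vImage or Core Image:

```swift
// Compute a downscaled size whose longer side is at most `maxDimension`,
// preserving aspect ratio and rounding down to even values (video
// encoders generally expect even dimensions). The 1280 default is an
// assumption, not an Apple-recommended figure.
func downscaledSize(width: Int, height: Int,
                    maxDimension: Int = 1280) -> (width: Int, height: Int) {
    let longest = max(width, height)
    guard longest > maxDimension else { return (width, height) }
    let scale = Double(maxDimension) / Double(longest)
    let w = Int(Double(width) * scale) & ~1   // mask off the low bit -> even
    let h = Int(Double(height) * scale) & ~1
    return (w, h)
}
```

Applying this per frame in processSampleBuffer keeps the working set proportional to the scaled size rather than the native iPad screen size.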
Posted
by
Post not yet marked as solved
1 Reply
1.1k Views
Hi, we have developed the EnxRTCiOS framework, which works perfectly when we import it into any app target. But when I do the same in my broadcast extension "SampleHandler", it gives the errors "Could not build Objective-C module 'EnxRTCiOS'" and "'RTCCameraVideoCapturer' is unavailable: not available on iOS (App Extension) - Camera not available in app extensions." I have cleaned Xcode many times, deleted derived data many times, removed the Podfile and reinstalled, but it is still not working. Kindly help me with this.
Posted
by
Post not yet marked as solved
0 Replies
441 Views
Hello, I'm working on an app where the user draws on the screen while it is recorded. But I want to be able to hide some parts, for example when they change colors or present some view. My idea was to simply pause the recording when they change the color, then resume it when they are done. Is that possible? It seems there is no pause/resume method on RPScreenRecorder. Am I missing something?
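RPScreenRecorder indeed has no pause/resume. One workaround (an assumption, not an Apple-documented pattern) is to record with startCapture and an AVAssetWriter instead of startRecording: while "paused", drop the incoming sample buffers, and after resuming subtract the accumulated paused time from later presentation timestamps so the written file has no gap. The bookkeeping can be sketched in plain Swift; real code would use CMTime values from CMSampleBufferGetPresentationTimeStamp rather than Double seconds:

```swift
// Timestamp bookkeeping for an emulated pause. While paused, buffers are
// dropped (adjustedTime returns nil); on resume, the paused interval is
// added to an offset that is subtracted from every later timestamp, so
// the output timeline stays continuous.
struct PauseClock {
    private(set) var isPaused = false
    private var pauseStart: Double = 0
    private var pausedTotal: Double = 0

    mutating func pause(at t: Double) {
        guard !isPaused else { return }
        isPaused = true
        pauseStart = t
    }

    mutating func resume(at t: Double) {
        guard isPaused else { return }
        isPaused = false
        pausedTotal += t - pauseStart
    }

    // Adjusted timestamp to hand to the asset writer, or nil to drop.
    func adjustedTime(for t: Double) -> Double? {
        isPaused ? nil : t - pausedTotal
    }
}
```

In the capture handler you would call adjustedTime with each buffer's presentation timestamp and skip the append when it returns nil.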
Posted
by
Post not yet marked as solved
1 Reply
1.1k Views
There are some problems with the screen recording in my app. The main function is to use a web view to play audio, and then use ReplayKit's RPScreenRecorder.startCapture to record video and audio of my app. When I use UIWebView, everything works well and both video and audio are recorded, but after changing it to WKWebView, video can still be recorded while the recorded app audio is always silence.

ReplayKit API:

open func startCapture(handler captureHandler: ((CMSampleBuffer, RPSampleBufferType, Error?) -> Void)?, completionHandler: ((Error?) -> Void)? = nil)

Demo code:

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    let url = Bundle.main.url(forResource: "Media123", withExtension: "mp3")
    self.playerView.loadHTMLString("<!DOCTYPE html><html><body><audio controls autoplay><source src=\"Media123.ogg\" type=\"audio/ogg\" /><source src=\"\(String(describing: url!.absoluteString))\" type=\"audio/mpeg\"></audio></body></html>", baseURL: url)
}

// MARK: - start ScreenRecording
func startScreenRecording() {
    let sharedRecorder = RPScreenRecorder.shared()
    if sharedRecorder.isAvailable {
        if !sharedRecorder.isRecording {
            if let pathString = self.getScreenRecordDirectoryPath() {
                sharedRecorder.startCapture(handler: { (sampleBuffer, sampleBufferType, error) in
                    switch sampleBufferType {
                    case .video:
                        NSLog("video")
                        self.append(sampleBuffer: sampleBuffer, sampleBufferType: sampleBufferType)
                    case .audioApp:
                        NSLog("audioApp = \(sampleBuffer)")
                        let data = self.converBufferToData(sampleBuffer: sampleBuffer)
                        NSLog("audioApp data = \(data)")
                        NSLog("audioApp data = \([UInt8](data))")
                        self.append(sampleBuffer: sampleBuffer, sampleBufferType: sampleBufferType)
                    case .audioMic:
                        NSLog("audioMic")
                    @unknown default:
                        break
                    }
                }, completionHandler: { (error) in
                    if let error = error {
                        NSLog("startCapture completionHandler error = \(error)")
                    } else {
                        do {
                            try self.startWriting(outputFilePathString: pathString)
                        } catch let error {
                            NSLog("startWriting error = \(error)")
                        }
                    }
                })
            } else {
                self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: "Path Error", buttonTitle: NSLocalizedString("OK", comment: ""))
            }
        } else {
            self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: NSLocalizedString("Already Recording", comment: ""), buttonTitle: NSLocalizedString("OK", comment: ""))
        }
    } else {
        self.showSimpleAlert(viewController: self, title: NSLocalizedString("Screen Recording", comment: ""), message: NSLocalizedString("Unavailable Record", comment: ""), buttonTitle: NSLocalizedString("OK", comment: ""))
    }
}

Printed output:

audioApp data = 4096 bytes
audioApp data = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,....
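A plausible (unconfirmed) explanation: WKWebView plays media in a separate content process, so the .audioApp stream may only capture the host app's own audio, which is silent. Rather than eyeballing the raw dump, the silence can be detected programmatically. `isEffectivelySilent` is a hypothetical helper, and treating the bytes as little-endian 16-bit PCM is an assumption about the buffer format:

```swift
// Interpret the raw bytes as little-endian 16-bit PCM samples and report
// whether the buffer is effectively silent (every sample's magnitude is
// at or below a small threshold) - i.e. the [0, 0, 0, ...] dump above.
func isEffectivelySilent(_ bytes: [UInt8], threshold: Int16 = 4) -> Bool {
    var i = 0
    while i + 1 < bytes.count {
        // Assemble one little-endian 16-bit sample from two bytes.
        let sample = Int16(bitPattern: UInt16(bytes[i]) | (UInt16(bytes[i + 1]) << 8))
        if abs(Int(sample)) > Int(threshold) { return false }
        i += 2
    }
    return true
}
```

Running this on the `[UInt8](data)` array from the .audioApp case would confirm whether every captured buffer is silent, not just the first one logged.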
Post not yet marked as solved
2 Replies
582 Views
I have a simple macOS app with two windows. I would like the user to be able to record each window using a button on that window. I can record the main window and save it to a file using ReplayKit - so far, so good. But I can only record the main window - even if I have all the ReplayKit code in the second NSViewController, it's the main window that's being recorded. Is it possible to tell ReplayKit which window to record somehow?
Posted
by
jpx
Post not yet marked as solved
1 Reply
415 Views
I use ReplayKit to record the screen in my app, but I want to detect whether the user is using Record Screen from Control Center or a third-party app. I tried UIScreen.capturedDidChangeNotification, but it doesn't tell me whether I'm recording the screen in my app or the user is using Record Screen from Control Center. Do you have any ideas?
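As far as I know there is no API that names the capture source directly. A common heuristic (an assumption, not an official mechanism) is to combine UIScreen.main.isCaptured with a flag your app sets around its own RPScreenRecorder start/stop calls; any capture you didn't start is treated as external. Caveats: isCaptured is also true during AirPlay mirroring, and the heuristic can't distinguish the case where both recordings run at once. The decision logic sketched:

```swift
// Classify who is capturing the screen, given UIScreen.main.isCaptured
// and a flag our app maintains around its own RPScreenRecorder calls.
enum CaptureSource {
    case notCaptured   // nothing is capturing the screen
    case thisApp       // our own in-app recording session
    case external      // Control Center recording, another app, or mirroring
}

func captureSource(isCaptured: Bool, weAreRecording: Bool) -> CaptureSource {
    switch (isCaptured, weAreRecording) {
    case (false, _):     return .notCaptured
    case (true, true):   return .thisApp
    case (true, false):  return .external
    }
}
```

You would evaluate this inside the capturedDidChangeNotification observer, with `weAreRecording` set to true just before starting your own recording and cleared after stopping it.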
Posted
by
Post not yet marked as solved
0 Replies
308 Views
I'm having an issue where I can only use ReplayKit successfully once; then I have to restart the app to get it to work again. The first time I start recording and then stop, it works. The second time, one of two things happens: I get a black screen in my preview, or I get the following error message in stopRecording(): "Failed due to failure to process the first sample buffer". After that, if I try to call startRecording again I get the error "Recording failed to start", which then repeats until I restart my app. One other thing to note: the only time I get the alert asking for approval to record is the first time I use ReplayKit. The alert doesn't show again until the first use after the app is restarted. Here are the functions I'm using:

func didStartRecording() {
    self.startRecording()
}

@objc func startRecording() {
    let recorder = RPScreenRecorder.shared()
    recorder.startRecording { [unowned self] (error) in
        if let unwrappedError = error {
            log.error("Error trying to record using ReplayKit: \(unwrappedError.localizedDescription)")
            return
        }

        recordingView.backgroundColor = UIColor.red
        self.view.addSubview(recordingView)
        recordingView.snp.makeConstraints { make in
            make.top.left.right.equalToSuperview()
            make.height.equalTo(50)
        }

        let recordingLabel = InstrLabel()
        recordingLabel.text = "Recording....Tap to stop"
        recordingLabel.textColor = .white
        recordingView.addSubview(recordingLabel)
        recordingLabel.snp.makeConstraints { make in
            make.width.height.equalToSuperview().multipliedBy(0.9)
            make.centerX.centerY.equalToSuperview()
        }

        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(stopRecording))
        recordingLabel.isUserInteractionEnabled = true
        recordingLabel.addGestureRecognizer(tapGestureRecognizer)
    }
}

@objc func stopRecording() {
    let recorder = RPScreenRecorder.shared()
    recorder.stopRecording { [unowned self] (preview, error) in
        DispatchQueue.main.async {
            recordingView.removeFromSuperview()
        }

        if let error = error {
            log.error("Error with stopping the recording: \(error.localizedDescription)")
        }

        if let unwrappedPreview = preview {
            unwrappedPreview.previewControllerDelegate = self
            self.pushViewController(unwrappedPreview, animated: true)
        }
    }
}

func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
    self.popViewController(animated: true)
}

I'm using an iPad 5th generation on iOS 14.7.1. Is there any way to reset ReplayKit so I can do another recording immediately without restarting the app? I've seen a few other threads about the "black screen", but none of them had any solutions that I could find.
Posted
by
Post not yet marked as solved
0 Replies
816 Views
I am developing an app that sends pixel buffers from a Broadcast Upload Extension to OpenTok. When I run my broadcast extension it hits its memory limit in seconds. I have been looking for ways to reduce the size and scale of the CMSampleBuffers, and settled on first converting them to CIImage, then scaling them, then converting them to CVPixelBuffers to send to the OpenTok servers. Unfortunately, the extension still crashes, even though I reduced the pixel buffer. My code follows. First I convert the CMSampleBuffer to a CVPixelBuffer in the processSampleBuffer function of the sample handler, then pass the CVPixelBuffer to my function along with the timestamps. There I convert the CVPixelBuffer to a CIImage and scale it using a CIFilter (CILanczosScaleTransform). After that, I generate a pixel buffer from the CIImage using a pixel buffer pool and a CIContext, and then send the new buffer to the OpenTok servers using videoCaptureConsumer.

func processPixelBuffer(pixelBuffer: CVPixelBuffer, timeStamp ts: CMTime) {
    guard let ciImage = self.scaleFilterImage(inputImage: pixelBuffer.cmIImage, withAspectRatio: 1.0, scale: CGFloat(kVideoFrameScaleFactor)) else { return }
    if self.pixelBufferPool == nil || self.pixelBuffer?.size != pixelBuffer.size {
        self.destroyPixelBuffers()
        self.updateBufferPool(newWidth: Int(ciImage.extent.size.width), newHeight: Int(ciImage.extent.size.height))
        guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, self.pixelBufferPool, &self.pixelBuffer) == kCVReturnSuccess else { return }
    }
    context?.render(ciImage, to: pixelBuffer)
    self.videoCaptureConsumer?.consumeImageBuffer(pixelBuffer, orientation: .up, timestamp: ts, metadata: nil)
}

If the pixelBufferPool is nil or there is a change in the size of the pixelBuffer, I update the pool:

private func updateBufferPool(newWidth: Int, newHeight: Int) {
    let pixelBufferAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: UInt(self.videoFormat),
        kCVPixelBufferWidthKey as String: newWidth,
        kCVPixelBufferHeightKey as String: newHeight,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:]
    ]
    CVPixelBufferPoolCreate(nil, nil, pixelBufferAttributes as NSDictionary?, &pixelBufferPool)
}

This is the function I use to scale the CIImage:

func scaleFilterImage(inputImage: CIImage, withAspectRatio aspectRatio: CGFloat, scale: CGFloat) -> CIImage? {
    scaleFilter?.setValue(inputImage, forKey: kCIInputImageKey)
    scaleFilter?.setValue(scale, forKey: kCIInputScaleKey)
    scaleFilter?.setValue(aspectRatio, forKey: kCIInputAspectRatioKey)
    return scaleFilter?.outputImage
}

My question is why it still keeps crashing, and whether there is another way to reduce the CVPixelBuffer size without hitting the memory limit. I would appreciate any help on this. Swift or Objective-C, I am open to all suggestions.
Posted
by
Post not yet marked as solved
0 Replies
271 Views
While I'm sharing my screen, I get an error message: "Live broadcast to: has stopped due to: Attempted to start an invalid broadcast session". Could I know what the cause of this error is, and what I can do to fix it in the future?
Posted
by
Post not yet marked as solved
0 Replies
336 Views
I am simply trying to capture the screen, app audio, and mic audio. The app audio and mic audio work fine independently, but when combined some unknown error is thrown. Following are the methods to start the capture of the screen, and processSampleBuffer.

Method to set up the writers for screen capture:

func startCapture() {
    _filename = UUID().uuidString
    let videoPath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.appendingPathComponent("\(_filename).mp4")
    let writer = try! AVAssetWriter(outputURL: videoPath, fileType: .mp4)
    let screen = UIScreen.main.bounds
    let screenBounds = screen.size
    let videoCompressionPropertys = [
        AVVideoAverageBitRateKey: screenBounds.width * screenBounds.height * 10.1
    ]
    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: screenBounds.width,
        AVVideoHeightKey: screenBounds.height,
        AVVideoCompressionPropertiesKey: videoCompressionPropertys
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    input.expectsMediaDataInRealTime = true
    if writer.canAdd(input) {
        writer.add(input)
    }

    // Add the app audio input
    var acl = AudioChannelLayout()
    memset(&acl, 0, MemoryLayout<AudioChannelLayout>.size)
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono
    let audioOutputSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 1,
        AVEncoderBitRateKey: 128000,
        AVChannelLayoutKey: Data(bytes: &acl, count: MemoryLayout<AudioChannelLayout>.size)
    ]
    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioOutputSettings)
    audioInput.expectsMediaDataInRealTime = true
    if writer.canAdd(audioInput) {
        writer.add(audioInput)
    }

    // Add the mic audio input
    let audioOutputSettings1: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 24000,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    let micAudioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioOutputSettings1)
    micAudioInput.expectsMediaDataInRealTime = true
    if writer.canAdd(micAudioInput) {
        writer.add(micAudioInput)
    }

    writer.startWriting()
    _audioAssetWriterInput = audioInput
    _micAssetWriterInput = micAudioInput
    _assetWriterInput = input
    _assetWriter = writer
}

processSampleBuffer:

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    if startTime == nil {
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        _assetWriter!.startSession(atSourceTime: CMTime.zero)
    }
    if sampleBufferType == RPSampleBufferType.video {
        if let _assetWriterInput = _assetWriterInput, _assetWriterInput.isReadyForMoreMediaData {
            let appended = _assetWriterInput.append(sampleBuffer)
            print(appended)
            if !appended {
                let status = _assetWriter?.status
                let error = _assetWriter?.error
                print("cannot append video")
            }
        }
    }
    if sampleBufferType == RPSampleBufferType.audioApp {
        if let _assetWriterInput = _audioAssetWriterInput, _assetWriterInput.isReadyForMoreMediaData {
            let appended = _assetWriterInput.append(sampleBuffer)
            print(appended)
            if !appended {
                let status = _assetWriter?.status
                let error = _assetWriter?.error
                print("cannot append app audio")
            }
        }
    }
    if sampleBufferType == RPSampleBufferType.audioMic {
        if let _assetWriterInput = _micAssetWriterInput, _assetWriterInput.isReadyForMoreMediaData {
            let appended = _assetWriterInput.append(sampleBuffer)
            print(appended)
            if !appended {
                let status = _assetWriter?.status
                let error = _assetWriter?.error
                print("cannot append mic audio")
            }
        }
    }
    if shouldEnd {
        _finishWriters()
    }
}
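One detail in processSampleBuffer above looks suspicious (a guess, not a confirmed diagnosis): the session is started at CMTime.zero, but the buffers appended afterwards carry their original host-clock presentation timestamps, so every track begins far past the session start. The usual pattern is either to call startSession(atSourceTime:) with the first buffer's own timestamp, or to re-stamp each buffer relative to the first one (with CMSampleBufferCreateCopyWithNewTiming). The re-basing bookkeeping, sketched with Double seconds standing in for CMTime:

```swift
// Re-base presentation timestamps so the movie timeline starts at zero.
// The first timestamp seen becomes the origin; every later timestamp is
// expressed relative to it. In the real extension this would use CMTime.
struct TimelineRebaser {
    private var firstTimestamp: Double?

    mutating func rebased(_ pts: Double) -> Double {
        if firstTimestamp == nil { firstTimestamp = pts }
        return pts - firstTimestamp!
    }
}
```

With re-based timestamps, starting the session at zero is consistent with what the writer inputs receive for all three tracks.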
Posted
by
Post not yet marked as solved
0 Replies
254 Views
My goal is to implement a screen sharing function like Skype, Discord, etc. My biggest problem in that implementation is that when you have an active RPScreenRecorder.shared() screen recording and then leave the app, the recording stops. For my goal it is necessary to be able to share the screen even when the app is in the background. For example: the local user shares his current screen with another user, then exits the app to show the other user his photo library in real time. Problem 1: I have come across ReplayKit and RPBroadcastSampleHandler, but this extension is only called when I explicitly run the app with the "ScreenShare" scheme that I created. When I try to run the app with the normal/main scheme, the extension functions aren't called. Problem 2: I can't call my pod functionality from the RPBroadcastSampleHandler class. It just says "-PodName- not found" when I try to import the pod I need to use. There is no full documentation on this topic, and I would really appreciate anybody who can explain to me what I am doing wrong.
Posted
by
Post not yet marked as solved
0 Replies
249 Views
On iOS 15.1, if users set Settings > Notifications > Screen Sharing > Notifications Off, our application cannot display notifications during screen recording. I find this setting very useful, but it means there is no way to display important messages, such as a warning that there is no audio. Is there a way around this with any kind of notification?
Posted
by
Post not yet marked as solved
0 Replies
223 Views
I have a Broadcast Extension that's running a system-wide screen capture. I can start the broadcast and receive buffers and the like. Buffers are successfully getting to our backend and distributed to all the peers (WebRTC). The problem is when I navigate from the host app to apps that play video with audio, and then try to go landscape: sometimes during the rotation, the system sends a broadcastFinished() message to my RPBroadcastSampleHandler subclass. It doesn't occur all the time, but when it does, the rotation glitches some. Hard to explain, but I've noticed that it only sends the finished message when it glitches. I would love to get a screen grab of it occurring, but my capture is running. A quick Google/forum search has revealed not much help. Has anyone heard of such a thing?
Posted
by
Post marked as solved
1 Reply
302 Views
I'm developing an application that will use ReplayKit's Broadcast API to stream the screen of an iOS device. For that I'll need to create the client app and the broadcaster app. The WWDC videos about ReplayKit mention WWDC 2016 session 601, "Go Live with ReplayKit", but I can't find it. Can someone point me to that video or to its transcript? Also, any other resources about the Broadcast API will be appreciated. Most of the samples and documentation that I've found on the web are just for streaming the app's own screen. Thank you.
Posted
by
Post not yet marked as solved
2 Replies
352 Views
I'm working on a project that will use iOS (and macOS) full-screen broadcasting. For this I'm doing my research into ReplayKit and working on a proof-of-concept app. I'm able to broadcast the screen to Mobcrush, but the view of RPBroadcastActivityViewController looks all wrong, with all the icons missing. Why? On my device I have installed Twitch, Mobcrush, and YouTube, and I don't know why Twitch doesn't appear here. If I start recording the screen from Control Center, then I can stream to Twitch. I'm using an iPhone 7 with iOS 15.1. Any help will be greatly appreciated.
Posted
by