ReplayKit


Record or stream video from the screen and audio from the app and microphone using ReplayKit.

Posts under ReplayKit tag

12 Posts
iOS to Android H264 encoding issue.
I'm trying to cast the screen from an iOS device to an Android device. I'm using ReplayKit on iOS to capture the screen and VideoToolbox to compress the captured video into H.264 from CMSampleBuffers. Both the iOS and Android sides are configured for H.264 compression and decompression. While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android. Data transmission over the TCP socket appears to be working correctly.

My question: is there a way to ensure a common container format for H.264 compression and decompression across the iOS and Android platforms?

iOS sender details:

- Device: iPhone 13 mini running iOS 17
- Development environment: Xcode 15 with a minimum deployment target of iOS 16
- Screen capture: ReplayKit, yielding CMSampleBuffers
- Video compression: VideoToolbox, H.264
- Compression properties:
  - kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
  - kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
  - kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
  - kVTCompressionPropertyKey_RealTime: true (real-time encoding)
  - kVTCompressionPropertyKey_Quality: 1 (maximum quality; the scale runs from 0.0 to 1.0)
- NAL unit handling: a custom header is added to each NAL unit

Android receiver details:

- Device: Redmi 7A running Android 10
- Video decoding: MediaCodec API for receiving and decoding the H.264 stream
Replies: 0 · Boosts: 0 · Views: 144 · Activity: 2w
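The "not in avi mode" symptom is consistent with a bitstream-format mismatch rather than a container issue: VideoToolbox emits AVCC-formatted buffers (each NAL unit prefixed with a big-endian length field), while MediaCodec on Android generally expects Annex B (start-code prefixes). A minimal sketch of the conversion, assuming the VideoToolbox default of 4-byte length prefixes:

```swift
import Foundation

// Converts one AVCC blob (length-prefixed NAL units, as produced by
// VideoToolbox) into Annex B (00 00 00 01 start-code prefixes, as
// MediaCodec typically expects). Returns nil on a truncated buffer.
func avccToAnnexB(_ avcc: Data, lengthFieldSize: Int = 4) -> Data? {
    var out = Data()
    var offset = 0
    let startCode: [UInt8] = [0x00, 0x00, 0x00, 0x01]
    while offset + lengthFieldSize <= avcc.count {
        // Read the big-endian NAL unit length.
        var nalLength = 0
        for i in 0..<lengthFieldSize {
            nalLength = (nalLength << 8) | Int(avcc[avcc.startIndex + offset + i])
        }
        offset += lengthFieldSize
        guard offset + nalLength <= avcc.count else { return nil } // truncated
        out.append(contentsOf: startCode)
        out.append(avcc.subdata(in: (avcc.startIndex + offset)..<(avcc.startIndex + offset + nalLength)))
        offset += nalLength
    }
    return offset == avcc.count ? out : nil
}
```

The decoder also needs the SPS/PPS parameter sets, which live in the sample buffer's CMVideoFormatDescription (retrievable with CMVideoFormatDescriptionGetH264ParameterSetAtIndex) rather than in the AVCC payload; sending those as Annex B NAL units ahead of each keyframe is usually required for MediaCodec to configure itself.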
ReplayKit broadcast finishing unexpectedly: Attempted to start an invalid broadcast session
I'm working on a live screen broadcasting app that lets users record their screen and save an MP4 video. I write the video file with AVAssetWriter, and it works fine. But when only 1–2 GB of storage space remains on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video files cannot be played because assetWriter.finishWriting() is never called.

Occurs on these devices:

- iPhone SE 3
- iPhone 12 Pro Max
- iPhone 13
- iPad 19
- iPad Air 5

I have tried setting movieFragmentInterval on AVAssetWriter to write movie fragments, and setting shouldOptimizeForNetworkUse to both true and false, but neither helped; the video still cannot be played. How can I observe or catch this error? Thanks!
Replies: 0 · Boosts: 0 · Views: 288 · Activity: Jan ’24
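Since the failures cluster around low storage, one mitigation is to check free space before starting a session and refuse to record below a floor. A sketch of that pre-flight check, assuming Foundation only; the 2 GB threshold is an illustrative assumption, not a documented limit:

```swift
import Foundation

// Reads the file system's free space for the given path (defaults to the
// app's home directory). Returns nil if the attributes can't be read.
func availableDiskSpaceBytes(at path: String = NSHomeDirectory()) -> Int64? {
    guard let attrs = try? FileManager.default.attributesOfFileSystem(forPath: path),
          let free = (attrs[.systemFreeSize] as? NSNumber)?.int64Value else {
        return nil
    }
    return free
}

// Pure policy check, split out so it is easy to test: require at least
// `minimumBytes` (default 2 GiB, an assumed safety floor) before recording.
func hasEnoughSpaceToRecord(freeBytes: Int64,
                            minimumBytes: Int64 = 2 * 1_073_741_824) -> Bool {
    freeBytes >= minimumBytes
}
```

For catching the failure itself, the place to look after a rejected append is assetWriter.status: once it becomes .failed, assetWriter.error carries the underlying reason (disk-full errors surface there), and at that point calling finishWriting is no longer possible, which matches the unplayable files described above.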
ReplayKit sometimes auto-stops screen recording
I used ReplayKit to implement simple screen recording, but recording sometimes stops automatically, and it happens frequently. I don't know why. Has anyone seen this bug?

APIs used:

```swift
func startCapture(handler captureHandler: ((CMSampleBuffer, RPSampleBufferType, Error?) -> Void)?,
                  completionHandler: ((Error?) -> Void)? = nil)

func stopCapture(handler: ((Error?) -> Void)? = nil)
```

To detect the automatic stop, the precondition is that screen recording has already started and the app has not itself called stopCapture:

```swift
let publisher = recorder.publisher(for: \.isRecording)
let sink = Subscribers.Sink<Bool, Never>(
    receiveCompletion: { [weak self] completion in /* ... */ },
    receiveValue: { [weak self] isRecording in /* ... */ })
publisher.subscribe(sink)
```
Replies: 0 · Boosts: 0 · Views: 340 · Activity: Nov ’23
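The detection logic in the post — "recording went from on to off, and we didn't ask for it" — can be isolated into a small piece of testable state. A sketch using a hypothetical RecordingMonitor helper, whose isRecording property stands in for the values delivered by the RPScreenRecorder.isRecording publisher:

```swift
import Foundation

// Distinguishes system-initiated (automatic) recording stops from
// user-initiated ones. Feed isRecording with the observed values;
// call markUserRequestedStop() just before calling stopCapture yourself.
final class RecordingMonitor {
    var onUnexpectedStop: (() -> Void)?
    private var userRequestedStop = false

    // Mirror of RPScreenRecorder.isRecording, fed by the KVO publisher.
    var isRecording = false {
        didSet {
            if oldValue && !isRecording && !userRequestedStop {
                onUnexpectedStop?()            // stopped without the app asking
            }
            if !isRecording { userRequestedStop = false } // reset for next session
        }
    }

    func markUserRequestedStop() { userRequestedStop = true }
}
```

In the receiveValue closure of the sink above, assigning the observed value to monitor.isRecording is all that is needed; the monitor then fires only for stops the app did not request.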
RPScreenRecorder: Recording video
Hello everyone. In the context of digital forensics, I am trying to build a small utility that records the desktop. The utility is started via shell like this: "./nemeapp start path_to_file", and terminated with "./nemeapp stop". The code I wrote is:

```swift
import Foundation
import ReplayKit

func startScreenRecording(filePath: String, includeAudio: Bool) {
    if RPScreenRecorder.shared().isAvailable {
        RPScreenRecorder.shared().startRecording(handler: { error in
            if let unwrappedError = error {
                print("Error starting the recording: \(unwrappedError.localizedDescription)")
            } else {
                print("Screen recording started successfully. The file will be saved to: \(filePath)")
            }
        })
    } else {
        print("Screen recording is not available.")
    }
}

func stopScreenRecording() {
    RPScreenRecorder.shared().stopRecording { previewViewController, error in
        if let unwrappedError = error {
            print("Error stopping the recording: \(unwrappedError.localizedDescription)")
        } else {
            print("Screen recording stopped successfully.")
        }
    }
}

let arguments = CommandLine.arguments
guard arguments.count == 4 else {
    print("Usage: script_name start|stop file_path include_audio(true|false)")
    exit(0)
}

let command = arguments[1]
let filePath = arguments[2]
let includeAudio = arguments[3] == "true"

switch command {
case "start":
    startScreenRecording(filePath: filePath, includeAudio: includeAudio)
case "stop":
    stopScreenRecording()
default:
    print("Unrecognized command. Usage: script_name start|stop file_path include_audio(true|false)")
}
```

Unfortunately, the code returns no error message. Only when I give the stop command does it tell me that the recording never started. I can't even figure out whether it is a permissions issue.
Replies: 0 · Boosts: 0 · Views: 369 · Activity: Oct ’23
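One likely culprit in the utility above: startRecording(handler:) is asynchronous, and a command-line tool reaches the end of main and exits before the handler ever fires, taking the recording down with the process. A later "./nemeapp stop" is a brand-new process with its own RPScreenRecorder, so it can never see the earlier session; the tool has to stay alive for the whole recording. A minimal sketch of the block-until-callback pattern, using a stand-in async call in place of ReplayKit (which cannot run outside an app context):

```swift
import Foundation
import Dispatch

// Blocks the calling thread until an asynchronous completion handler fires,
// then returns whatever error it delivered. In the real tool, `work` would
// wrap RPScreenRecorder.shared().startRecording(handler:), and the process
// would then keep running (e.g. RunLoop.main.run()) until told to stop.
func runAndWait(_ work: (@escaping (Error?) -> Void) -> Void) -> Error? {
    let semaphore = DispatchSemaphore(value: 0)
    var result: Error?
    work { error in
        result = error
        semaphore.signal()
    }
    semaphore.wait()
    return result
}
```

Beyond keeping the process alive, on macOS the binary also needs the Screen Recording permission under Privacy & Security; without it the start silently never takes effect, which matches the behavior described.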
iOS 17 RPSystemBroadcastPickerView not working
My existing code works properly on devices running iOS below 17: it records the iPhone screen and audio simultaneously. On iOS 17 devices, however, the screen recording captures only 2 seconds of video and then stops automatically. Since this runs in an extension, I don't have logs to debug the issue. I have tested the same code on other iPhones running versions below iOS 17 and it works fine; the problem appears only on iOS 17 devices.

```objc
@try {
    NSLog(@"initAssesWriter");
    NSError *error = nil;
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    _videoWriter = [[AVAssetWriter alloc] initWithURL:_filePath
                                             fileType:AVFileTypeMPEG4
                                                error:&error];
    NSParameterAssert(_videoWriter);

    // Configure video
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithDouble:2048 * 1024.0], AVVideoAverageBitRateKey,
        nil];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecTypeH264, AVVideoCodecKey,
        [NSNumber numberWithInt:screenRect.size.width * 4], AVVideoWidthKey,
        [NSNumber numberWithInt:screenRect.size.height * 4], AVVideoHeightKey,
        videoCompressionProps, AVVideoCompressionPropertiesKey,
        nil];
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;
    NSParameterAssert(_writerInput);
    NSParameterAssert([_videoWriter canAddInput:_writerInput]);
    [_videoWriter addInput:_writerInput];

    AudioChannelLayout acl;
    bzero(&acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
    NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
        [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSData dataWithBytes:&acl length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
        [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
        nil];
    _audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                           outputSettings:audioOutputSettings];
    _audioWriterInput.expectsMediaDataInRealTime = YES; // seems to work slightly better
    NSParameterAssert(_audioWriterInput);
    NSParameterAssert([_videoWriter canAddInput:_audioWriterInput]);
    [_videoWriter addInput:_audioWriterInput];
    [_videoWriter setMovieFragmentInterval:CMTimeMake(1, 600)];
    [_videoWriter startWriting];
} @catch (NSException *exception) {
} @finally {
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    @try {
        if (!_isRecordingStarted) {
            [_videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            _isRecordingStarted = YES;
            [self saveFlurryLogs:@"Asset writer start recording" Details:@""];
            NSLog(@"CMSampleBufferGetPresentationTimeStamp");
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Recording start exception" Details:exception.description];
    } @finally {
    }

    @try {
        switch (sampleBufferType) {
            case RPSampleBufferTypeVideo:
                // Handle video sample buffer
                if ([_writerInput isReadyForMoreMediaData]) {
                    [_writerInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata video");
                }
                break;
            case RPSampleBufferTypeAudioApp:
                // Handle audio sample buffer for app audio
                break;
            case RPSampleBufferTypeAudioMic:
                // Handle audio sample buffer for mic audio
                if ([_audioWriterInput isReadyForMoreMediaData]) {
                    [_audioWriterInput appendSampleBuffer:sampleBuffer];
                    NSLog(@"writing metadata audio");
                }
                break;
            default:
                break;
        }
    } @catch (NSException *exception) {
        [self saveFlurryLogs:@"Packet write exception" Details:exception.description];
    } @finally {
    }
}
```
Replies: 1 · Boosts: 1 · Views: 609 · Activity: Oct ’23
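The video settings above encode at four times the screen's point size, which is far larger than the display's actual pixel resolution. Broadcast upload extensions run under a strict memory ceiling (commonly cited as around 50 MB), and oversized frame buffers are a frequent reason iOS terminates a broadcast within seconds, so clamping the output dimensions is worth trying. A sketch of the dimension math only; the 1920-pixel cap is an illustrative assumption, while the 16-pixel rounding matches H.264 macroblock alignment:

```swift
import Foundation

// Computes encoder output dimensions from the screen's point size and
// display scale, capped so the longest side does not exceed maxDimension,
// and rounded down to multiples of 16 (the H.264 macroblock size).
func clampedOutputSize(pointWidth: Double, pointHeight: Double,
                       scale: Double, maxDimension: Double = 1920) -> (width: Int, height: Int) {
    var w = pointWidth * scale
    var h = pointHeight * scale
    let longest = max(w, h)
    if longest > maxDimension {
        // Shrink both sides by the same factor to preserve aspect ratio.
        let factor = maxDimension / longest
        w *= factor
        h *= factor
    }
    let align = { (v: Double) in Int(v / 16) * 16 }
    return (align(w), align(h))
}
```

In the settings dictionary, the result would feed AVVideoWidthKey and AVVideoHeightKey in place of the width * 4 / height * 4 values, with the scale taken from UIScreen's scale rather than a hard-coded multiplier.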
ReplayKit and kAudioUnitSubType_VoiceProcessingIO
We find that ReplayKit, as of iOS 9.3.2 (we haven't tested on iOS 10), is unable to record audio output when the audio unit subtype is kAudioUnitSubType_VoiceProcessingIO and ReplayKit is used via startRecordingWithMicrophoneEnabled(false). When we switch to the kAudioUnitSubType_RemoteIO subtype, ReplayKit works as expected.

This is for a live video app where viewers receive live video and audio. We do not use AVPlayer (AVPlayer is the only incompatibility listed in the docs). Are there any workarounds that would allow use of kAudioUnitSubType_VoiceProcessingIO with ReplayKit?

Thanks,
Robert
Replies: 2 · Boosts: 0 · Views: 1.1k · Activity: Jun ’23