When previewing the gameplay recording, the buttons to exit or save are unclickable behind the top-bar clock and Wi-Fi/5G status bar, which means you have to quit the game in order to continue.
Tested on multiple devices.
Does anyone have a solution to this? At the moment we have disabled it altogether for iOS 26 users.
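(One thing that may be worth trying while it is disabled, offered purely as a speculative sketch and not a confirmed fix: forcing the preview controller into full-screen presentation when it is shown, in case the default modal style is leaving its buttons under the status bar.)

// previewController is the RPPreviewViewController from the stop-recording handler.
previewController.previewControllerDelegate = self;
previewController.modalPresentationStyle = UIModalPresentationFullScreen; // speculative mitigation
[self presentViewController:previewController animated:YES completion:nil];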
ReplayKit
Record or stream video from the screen and audio from the app and microphone using ReplayKit.
Posts under ReplayKit tag: 27 posts
App Store Connect validation (Transporter) is rejecting a build that includes a ReplayKit Broadcast Upload extension. The validator reports that RPBroadcastProcessMode is “not specified”, but the shipped IPA’s Upload appex Info.plist has the key at the documented nested path, and Apple’s own analyser (swinfo) shows the same key/value.
Error (Transporter 409)
“Invalid Info.plist value. The value for the key 'RPBroadcastProcessMode' in bundle BeamRoomHost.app/PlugIns/BeamRoomUpload2.appex is invalid. RPBroadcastProcessMode value must be 'RPBroadcastProcessModeSampleBuffer' or 'RPBroadcastProcessModeMP4Clip'. The key was not specified.”
Example error ID seen: 94ec8b42-ef1b-44e8-9d70-2c76458e1bb3
Environment
• Xcode 26.0.1 (17A400)
• macOS 15.6 (24G84)
• Transporter 1.3.4 (13410)
• App Apple ID: 6752822011
• Host bundle: com.conornolan.BeamRoomHost
• Upload appex bundle: com.conornolan.BeamRoomHost.BeamRoomUpload2
• Version/Build: 0.9.4 (14)
Most recent reproduction: 2025-10-02 ~09:00 GMT+1
Proof the key exists (from the IPA)
Inside the IPA at: Payload/BeamRoomHost.app/PlugIns/BeamRoomUpload2.appex/Info.plist
NSExtension.NSExtensionPointIdentifier = com.apple.broadcast-services-upload
NSExtension.NSExtensionAttributes.RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
Proof from Apple’s analyser (swinfo) for the same IPA
asset-description.plist contains:
… RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer …
Minimal plist shape used (Upload appex Info.plist):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>CFBundleDevelopmentRegion</key>
    <string>en-GB</string>
    <key>CFBundleExecutable</key>
    <string>$(EXECUTABLE_NAME)</string>
    <key>CFBundleIdentifier</key>
    <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
    <key>CFBundleInfoDictionaryVersion</key>
    <string>6.0</string>
    <key>CFBundleName</key>
    <string>$(PRODUCT_NAME)</string>
    <key>CFBundlePackageType</key>
    <string>XPC!</string>
    <key>CFBundleShortVersionString</key>
    <string>$(MARKETING_VERSION)</string>
    <key>CFBundleVersion</key>
    <string>$(CURRENT_PROJECT_VERSION)</string>
    <key>MinimumOSVersion</key>
    <string>26.0</string>
    <key>NSExtension</key>
    <dict>
        <key>NSExtensionPointIdentifier</key>
        <string>com.apple.broadcast-services-upload</string>
        <key>NSExtensionPrincipalClass</key>
        <string>$(PRODUCT_MODULE_NAME).SampleHandler</string>
        <key>NSExtensionAttributes</key>
        <dict>
            <key>RPBroadcastProcessMode</key>
            <string>RPBroadcastProcessModeSampleBuffer</string>
        </dict>
    </dict>
</dict>
</plist>
What we’ve tried
• Fresh Upload appex target from Xcode template + NEW bundle ID.
• Minimal Info.plist (only keys above).
• Also tried a top-level duplicate of RPBroadcastProcessMode in addition to the nested key.
• Tried RPBroadcastProcessModeMP4Clip (disposable build) → still reported “key not specified”.
• Organizer vs Transporter uploads.
• Xcode 26.0 → 26.0.1.
In every case the IPA and swinfo show the key/value, yet ASC reports “not specified”.
Questions
• Is there a current ingest/validator issue where RPBroadcastProcessMode is not read from the nested path in Upload appex Info.plist?
• Are there any additional expectations for ReplayKit Upload appex in ASC validation beyond the documented nested key?
• Any recommended workaround while this is investigated?
Cross-references
• Feedback Assistant ID: FB20412340 (contains full IPA, asset-description.plist, screenshots, and a short Transporter screen recording).
• Developer Support case: 102707552863.
• Recent x-apple-request-uuid example: 30aa2221-3df3-10a2-b161-b59df37f080c.
• SHA-256 for the ZIP of artefacts attached in Feedback: f7397c13e85d4ef0f5722ea75821ad04d51fe5f103ed03dbac646d3902e91227.
Happy to provide a minimal repro project if helpful. Thanks in advance for any guidance or confirmation from the ASC/build-processing side.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
App Store Connect
TestFlight
ReplayKit
Is it possible to start screen recording (through Control Center) without a user prompt?
I mean: ask for the user's permission the first time, and after that start and stop recording programmatically only?
I need to record the screen only for specific events.
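(For reference, in-app programmatic start/stop, as opposed to the Control Center tile, looks roughly like the sketch below. Note that the consent alert belongs to the system, so whether and when iOS re-prompts is outside the app's control; this is a sketch, not a confirmed answer to the prompt question.)

#import <ReplayKit/ReplayKit.h>

// Start capturing when the specific event begins.
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError * _Nullable error) {
    // Handle video/audio sample buffers for the event being recorded.
} completionHandler:^(NSError * _Nullable error) {
    if (error) { NSLog(@"startCapture failed: %@", error); }
}];

// Later, when the event ends:
[[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
    if (error) { NSLog(@"stopCapture failed: %@", error); }
}];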
It's a Broadcast Extension issue: on the iOS 26.1 beta the extension never launches. After you tap "Start Broadcast" in the system picker, the countdown disappears after 3 s and no broadcast starts, so every live-streaming app (and every other non-system app that uses a Broadcast Extension) fails to go live; only the native Photos screen recording still works. Is this a known regression, or is a new entitlement required?
Hi everyone, I'm totally new to this and am just having fun making an app for myself. I'm attempting to get a Broadcast Upload extension working, but whatever I do I can't get ReplayKit to work. I keep getting this error in Xcode:
Provisioning profile "Project v6 Broadcast Upload Development" doesn't include the com.apple.developer.replaykit.broadcast entitlement.
What I've tried:
Created separate App IDs for each target (Explicit App IDs, not Wildcard)
Enabled App Groups capability on all three App IDs in Apple Developer Portal
Selected the correct App Group for all App IDs
Added App Groups capability in Xcode for all targets and all build configurations
Created entitlements file with com.apple.developer.replaykit.broadcast: true for Broadcast Upload extension
Recreated provisioning profiles multiple times
Used manual code signing with correct certificates
I'm completely lost. I reached out directly to Apple Developer Support and they just told me to come here...
Any help would be greatly appreciated.
I want to build an app for iOS using React Native, preferably Expo.
The app will be for recording user experiences with technology: the SLUDGE that users face while navigating through it.
I want to have basic login and signup.
The main feature would be two recording modes.
First: record the screen and the front camera simultaneously.
Second: record the back camera and the front camera simultaneously.
I can then patch the two outputs (the screen recording and the front-camera clip) together later in post-processing.
I want to know if this is possible, as I was told that React Native and Expo do not have support for it yet. If not, is there any library or another approach to make this app come alive?
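(For what it's worth, the first mode does exist in native ReplayKit: in-app screen recording plus a front-camera overlay, which a custom native module would have to bridge to React Native. A minimal sketch of the underlying iOS API; the host view is hypothetical:)

#import <ReplayKit/ReplayKit.h>

RPScreenRecorder *recorder = [RPScreenRecorder sharedRecorder];
recorder.microphoneEnabled = YES;
recorder.cameraEnabled = YES; // enables the front-camera feed alongside screen recording
[recorder startRecordingWithHandler:^(NSError * _Nullable error) {
    if (error == nil) {
        // cameraPreviewView carries the live front-camera feed; overlay it on the UI.
        // [hostView addSubview:recorder.cameraPreviewView]; (hostView is a hypothetical container view)
    }
}];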
Hi everyone,
I’m working on a concept for an iOS app that would allow a user to remotely control an Enterprise iOS device, similar to how AnyDesk or TeamViewer work on desktop.
I understand that apps like TeamViewer for iOS offer screen sharing and some level of control, but not full control.
Before I invest further in development, I’d like to clarify a few points:
Is there any official Apple-supported way (public or private API) to allow remote control of an iOS device?
Has Apple ever approved apps that allow true remote control of iOS (not just screen sharing)?
If full control is not allowed, what are the permitted alternatives (e.g. screen broadcast via ReplayKit, remote assistance mode, etc.)?
Would such an app be considered for enterprise distribution only (via MDM), or is there a potential App Store path?
Any insight or experience from developers who’ve tried this would be very appreciated.
Thanks!
I add an RPSystemBroadcastPickerView to the app, but after tapping it, no method of SampleHandler is triggered.
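(For comparison, here is a minimal picker setup; this is a sketch and the extension bundle ID is a placeholder. One thing worth checking is that preferredExtension exactly matches the upload appex's bundle identifier.)

#import <ReplayKit/ReplayKit.h>

RPSystemBroadcastPickerView *picker =
    [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 60, 60)];
// Without preferredExtension the picker lists every installed upload extension;
// pinning it to your own appex avoids starting the wrong one.
picker.preferredExtension = @"com.example.app.BroadcastUpload"; // placeholder bundle ID
picker.showsMicrophoneButton = YES;
[self.view addSubview:picker];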
Hello all,
I have an application that uses a ReplayKit Broadcast Upload extension to record the entire screen and mic audio to a file on iOS. Up until some months ago, everything was smooth. Currently I am facing the following issue.
If another app uses the microphone, I lose the sound and it never comes back. To debug this, I added a log to processSampleBuffer that prints a line each time it receives an .audioMic buffer. I start the recording and everything works as expected. Later on, I go into an app that uses the microphone, and from then on I get no log for audioMic. I stop the microphone use in that app, but the sound never comes back. As a result, my video file has no sound at all.
In the same context, I also noticed that even with the Photos app's broadcast extension, if you start recording a video and then use the keyboard's speech-to-text (STT) feature, the audio is spliced: while STT is on there is no sound, and the audio from after STT stops is joined directly to the audio from before STT started. So I guess this is something general.
Also, in the same research, I saw that the Google Meet app does not allow any other app to use the microphone while you are in a meeting (even STT is grayed out).
I would like to know my options here. What can I do to get a valid video file with sound? How can I prevent other apps from using the microphone while my app is recording? Is there an entitlement for this? How does Google Meet do it?
P.S. I have added an observer for the session's interruptions; the .began case fires, but .ended never does, so I cannot set the AVAudioSession active again.
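(For context, the observer mentioned in the P.S. is presumably along these lines; a sketch with illustrative handling. In the scenario above, the .ended branch simply never runs:)

#import <AVFoundation/AVFoundation.h>

[[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionInterruptionNotification
                                                  object:[AVAudioSession sharedInstance]
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSUInteger type = [note.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // Fires when another app takes the microphone.
    } else if (type == AVAudioSessionInterruptionTypeEnded) {
        // Never fires in this scenario, so the session cannot be reactivated here.
        [[AVAudioSession sharedInstance] setActive:YES error:nil];
    }
}];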
Hello everyone,
I'm working on implementing a screen sharing feature using RPSystemBroadcastPickerView and a Broadcast Upload Extension to share the entire app screen in an iOS application.
The Broadcast Upload Extension is set up following Apple's ReplayKit guidelines. However, I’m encountering an issue during the broadcast startup sequence:
❗ Problem Description
The Screen Broadcast UI appears as expected
I tap “Start Broadcast”
The countdown (3 → 2 → 1) completes
Then it immediately reverts to the "Start Broadcast" screen, and screen sharing does not begin
No error messages are displayed
None of the extension lifecycle methods (broadcastStarted(withSetupInfo:), processSampleBuffer, etc.) are called
There are no logs or crash reports, neither in the main app nor in the extension
✅ What Has Been Verified
Info.plist of the Broadcast Upload Extension includes:
NSExtensionPointIdentifier = com.apple.broadcast-services-upload
NSExtensionPrincipalClass set correctly
RPBroadcastProcessMode = RPBroadcastProcessModeSampleBuffer
preferredExtension is set properly to the extension’s bundle identifier
Extension is listed in the main app's build settings under "Frameworks, Libraries, and Embedded Content"
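(For reference, the extension's principal class follows the stock template shape below; this is a sketch with illustrative logging, and none of these methods fire in the failing case:)

#import <ReplayKit/ReplayKit.h>

@interface SampleHandler : RPBroadcastSampleHandler
@end

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    NSLog(@"broadcastStarted"); // never logged when the picker silently reverts
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   withType:(RPSampleBufferType)sampleBufferType {
    // Never called when the broadcast fails to start.
}

- (void)broadcastFinished {
    NSLog(@"broadcastFinished");
}

@end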
⚠️ Additional Concern
We noticed that in Xcode (latest version), the Broadcast Upload Extension is listed under "Embedded Frameworks" with the setting "Embed Without Signing", and there is no option to change it to "Embed & Sign". We're wondering if this could be the reason the extension fails to launch correctly at runtime, despite being detected by the broadcast picker.
❓ Questions
Has anyone faced similar issues where the broadcast never starts despite correct setup?
Could the "Embed Without Signing" be causing the system to silently cancel or ignore the extension at runtime?
Are there any provisioning profile or entitlement requirements specific to Broadcast Upload Extensions that might trigger this behavior silently?
Any insights, suggestions, or workarounds would be greatly appreciated.
Thank you in advance!
In my app I implemented screen recording functionality, but there was an unexpected crash. Stack trace:
0   CoreFoundation            _CFRelease.cold.1 + 16
1   CoreFoundation            __CFTypeCollectionRelease
2   ReplayKit                 __56-[RPScreenRecorder captureHandlerWithSample:timingData:]_block_invoke + 148
3   libdispatch.dylib         _dispatch_call_block_and_release + 32
4   libdispatch.dylib         _dispatch_client_callout + 16
5   libdispatch.dylib         _dispatch_lane_serial_drain + 740
6   libdispatch.dylib         _dispatch_lane_invoke + 388
7   libdispatch.dylib         _dispatch_root_queue_drain_deferred_wlh + 292
8   libdispatch.dylib         _dispatch_workloop_worker_thread + 540
9   libsystem_pthread.dylib   _pthread_wqthread + 292
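(The crashing frame is inside the capture handler's block, which suggests an over-release of a CF object delivered there. Offered as an assumption, not a diagnosis: if the handler's CMSampleBufferRef is used asynchronously, it must be retained across the dispatch and released exactly once, e.g.:)

// writerQueue is a hypothetical serial queue used for file writing.
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError * _Nullable error) {
    CFRetain(sampleBuffer); // keep the buffer alive beyond the handler's scope
    dispatch_async(writerQueue, ^{
        // ... append to an AVAssetWriter, etc. ...
        CFRelease(sampleBuffer); // balance the retain exactly once
    });
} completionHandler:nil];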
Hi,
I am developing an iOS app that includes a ReplayKit Broadcast Upload Extension which requires the com.apple.developer.broadcast-upload entitlement.
The app is intended for internal development and testing on my own devices and is not yet distributed on the App Store.
Even after setting com.apple.developer.broadcast-upload = true in my .entitlements file and linking it in Build Settings > Code Signing Entitlements, my downloaded provisioning profile still did not contain the broadcast-upload entitlement.
May I know whether I need explicit approval from Apple to add the broadcast-upload entitlement, even if it's just for testing on my own devices?
Thanks.
Hello all, I saw this interesting VisionOS app: https://apps.apple.com/us/app/splitscreen-multi-display/id6478007837
I was wondering if there was any documentation on the Swift APIs that were used to create this app.
I'm currently using ReplayKit for background screen recording, but I can't determine from the CMSampleBuffer whether the screen is in landscape mode. All the other APIs for detecting screen orientation are foreground-based. What should I do?
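(One avenue worth checking, as a sketch assuming iOS 11 or later: ReplayKit attaches RPVideoSampleOrientationKey to each video sample buffer, and its value is a CGImagePropertyOrientation raw value, which distinguishes landscape from portrait without any foreground API.)

#import <ReplayKit/ReplayKit.h>
#import <CoreMedia/CoreMedia.h>
#import <ImageIO/ImageIO.h>

// Inside the capture or broadcast sample-buffer callback:
CFNumberRef orientationAttachment = (CFNumberRef)CMGetAttachment(
    sampleBuffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL);
if (orientationAttachment != NULL) {
    int32_t value = 0;
    CFNumberGetValue(orientationAttachment, kCFNumberSInt32Type, &value);
    BOOL isLandscape = (value == kCGImagePropertyOrientationLeft ||
                        value == kCGImagePropertyOrientationRight);
    // Use isLandscape to decide how to rotate or letterbox the frame.
}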
Hello,
I am currently developing a game streaming application using ReplayKit and Broadcast Upload Extension. I would like to ask for your assistance regarding capturing snapshots of a WKWebView in the upload extension without adding it to a visible view hierarchy.
From my understanding, calling takeSnapshot(with:) on a WKWebView that is not added to the view hierarchy generally works for simple web pages. However, when it comes to more complex web content — such as animations or WebGL — the snapshot returns a blank or static image. I believe this is because rendering such content requires access to the GPU, which is not fully available when the web view is off-screen.
That said, I’ve observed that certain apps are able to capture live animated web content inside their broadcast upload extensions, even when the main app is terminated. This suggests that the snapshot is not being generated by the main app or from a remote server — especially since the network activity confirms the content is served locally (via localhost or local IP).
Given this, I believe there must be a way to achieve GPU-accelerated rendering for WKWebView directly within the upload extension context, without attaching it to the app's UI. I would greatly appreciate any guidance, APIs, or recommended techniques that could help me achieve this behavior correctly and within system limitations.
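(For context, the off-screen snapshot described above is presumably along these lines; a sketch, which works for simple pages but returns blank or static images for WebGL and animated content as noted:)

#import <WebKit/WebKit.h>

// webView was created programmatically and never added to a window.
WKSnapshotConfiguration *config = [[WKSnapshotConfiguration alloc] init];
config.rect = CGRectMake(0, 0, 1280, 720); // capture region (illustrative)
[webView takeSnapshotWithConfiguration:config
                     completionHandler:^(UIImage * _Nullable image, NSError * _Nullable error) {
    // For complex content (animations, WebGL), image may be blank or stale,
    // since off-screen views may not get full GPU-backed rendering.
}];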
Thank you in advance for your support. I look forward to your advice.
Warm regards,
After logging in to the main app, we turn on screen recording and then switch to another app's interface to perform operations. After roughly ten minutes, on returning to the main app, we find that it has been force-quit by the system, and subsequent operations cannot be carried out.
I have set up the ReplayKit extension successfully, and the bundle ID and everything else are correct, but the system broadcast picker view still does not show my own app for broadcasting screen content when attempting a system-wide broadcast.
I use startCaptureWithHandler to record the screen and AVAssetWriter's appendSampleBuffer: to save audio and video, but when the saved file is played back, the audio and video are out of sync.
I don't know if it's an AVAssetWriterInput setup problem; here is my code:
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
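(Not visible in the settings above, but a common source of drift with this pattern is the writer session's start time. A hedged sketch of the append path, using a hypothetical helper, which anchors the session to the first buffer's presentation timestamp instead of kCMTimeZero:)

// Hypothetical append helper; _assetWriter and the inputs are the ones above.
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   toInput:(AVAssetWriterInput *)input {
    if (_assetWriter.status == AVAssetWriterStatusUnknown) {
        [_assetWriter startWriting];
        // Start the session at the first buffer so audio and video share one timeline.
        [_assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (_assetWriter.status == AVAssetWriterStatusWriting && input.isReadyForMoreMediaData) {
        [input appendSampleBuffer:sampleBuffer];
    }
}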
I want to record the screen, and then call the method stopCaptureWithHandler:(nullable void (^)(NSError *_Nullable error))handler to stop recording and save the file. Before calling it, I check the recording property of [RPScreenRecorder sharedRecorder], and the value is false. That's weird: the screen is currently being recorded!
I wonder if the value of [RPScreenRecorder sharedRecorder].recording affects the method stopCaptureWithHandler:. Here is my code:
- (void)startCaptureScreen {
    [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer,
                                                                 RPSampleBufferType bufferType,
                                                                 NSError * _Nullable error) {
        // code
    } completionHandler:^(NSError * _Nullable error) {
        // code
    }];
}

- (void)stopRecordingHandler {
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
            // code
        }];
    } else {
        // deal with the error: sometimes isRecording is false here even though capture is active
    }
}
I want to record the screen in my app using the method startCaptureWithHandler:completionHandler:. The sampleBuffer is supposed to exist, but it has become nil. This problem is unusual; seen on iOS 18.3.2, iPhone XS Max.