Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post

Replies

Boosts

Views

Activity

Aligning ScreenCaptureKit timestamps (PTS / displayTime) with CACurrentMediaTime()
I’m using ScreenCaptureKit on macOS to grab frames and measure end-to-end latency (capture → my delegate callback). For each CMSampleBuffer I read:

```swift
let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
```

to get the “capture” timestamp, and I also extract the mach-absolute display time:

```swift
let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]]
let displayMach = attachments?.first?[.displayTime] as? UInt64
// convert mach ticks to seconds...
```

Then I compare both against the current time:

```swift
let now = CACurrentMediaTime()
let latencyFromPTS = now - pts
let latencyFromDisplay = now - displayTimeSeconds
```

But I consistently see negative values for both calculations, i.e. the PTS or displayTime often end up numerically larger than now. This suggests that the “presentation timestamp” and the mach-absolute display time are coming from a different epoch or clock domain than CACurrentMediaTime().

Questions:
- Which clocks/epochs does ScreenCaptureKit use for PTS and for .displayTime?
- How can I align these timestamps with CACurrentMediaTime() so that now - pts and now - displayTime reliably yield non-negative real-world latencies?

Any pointers on the correct clock conversions or APIs to use would be greatly appreciated.
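For reference, here is one way to expand the elided “convert mach ticks to seconds” step using mach_timebase_info; the helper name is made up for illustration:

```swift
import Darwin

// Illustrative helper (hypothetical name): convert mach-absolute ticks, such as the
// SCStreamFrameInfo.displayTime attachment value, into seconds.
func machTicksToSeconds(_ ticks: UInt64) -> Double {
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)
    let nanoseconds = Double(ticks) * Double(timebase.numer) / Double(timebase.denom)
    return nanoseconds / 1_000_000_000.0
}
```

CACurrentMediaTime() is documented as mach_absolute_time() converted to seconds, so a displayTime converted this way is at least in the same unit; whether ScreenCaptureKit stamps the PTS on that same host clock is exactly the open question above.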
1
0
113
May ’25
Obtain the screen rotation direction in the background
I use ReplayKit for system-level screen recording. I want to determine whether the screen is in landscape mode from the CMSampleBuffer callback, but the CMSampleBuffer does not carry this information. The other APIs for obtaining the screen orientation are restricted in the background. I want to know whether the screen rotation direction can be obtained in real time while running in the background.
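One avenue worth checking (a hedged sketch, not verified for a background broadcast extension): ReplayKit attaches an orientation value to each video sample buffer under RPVideoSampleOrientationKey, which maps to CGImagePropertyOrientation.

```swift
import ReplayKit
import CoreMedia
import ImageIO

// Sketch: read the orientation attachment from the sample buffer delivered to
// processSampleBuffer(_:with:). Whether this reflects live device rotation while the
// broadcast extension runs in the background is an assumption to verify.
func orientation(of sampleBuffer: CMSampleBuffer) -> CGImagePropertyOrientation? {
    guard let value = CMGetAttachment(sampleBuffer,
                                      key: RPVideoSampleOrientationKey as CFString,
                                      attachmentModeOut: nil) as? NSNumber else {
        return nil
    }
    return CGImagePropertyOrientation(rawValue: value.uint32Value)
}
```

Landscape could then be inferred from the .left / .right cases of the returned orientation.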
1
0
58
Jun ’25
WideCamera consumes more CPU than telePhotoCamera
I have been taking images from the iOS video camera feed and have encountered an issue. When you take images from the wideCamera, this consumes about half the phone's CPU. The same is not the case when you take images from the telephotoCamera video stream. Is there a way of disabling the extra processing that is being done?
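Two processing stages that can be switched off to compare CPU load are geometric distortion correction on the device and video stabilization on the connection; whether either accounts for the wide camera's overhead is an assumption. A sketch, where videoDataOutput stands in for the app's own AVCaptureVideoDataOutput:

```swift
import AVFoundation

// Hedged sketch: turn off optional processing stages and re-measure CPU usage.
func reduceProcessing(device: AVCaptureDevice, videoDataOutput: AVCaptureVideoDataOutput) {
    if let connection = videoDataOutput.connection(with: .video),
       connection.isVideoStabilizationSupported {
        connection.preferredVideoStabilizationMode = .off
    }
    do {
        try device.lockForConfiguration()
        if device.isGeometricDistortionCorrectionSupported {
            device.isGeometricDistortionCorrectionEnabled = false
        }
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```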
1
0
46
Jun ’25
How to request for Video Subscriber SSO entitlement from Apple
Hi All. I'm working on a Single Sign-On feature in my application to let customers sign in to their TV provider. I need to add the Video Subscriber SSO entitlement (com.apple.developer.video-subscriber-single-sign-on) to the app, but I found out that it's a special entitlement and I need to contact Apple to enable it for my Apple account. On https://developer.apple.com/account I navigated to Support -> Contact Us -> Development and Technical -> Entitlements and asked in the email about the missing entitlement (ticket ID 102478794279). The support team couldn't help me; they redirected me to the operations team. I've been waiting for a few months now, but they just tell me to keep waiting. Is there a better way to contact Apple and get the Video Subscriber SSO entitlement in an efficient way?
1
0
76
Jun ’25
AVFoundation — MJPEG Custom-Resolution UVC Stream Not Working on macOS
Hello, I'm Soonwon. We’re currently developing a UVC camera device and trying to stream MJPEG video via AVFoundation on macOS. However, we’re running into a problem with custom resolutions.

When we try to use AVFoundation on macOS to capture MJPEG video at 1000x6000, the stream is not accepted or simply doesn’t work. Lower resolutions work fine. (Interestingly, using the same device on iPadOS, we can capture the 1000x6000 MJPEG stream successfully by using AVCaptureSessionPresetInputPriority.)

Is there any way to receive custom-resolution MJPEG streams (like 1000x6000) from a UVC device using AVFoundation on macOS? Are there specific session presets, entitlements, or known limitations that affect MJPEG handling at custom resolutions on macOS? Does macOS handle MJPEG differently from iPadOS in AVFoundation? Any insight or guidance would be greatly appreciated. Thank you!

```objc
NSError *error = nil;
if ([selectedDevice lockForConfiguration:&error]) {
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    bool foundFormat = false;
    for (AVCaptureDeviceFormat *format in selectedDevice.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription); // available if the subtype needs checking
        if (dims.width == 1000 && dims.height == 6000) {
            selectedDevice.activeFormat = format;
            foundFormat = true; // only mark as found when the 1000x6000 format actually exists
            break;
        }
    }
    if (foundFormat == false) {
        NSLog(@"Failed to find 1000x6000 format");
        [session commitConfiguration];
        [selectedDevice unlockForConfiguration]; // don't leave the device locked on the early return
        return false;
    }

    error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:selectedDevice error:&error];
    if (error || ![session canAddInput:input]) {
        NSLog(@"Failed to add video input: %@", error.localizedDescription);
        [session commitConfiguration];
        [selectedDevice unlockForConfiguration];
        return false;
    }
    [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;
    output.videoSettings = @{
        (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    };
    [output setSampleBufferDelegate:delegate queue:queue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }

    [session commitConfiguration];
    [selectedDevice unlockForConfiguration];
} else {
    NSLog(@"Failed to lock device for configuration: %@", error.localizedDescription);
}
// start~
```
1
0
361
Jul ’25
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can see the HDR effect clearly. But if the HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed to the SDR range. This issue appears on any iPhone; depending on the phone, it may take 20-30 minutes, 30-40 minutes, or even just a few minutes, for example on the iPhone 12 mini.

Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims. Also, currentEDRHeadroom gradually decreases to 1. Note: test this with an HDR video longer than 1 hour; if the video is short, please loop it.

My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render any HDR video.
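For anyone instrumenting this, a minimal sketch (assuming iOS 16 or later) for logging how much EDR headroom the screen currently reports, which should show the collapse toward 1.0 described above; it only observes the behavior, it does not prevent it:

```swift
import UIKit

// Sketch: sample the screen's reported EDR headroom, e.g. from a display-link callback,
// to correlate the dimming with the headroom value.
func logEDRHeadroom(for screen: UIScreen) {
    let current = screen.currentEDRHeadroom      // headroom usable right now
    let potential = screen.potentialEDRHeadroom  // maximum this display can offer
    print("EDR headroom: current \(current), potential \(potential)")
}
```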
1
0
106
Jul ’25
Play videos in WebM format on iOS devices
I would like to play videos in WebM format on my iPhone. I understand that it is basically impossible to play WebM videos natively on an iPhone, but is there any way to display them? I would like to know if there is an official Swift SDK or development kit released by Apple. Or, if there are any third-party products, please let me know.
1
0
266
Jul ’25
AVAssetReaderOutput.Provider Missing symbols
Recurring crash on install of any app with the new sourceVideoTrackProvider.next():

```
dyld[41966]: Symbol not found: _$sSo19AVAssetReaderOutputC12AVFoundationE8ProviderC4nextxSgyYaKFTjTu
  Referenced from: <79AA2BE0-A6B4-32F5-A804-E84BBE5D1AEA> /Users/<username>/Library/Developer/Xcode/DerivedData/TrackProviderCrash-bbbhjptcxnmfdcackxtpucnunxyc/Build/Products/Debug-maccatalyst/TrackProviderCrash.app/Contents/MacOS/TrackProviderCrash.debug.dylib
  Expected in: <1B847AF9-7973-3B28-95C2-09E73F6DD50B> /usr/lib/swift/libswiftAVFoundation.dylib
```

Can be reproduced with the current Xcode Beta 4 by building the sample at https://developer.apple.com/documentation/AVFoundation/converting-projected-video-to-apple-projected-media-profile for Mac Catalyst and macOS. The crash goes away if you comment out lines 154-158 and 164-170 of the sample, which are:

```swift
while let sampleBuffer = try await sourceVideoTrackProvider.next() { /* other code */ }
```

Can also be reproduced if you add the code below to a Mac Catalyst project:

```swift
import AVKit

let asset: AVURLAsset = .init(url: Bundle.main.url(forResource: "SomeVideo.mp4", withExtension: nil)!)
let videoReader = try! AVAssetReader(asset: asset)
let videoTracks = try! await asset.loadTracks(withMediaCharacteristic: .visual)
// Get the side-by-side video track.
let videoTrack = videoTracks.first!
let videoInputTrack = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
let sourceVideoTrackProvider: AVAssetReaderOutput.Provider<CMReadySampleBuffer<CMSampleBuffer.DynamicContent>> =
    videoReader.outputProvider(for: videoInputTrack)

// Comment out this
while let sb = try! await sourceVideoTrackProvider.next() {
}
```
1
0
586
Jul ’25
Bluetooth headphones
Has anyone experienced this bug? When playing a video, the Bluetooth headphones lose audio. My workaround is to select one of the other audio outputs, then go back and select the affected headphones again.
1
0
90
Aug ’25
AVPlayerViewController `customInfoViewControllers` crash/workaround on tvOS 26
One thing I've noticed on tvOS 26 is that if you try to set the AVPlayerViewController customInfoViewControllers property while the Content Tabs are on screen, your app will crash.

```
*** Terminating app due to uncaught exception 'UIViewControllerHierarchyInconsistency', reason: 'trying to add child view controller that is already presented: <AVInfoPanelViewController: 0x1030cdc00>'
*** First throw call stack:
(0x18a7167bc 0x189a77510 0x18a7166a8 0x1ab425658 0x1b2ee9d54 0x1b2efcd60 0x1b2eaf3f0 0x1080f744c 0x107e021a8 0x107e01b3c 0x18de41c14 0x18de41ba8 0x18de48d28 0x18ad9e358 0x101fac5f0 0x101fc6228 0x101fe7278 0x101fbc6fc 0x101fbc63c 0x18a67a2e0 0x18a679418 0x18a673b34 0x1937e4d5c 0x1abb36588 0x1abb3ae80 0x1aae9dec4 0x108610174 0x1086100e4 0x108615140 0x189abd4d0)
```

I've logged a feedback (FB19554461) but it's getting awfully late in the dev cycle, so I've been trying to think of a workaround. The problem is that customInfoViewControllers is pretty declarative in nature; there are no properties or delegate methods I am aware of that let me know when they are displaying or not.

One trick I came up with was seeing if my custom info view controller's view was "visible" or not. I put that in quotes because it turns out it can be visible even when I think it's not: when the transport bar is scrolled to the top, my custom VC still has its top pixels showing, so it gets a viewDidAppear call. So I tried to check whether my view controller's view is completely visible, i.e. based on the results of the CGRect contains method. And that works! But the problem is it only accounts for my own custom info view controllers, and not the standard one that Apple provides. I can't think of a way at all to know whether that is showing. Any ideas?
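For what it's worth, a hedged sketch of the full-visibility check described above; the parameter names are illustrative, and as noted it cannot see Apple's built-in info panel:

```swift
import AVKit

// Sketch of the workaround: treat the custom info panel as "showing" only when its
// view is entirely contained inside the player view controller's view.
func isCustomInfoPanelFullyVisible(playerViewController: AVPlayerViewController,
                                   customInfoVC: UIViewController) -> Bool {
    guard let container = playerViewController.view,
          let infoView = customInfoVC.viewIfLoaded,
          infoView.window != nil else { return false }
    let frameInContainer = infoView.convert(infoView.bounds, to: container)
    return container.bounds.contains(frameInContainer)
}
```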
1
0
94
Aug ’25
High shutter speed with low frame rate and auto exposure
Hi there, I want to set the iPhone camera to "S mode" (Shutter Priority, in camera terminology), which is a semi-automatic exposure mode with the shutter speed fixed or set manually. However, setting the shutter speed manually disables auto exposure. Is there a way to keep auto exposure on while restricting the shutter speed? Also, I would like to keep a low frame rate, e.g. 30 fps. Would I be able to set the shutter speed independently of the frame rate? Here's the code for setting up the camera. Best,
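Since the poster's setup code did not make it into the post, here is a hedged sketch of one possible approach: keep continuous auto exposure but cap the longest shutter duration with activeMaxExposureDuration, and pin the frame rate separately. Whether this is close enough to a true shutter-priority mode is an assumption, and the 1/500 s cap is just an example value.

```swift
import AVFoundation

// Sketch: auto exposure stays on, but its slowest allowed shutter speed is capped,
// and the frame rate is fixed at 30 fps.
func configureShutterLimit(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    device.exposureMode = .continuousAutoExposure
    device.activeMaxExposureDuration = CMTime(value: 1, timescale: 500)   // shutter no slower than 1/500 s
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 30)  // 30 fps
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 30)
}
```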
1
0
487
2w
CoreMediaErrorDomain error -12848
Good day. A video I created via iOS AVAssetWriter with the following settings:

```swift
let videoWriterInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 2_000_000,
            AVVideoMaxKeyFrameIntervalKey: 30
        ],
    ]
)
let audioWriterInput = AVAssetWriterInput(
    mediaType: .audio,
    outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44100,
        AVEncoderBitRateKey: 128000
    ]
)
```

When it is split into fMP4 HLS format using ffmpeg, the video cannot be played on iOS and fails with the following error:

CoreMediaErrorDomain error -12848

However, the video plays normally in Android and browser HLS players, and also in VLC Media Player. Please assist. Thank you.
1
0
339
2w
Background GPU access in iOS 26 for iPhones
We build mobile apps for creators to edit their videos. After editing, the creator has to export the video so that it can be uploaded to YouTube. The export is a time-consuming and GPU-intensive process. The creator can exit the app for various reasons, like receiving a call or putting the app in the background, and this causes the export to fail :(

Keeping this limitation in mind, there was an announcement from Apple that iOS 26 would start to support background GPU access. Here is the official documentation: https://developer.apple.com/documentation/BundleResources/Entitlements/com.apple.developer.background-tasks.continued-processing.gpu

When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this ticket (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer forum, in which possibly an Apple engineer claims it is supported ONLY for iPadOS 26.

This is a very big bummer for us. 96% of our users are on iPhone (compared to iPad), and if we refer to the official documentation above, it claims that this feature should work on iOS 26. This feature is extremely important for having the best user experience and reducing user frustration, and it will be useful for other video editing apps. Looking forward to a resolution.
1
0
154
1d
Video Hardware Acceleration on Mac
I observe significant performance differences when encoding a video in MP4 format (H.264). The code I use is standard (using AVAssetWriter, AVAssetWriterInput...). Here is what I notice when I run the same code on different platforms:

On an iPhone, the video is encoded in 3 seconds (iPhone 13, 14, 15, 16, Pro...).
On a Mac equipped with an M2 Pro, the video is encoded in 50 seconds.
On a Mac equipped with an Intel processor (2.3 GHz Intel Xeon W, 18 cores), the video is encoded in 2 minutes.

The encoding on an iPhone is very fast due to hardware acceleration. However, I don’t understand why I don’t get similar performance with a Mac M2 Pro, which is equipped with a dedicated component for hardware acceleration (H.264 media engine). Is hardware acceleration disabled on a Mac?
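One way to check whether a hardware encoder is actually being engaged on the Mac is to create a VideoToolbox compression session that requires hardware acceleration and inspect the status. This is a diagnostic sketch, not a fix; the 1920x1080 size and H.264 codec are placeholders.

```swift
import VideoToolbox

// Diagnostic sketch (macOS): require the hardware H.264 encoder. A non-zero status
// suggests no hardware encoder was available for this configuration.
var session: VTCompressionSession?
let encoderSpec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]
let status = VTCompressionSessionCreate(
    allocator: nil,
    width: 1920,
    height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: encoderSpec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)
print(status == noErr ? "Hardware encoder available" : "No hardware encoder: \(status)")
```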
0
0
445
Oct ’24
Overlapping Video Frames in RPBroadcastSampleHandler with ReplayKit
I am recording video on iOS using ReplayKit and found that after copying data in the processSampleBuffer:withType: callback using memcpy, the data changes. This occurs particularly frequently when the screen content changes rapidly, making it look like the frames are overlapping. I found that the values starting from byte 672 in the video data on my device often change. Here is the test demo:

```objc
- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo: {
            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

            uint8_t *oYData = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
            size_t oYSize = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0) * CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

            uint8_t *oUVData = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
            size_t oUVSize = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1) * CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

            if (oYSize <= 672) {
                CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly); // unlock before the early return
                return;
            }

            uint8_t tempValue = oYData[672];

            uint8_t *tYData = malloc(oYSize);
            memcpy(tYData, oYData, oYSize);
            if (tYData[672] != oYData[672]) {
                NSLog(@"$$$$$$$$$$$$$$$$------ t:%d o:%d temp:%d", tYData[672], oYData[672], tempValue);
            }
            free(tYData);

            CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
            break;
        }
        default: {
            break;
        }
    }
}
```

Output:

```
$$$$$$$$$$$$$$$$------ t:110 o:124 temp:110
$$$$$$$$$$$$$$$$------ t:111 o:133 temp:111
$$$$$$$$$$$$$$$$------ t:124 o:138 temp:124
$$$$$$$$$$$$$$$$------ t:133 o:144 temp:133
$$$$$$$$$$$$$$$$------ t:138 o:151 temp:138
$$$$$$$$$$$$$$$$------ t:144 o:156 temp:144
$$$$$$$$$$$$$$$$------ t:151 o:135 temp:151
$$$$$$$$$$$$$$$$------ t:156 o:78 temp:156
$$$$$$$$$$$$$$$$------ t:135 o:76 temp:135
$$$$$$$$$$$$$$$$------ t:78 o:77 temp:78
$$$$$$$$$$$$$$$$------ t:76 o:80 temp:76
$$$$$$$$$$$$$$$$------ t:77 o:80 temp:77
$$$$$$$$$$$$$$$$------ t:80 o:79 temp:80
$$$$$$$$$$$$$$$$------ t:79 o:80 temp:79
```
0
0
501
Oct ’24
AirPlay fails when app is also installed on the receiver
We have a universal iOS/tvOS app that also supports iOS App on Mac. In our AVPlayer-based video player we support AirPlay with AVRouteDetector and AVRoutePickerView. We play HLS streams. When we try to AirPlay from an iOS device to an Apple TV or a Mac that has our app installed, it doesn't work. The receiver is marked as active in the route picker UI but the video doesn't show up on the receiver and playback stops. When our app isn't installed on the receiver device, everything works as expected. Has anyone encountered the same issue? Any solutions available for this?
0
0
571
Nov ’24
AVKit Video Player not working
Hey. I am trying to create a present view with a bunch of media (images/videos). Right now I am using a ZStack to render each media item and change opacity based on the index selected using a ScrollView. The issue seems to be that sometimes videos don't load in the main slide. A slide is created because the video exists, and the player shows controls too, but it doesn't play anything.

Present View Z-Stack:

```swift
ZStack {
    ForEach(presentation.slides.indices, id: \.self) { index in
        if let media = mediaCacheManager.mediaCache[index] {
            if let player = media as? AVPlayer {
                PlayerView(player: player)
                    .aspectRatio(16/10, contentMode: .fit)
                    .frame(width: UIScreen.main.bounds.width * 0.8)
                    .background(Color.gray.opacity(0.2))
                    .clipShape(RoundedRectangle(cornerRadius: 40))
                    .overlay(
                        RoundedRectangle(cornerRadius: 40)
                            .stroke(Color.gray.opacity(0.5), lineWidth: 1)
                    )
                    .onDisappear {
                        player.pause()
                    }
                    .opacity(appModel.currentSlide == index ? 1 : 0)
            } else if let image = media as? Image {
                image
                    .resizable()
                    .scaledToFit()
                    .frame(width: UIScreen.main.bounds.width * 0.8)
                    .background(Color.gray.opacity(0.2))
                    .clipShape(RoundedRectangle(cornerRadius: 40))
                    .overlay(
                        RoundedRectangle(cornerRadius: 40)
                            .stroke(Color.gray.opacity(0.5), lineWidth: 1)
                    )
                    .padding(.vertical, 10)
                    .opacity(appModel.currentSlide == index ? 1 : 0)
            }
        }
    }
}
```

The PlayerView:

```swift
public class PlayerUIView: UIView {
    let playerVC = AVPlayerViewController()
    let gravity: AVLayerVideoGravity
    let manageAudio: Bool

    override init(frame: CGRect) {
        self.gravity = .resizeAspectFill
        self.manageAudio = true
        super.init(frame: frame)
    }

    deinit {
        if manageAudio {
            try? AVAudioSession.sharedInstance().setActive(false)
        }
    }

    init(player: AVPlayer?, gravity: AVLayerVideoGravity, manageAudio: Bool = true) {
        self.gravity = gravity
        self.manageAudio = manageAudio
        super.init(frame: .zero)
        guard let player = player else { return }
        self.playerSetup(player: player)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    public override func layoutSubviews() {
        super.layoutSubviews()
        playerVC.view.frame = bounds
        playerVC.view.backgroundColor = .clear
        playerVC.allowsVideoFrameAnalysis = false
    }

    private func playerSetup(player: AVPlayer) {
        playerVC.updatesNowPlayingInfoCenter = true
        playerVC.player = player
        playerVC.showsPlaybackControls = true
        playerVC.view.backgroundColor = .clear
        playerVC.exitsFullScreenWhenPlaybackEnds = true
        playerVC.videoGravity = gravity
        self.addSubview(playerVC.view)
    }
}
```
0
0
503
Nov ’24
Firewire video OUTPUT to hardware 2024
Looking to output dv video to my JVC SR-VS30 video deck. I used to be able to do this, but with most firewire stuff being deprecated, I'm not sure how to go about this. I found this old developer sample code that seems to do exactly what I'd like. Surely this could be rolled or updated for current macOS? https://developer.apple.com/library/archive/samplecode/SimpleVideoOut/Introduction/Intro.html#//apple_ref/doc/uid/DTS10000809-Intro-DontLinkElementID_2
0
0
340
Nov ’24
[AVPlayer][5G] The buffer duration (preferredForwardBufferDuration) configuration property of AVPlayerItem does not work on a 5G network
I tried configuring the preferredForwardBufferDuration on devices using 4G and Wi-Fi, and in these cases, AVPlayer works correctly according to the configured buffer duration. However, when the device is connected to a 5G network, the configuration value no longer works. For example, if I set preferredForwardBufferDuration to 30 seconds, AVPlayer preloads with a buffer of over 100 seconds. I’m not sure how to resolve this, as it’s causing issues with my system.
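For completeness, a minimal sketch of the configuration being described; streamURL is a placeholder, and 30 seconds is the poster's example value:

```swift
import AVFoundation

// Sketch: request roughly 30 seconds of forward buffer for an HLS item.
let item = AVPlayerItem(url: streamURL)            // streamURL is assumed
item.preferredForwardBufferDuration = 30
let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = true
```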
0
0
614
Nov ’24