Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post

Replies

Boosts

Views

Activity

Crash in iOS 18 regarding [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]
There are significant crash reports coming from iOS 18 users regarding the AVKit framework, starting from this line: [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which seems to come from Apple's internal SDK. There are two kinds of crash we found:

UI modification on background thread

From the stack trace, it seems that when AVPictureInPictureController is being deallocated and its view is being removed from its superview, the code is somehow executed on a background thread, because the line _AssertAutoLayoutOnAllowedThreadsOnly is highlighted just before the crash. But I've checked our code that works with AVPictureInPictureController, and in the locations where we deallocate the object it is always called on the main thread, inside viewDidLoad and deinit of a UIViewController subclass. From the log, it seems the crash happens when the user tries to open another content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController could be the hint to this issue; probably something is broken there, since this issue happens across our app versions but only for iOS 18 users. Unfortunately, I have been unable to reproduce this issue yet; one of my colleagues reproduced it once but hasn't been able to do it again since. The reports keep rising each day, up to 1.3k events in the last 30 days now.

Over-released object

This one has fewer reports than the first, but I decided to include it since it might contain relevant information, as the top of the stack trace is similar. The crash timing also seems similar: we deallocate the existing AVPictureInPictureController and later replace it with a new one. It is also found only on iOS 18 users and also refers to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have been unable to reproduce this issue so far either. Both issues happen on both iPhone and iPad.

We'd appreciate any advice on what we can do to avoid this (a defensive teardown sketch follows this entry) and any hint on why it could happen. I have reported this issue with bug number FB15620734. I have also attached one sample crash report for each crash: non ui thread access.crash, over release.crash
9
13
2.5k
Jun ’25
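For the PiP teardown crash described above, one mitigation worth trying while the Feedback is open is to make the teardown path defensive: break the delegate link and release the old controller strictly on the main thread before installing its replacement. A minimal sketch, assuming a hypothetical pipController property on the hosting view controller:

    import AVKit
    import UIKit

    final class PlayerContainerViewController: UIViewController {
        var pipController: AVPictureInPictureController?

        func replacePictureInPicture(with newController: AVPictureInPictureController) {
            // Hop to the main thread explicitly; the crash report shows the old
            // controller's view teardown running off-main during dealloc.
            DispatchQueue.main.async { [weak self] in
                guard let self else { return }
                self.pipController?.delegate = nil // detach observers before release
                self.pipController = newController // old instance released here, on main
            }
        }
    }

This does not fix the framework-side KVO issue, but it removes any chance that the last strong reference is dropped on a background queue.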
Capturing multiple screens no longer works with macOS Sequoia
Capturing more than one display no longer works with macOS Sequoia. We have a product that allows users to capture up to 2 displays/screens. Our application uses gstreamer, which in turn is based on AVFoundation.

I found a quick way to replicate the issue by just running 2 captures from separate terminals. Assuming display 1 has device index 0 and display 2 has device index 1, here are the steps:

Install gstreamer with: brew install gstreamer

Then open 2 terminal windows and launch the following processes:

terminal 1 (device-index:0):
gst-launch-1.0 avfvideosrc -e device-index=0 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink

terminal 2 (device-index:1):
gst-launch-1.0 avfvideosrc -e device-index=1 capture-screen=true ! queue ! videoscale ! video/x-raw,width=640,height=360 ! videoconvert ! osxvideosink

The first process launched will show the screen; the second process launched will not. Testing this on macOS Ventura and Sonoma works as expected, showing both screens. I submitted the same issue on Feedback Assistant: FB15900976
2
0
387
Apr ’25
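To rule gstreamer in or out, it may help to reproduce the two-display capture with AVFoundation directly. A minimal macOS sketch, assuming the display IDs are obtained from CGGetActiveDisplayList (the second ID below is a placeholder). Note that on recent macOS releases Apple steers screen capture toward ScreenCaptureKit, so behavior changes in the legacy AVCaptureScreenInput path on Sequoia would not be surprising:

    import AVFoundation
    import CoreGraphics

    func makeScreenCaptureSession(displayID: CGDirectDisplayID) -> AVCaptureSession? {
        let session = AVCaptureSession()
        guard let input = AVCaptureScreenInput(displayID: displayID) else { return nil }
        let output = AVCaptureVideoDataOutput()
        guard session.canAddInput(input), session.canAddOutput(output) else { return nil }
        session.addInput(input)
        session.addOutput(output)
        session.startRunning() // move to a background queue in production code
        return session
    }

    // Placeholder IDs; enumerate real ones with CGGetActiveDisplayList.
    let displayIDs: [CGDirectDisplayID] = [CGMainDisplayID(), 1]
    let sessions = displayIDs.compactMap(makeScreenCaptureSession) // keep these alive

If the second session also produces no frames here, the regression is below gstreamer.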
Video freezing with FairPlay streaming on iOS 18
Since iOS/iPadOS/tvOS 18 we have run into a new problem with streaming of FairPlay-encrypted video. On the affected streams the audio plays perfectly, but the video freezes for periods of a few seconds: it will freeze for 5 s or so, then be OK for a few seconds, then freeze again. It is entirely reproducible when all the following are true:

the video streams were produced by a particular encoder (or particular settings, not sure on that)
the video must be encrypted
the device is running some variety of iOS 18 (or iPadOS or tvOS)
the device is an affected device

Known affected devices: Apple TV 4K 2nd Gen, iPad Pro 11" 1st and 2nd gen. Devices known not to show the problem: all other Apple TV models, iPhone 13 Pro and 16 Pro.

If we stream the same content unencrypted, it plays perfectly, as it does if you play the encrypted stream on, say, tvOS 17. When the freezing occurs we can see repeating blocks of lines like the following in the console logs:

default 18:08:46.578582+0000 videocodecd AppleAVD: AppleAVDDecodeFrameResponse(): Frame# 5771 DecodeFrame failed with error 0x0000013c
default 18:08:46.578756+0000 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): failed - error: 316
default 18:08:46.579018+0000 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): avdDec - Frame# 5771, DecodeFrame failed with error: 0x13c
default 18:08:46.579169+0000 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 5771 with err -12909 - internalStatus: 315

and also more relevant-looking lines:

default 18:17:39.122019+0000 kernel AppleAVD: avdOutbox0ISR(): FRM DONE (cid: 2.0, fno: 10970, codecT: 1) FAILED!!
default 18:17:39.122155+0000 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 10970 with err -12909 - internalStatus: 315
default 18:17:39.122221+0000 kernel AppleAVD: ## client[ 2.0] @ frm 10970, errStatus: 0x10
default 18:17:39.122338+0000 kernel AppleAVD: decodeFailIdentify(): VP error bit 4 has EP3B0 error
default 18:17:39.122401+0000 kernel AppleAVD: processHWResponse(): clientID 2.0 frameNumber 10970 error 315, offsetIndex 10, isHwErr 1

So it would seem to me that one of the following must be happening:

When these particular HLS files are encrypted, the data is corrupted in some way that played back on iOS 17 and earlier but won't on 18+, or
There's a regression in iOS 18 that means this particular format of video data is corrupted on decryption.

If anyone has seen similar behaviour, or has any ideas how to identify which of the two scenarios it is, please say. Unfortunately we don't have control of the servers, so we can't make changes there unless we can identify that they are definitely the cause of the problem. Thanks, Simon.
2
0
764
Sep ’25
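One way to start distinguishing the two scenarios above is to watch the player item's HLS error log while the freezes occur and correlate its entries with the AppleAVD lines. A small diagnostic sketch using the standard AVFoundation notification:

    import AVFoundation

    func observeErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
        return NotificationCenter.default.addObserver(
            forName: AVPlayerItem.newErrorLogEntryNotification,
            object: item,
            queue: .main
        ) { _ in
            guard let event = item.errorLog()?.events.last else { return }
            // -12909 in the console logs is kVTVideoDecoderBadDataErr territory;
            // logging the URI helps tie failures to specific segments.
            print("HLS error: domain=\(event.errorDomain), code=\(event.errorStatusCode), uri=\(event.uri ?? "-")")
        }
    }

If the failing URIs cluster on segments from the suspect encoder, comparing one such segment encrypted vs. unencrypted would point at encryption rather than a general decoder regression.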
CoreMediaErrorDomain -12035 error when playing a FairPlay-protected HLS stream on iOS 18+ through the Apple Lightning AV Adapter
Our iOS/Apple TV video content playback app uses AVPlayer to play HLS video streams and supports both custom and system playback UIs. The FairPlay content key is retrieved using AVContentKeySession. AirPlay is supported too. When the iPhone is connected to a TV through the Lightning Apple Digital AV Adapter (A1438), the app is mirrored as expected.

Problem: when using an iPhone or iPad on iOS 18.1.1, FairPlay-protected HLS streams are not played, and a CoreMediaErrorDomain -12035 error is received by the AVPlayerItem. Also, once the issue has occurred, the mirroring freezes (the TV indefinitely displays the app playback screen) although the app works fine on the iOS device. The content key retrieval works as expected (I can see that 2 content key requests are made by the system, probably one for the local playback and one for the adapter, as when AirPlaying) and the error is thrown after providing the AVContentKeyResponse. Unfortunately, and as far as I know, there is no documentation on CoreMediaErrorDomain errors, so I don't know what -12035 means.

The issue does not occur:
on an iPhone on iOS 17.7 (even with FairPlay-protected HLS streams)
when playing DRM-free video content (whatever the iOS version)
when using the USB-C AV Adapter (whatever the iOS version)

Also worth noting: the issue does not occur with other video playback apps such as Apple TV or Netflix, although I don't have any details on the kind of streams those apps play or the way the FairPlay content key is retrieved (if any), so I don't know if that is relevant.
5
2
1.5k
Feb ’25
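While -12035 itself is undocumented, capturing the full error chain when the item fails can surface an underlying error with more context (and is useful material for a Feedback report). A small sketch:

    import AVFoundation

    var statusObservation: NSKeyValueObservation?

    func watchForFailure(of item: AVPlayerItem) {
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            guard item.status == .failed, let error = item.error as NSError? else { return }
            print("Playback failed: \(error.domain) \(error.code): \(error.localizedDescription)")
            // CoreMedia errors often wrap a more specific underlying error.
            if let underlying = error.userInfo[NSUnderlyingErrorKey] as? NSError {
                print("Underlying: \(underlying.domain) \(underlying.code)")
            }
        }
    }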
AVSampleBufferDisplayLayerContentLayer memory leaks.
I noticed that AVSampleBufferDisplayLayerContentLayer is not released when the AVSampleBufferDisplayLayer is removed and released. It is possible to reproduce the issue with this simple code:

import AVFoundation
import UIKit

class ViewController: UIViewController {
    var displayBufferLayer: AVSampleBufferDisplayLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let displayBufferLayer = AVSampleBufferDisplayLayer()
        displayBufferLayer.videoGravity = .resizeAspectFill
        displayBufferLayer.frame = view.bounds
        view.layer.insertSublayer(displayBufferLayer, at: 0)
        self.displayBufferLayer = displayBufferLayer

        DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
            self.displayBufferLayer?.flush()
            self.displayBufferLayer?.removeFromSuperlayer()
            self.displayBufferLayer = nil
        }
    }
}

In my real project I have multiple AVSampleBufferDisplayLayers created and removed in different view controllers, which is problematic because the number of leaked AVSampleBufferDisplayLayerContentLayers keeps increasing. I wonder whether I should use a pool of AVSampleBufferDisplayLayers and reuse them (see the sketch after this entry); however, I'm slightly afraid that this could also lead to strange bugs.

Edit: it doesn't leak on an iOS 18 device, but it does leak on an iPad Pro on iOS 17.5.1.
4
1
593
Mar ’25
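Regarding the pooling idea, a hedged sketch of what that could look like: a small pool that hands out existing layers and flushes them before reuse. This caps the number of leaked content layers at the pool size rather than fixing the underlying iOS 17 leak:

    import AVFoundation

    final class DisplayLayerPool {
        private var available: [AVSampleBufferDisplayLayer] = []

        func acquire() -> AVSampleBufferDisplayLayer {
            if let layer = available.popLast() {
                layer.flushAndRemoveImage() // drop any stale frame before reuse
                return layer
            }
            return AVSampleBufferDisplayLayer()
        }

        func release(_ layer: AVSampleBufferDisplayLayer) {
            layer.removeFromSuperlayer()
            available.append(layer)
        }
    }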
AVPlayer unpredictable range requests on iOS when streaming *.mov file
Hi all, I'm trying to diagnose and resolve an issue with stuttering video playback using the standard AVPlayer. The video in question is a 4K, 39-second file in *.mov format, being played on an iOS device. It's served via a local HTTP server that proxies requests to a backend to fetch and process the content. The project uses end-to-end encrypted storage, which necessitates the proxy for handling data processing. While playback in offline scenarios is smooth, we are encountering issues with smooth playback during streaming. The same video streams smoothly on other platforms using the same connection, so network limitations are not a factor. On iOS, playback is consistently choppy, with pauses every 1-3 seconds. The video does not appear to buffer adequately for smooth playback. One particularly curious aspect is the seemingly random pattern of Content-Range requests made by the AVPlayer when streaming the video. Below is an example of the range requests:
3
2
546
Apr ’25
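Two things commonly produce erratic range requests for .mov over HTTP: a moov atom at the end of the file (forcing AVPlayer to seek to the tail before it can begin playback) and a server that doesn't advertise byte-range support. If the local proxy were replaced with an AVAssetResourceLoaderDelegate, byte-range support can be declared up front; a sketch, where serveBytes(...) is a hypothetical backend fetch and the content length is a placeholder:

    import AVFoundation

    final class ProxyResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            if let info = loadingRequest.contentInformationRequest {
                info.contentType = AVFileType.mov.rawValue
                info.isByteRangeAccessSupported = true // without this, request patterns degrade
                info.contentLength = 123_456_789       // hypothetical: real file size from the backend
            }
            if let dataRequest = loadingRequest.dataRequest {
                // serveBytes(offset: dataRequest.requestedOffset, length: dataRequest.requestedLength),
                // then dataRequest.respond(with: data) and loadingRequest.finishLoading()
            }
            return true
        }
    }

Alternatively, running the source file through a fast-start pass (moov atom moved to the front) before storage usually makes AVPlayer's request pattern close to linear.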
H.265 Decoding with VideoToolbox
I am creating an app that decodes H.265 elementary streams on iOS. I use VideoToolbox to decode from H.265 to NV12. The decoded data is enqueued in an AVSampleBufferDisplayLayer as a CMSampleBuffer. However, nothing is displayed in the VideoPlayerView; it remains black. The decoding in VideoToolbox is successful. I confirmed this by saving the NV12 data in the CMSampleBuffer to a file and displaying it using a tool. Why is nothing displayed in the VideoPlayerView? I can provide other source code as well.

// ContentView.swift
// H265Decoder
// Created by Kohshin Tokunaga on 2025/02/15.

import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Text("H.265 Player (temp.h265)")
                .font(.headline)
            VideoPlayerView()
                .frame(width: 360, height: 640) // Adjust or make it responsive for iOS
        }
        .padding()
    }
}

#Preview {
    ContentView()
}

// VideoPlayerView.swift
// H265Decoder
// Created by Kohshin Tokunaga on 2025/02/15.

import SwiftUI
import AVFoundation

struct VideoPlayerView: UIViewRepresentable {
    // Return an H265Player as the coordinator, and start playback there.
    func makeCoordinator() -> H265Player {
        H265Player()
    }

    func makeUIView(context: Context) -> UIView {
        let uiView = UIView(frame: .zero)

        // Base layer for attaching sublayers
        uiView.backgroundColor = .black // Screen background color (for iOS)

        // Create the display layer and add it to uiView.layer
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.bounds
        displayLayer.backgroundColor = UIColor.clear.cgColor
        uiView.layer.addSublayer(displayLayer)

        // Start playback
        context.coordinator.startPlayback()

        return uiView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Reset the frame of the AVSampleBufferDisplayLayer when the view's size changes.
        let displayLayer = context.coordinator.displayLayer
        displayLayer.frame = uiView.layer.bounds

        // Optionally update the layer's background color, etc.
        uiView.backgroundColor = .black
        displayLayer.backgroundColor = UIColor.clear.cgColor

        // Flush transactions if necessary
        CATransaction.flush()
    }
}

// H265Player.swift
// H265Decoder
// Created by Kohshin Tokunaga on 2025/02/15.

import Foundation
import AVFoundation
import CoreMedia

class H265Player: NSObject, VideoDecoderDelegate {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var decoder: H265Decoder?

    override init() {
        super.init()
        // Initial configuration for the display layer
        displayLayer.videoGravity = .resizeAspect
        // Initialize the decoder (delegate = self)
        decoder = H265Decoder(delegate: self)
        // For simple playback, set isBaseline to true
        decoder?.isBaseline = true
    }

    func startPlayback() {
        // Load the file "cars_320x240.h265"
        guard let url = Bundle.main.url(forResource: "temp2", withExtension: "h265") else {
            print("File not found")
            return
        }
        do {
            let data = try Data(contentsOf: url)
            // Set FPS and video size as needed
            let packet = VideoPacket(data: data,
                                     type: .h265,
                                     fps: 30,
                                     videoSize: CGSize(width: 1080, height: 1920))
            // Decode as a single packet
            decoder?.decodeOnePacket(packet)
        } catch {
            print("Failed to load file: \(error)")
        }
    }

    // MARK: - VideoDecoderDelegate
    func decodeOutput(video: CMSampleBuffer) {
        // When decoding is complete, send the output to AVSampleBufferDisplayLayer
        displayLayer.enqueue(video)
    }

    func decodeOutput(error: DecodeError) {
        print("Decoding error: \(error)")
    }
}
2
0
679
May ’25
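A common reason an AVSampleBufferDisplayLayer stays black even though decode succeeds is timing: if the enqueued buffers carry presentation timestamps but the layer's timebase is never started, no frame ever becomes "due". A quick test is to mark each buffer for immediate display before enqueueing it; kCMSampleAttachmentKey_DisplayImmediately is a real Core Media attachment key:

    import CoreMedia

    func markForImmediateDisplay(_ sampleBuffer: CMSampleBuffer) {
        guard let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: true),
              CFArrayGetCount(attachments) > 0 else { return }
        // Set DisplayImmediately = true on the first (and only) sample's attachments.
        let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0), to: CFMutableDictionary.self)
        CFDictionarySetValue(dict,
                             Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                             Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
    }

If frames appear with this set, the proper fix is to drive the layer's controlTimebase rather than leaving buffers timestamped against a stopped clock.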
AVContentKeySession reuse
Context: We develop an iOS/Apple TV app that plays HLS+FP live streams (custom playback UI), some of which use the same FairPlay content key id. All FairPlay content keys are requested from the same content key server.

Implementation: Despite Apple documentation warning not to reuse AVContentKeySessions, we use only one AVContentKeySession for all channels, which allows the system to reuse the content key when a content key id is met again. As seen in another thread, people seem to think this is OK.

Issue: When reusing the AVContentKeySession and the user quickly tunes channels multiple times (up to 2 or 3 times per second using gestures), an inconsistency may occur where the content key request for a previous stream is sent to the delegate after a new stream is already being prepared and its AVURLAsset has already been assigned as the content key session's AVContentKeyRecipient. Note that the previous content key recipient is removed before the new one is added. We have also received crash reports (though I haven't experienced a crash myself) when performing multiple channel tunings, which makes us think that the AVContentKeySession should definitely not be reused.

Note: On the other hand, if a new AVContentKeySession is used for each stream, the system systematically requests a content key even if previous streams have used the same content key id. In this case neither the crash nor the inconsistency issue is observed, but it dramatically increases the number of calls to the content key server.

Questions:
Should AVContentKeySessions definitely not be reused?
Otherwise, how should the inconsistency issue described above be handled?
0
0
462
Feb ’25
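On the second question: with one session per stream, delegate callbacks from a superseded session can simply be ignored, which removes the cross-stream inconsistency; the key-server load then becomes an app-level caching problem. A sketch of the stale-session guard (the CKC fetch is elided):

    import AVFoundation

    final class StreamKeyCoordinator: NSObject, AVContentKeySessionDelegate {
        private var currentSession: AVContentKeySession?

        func prepare(asset: AVURLAsset, delegateQueue: DispatchQueue) {
            let session = AVContentKeySession(keySystem: .fairPlayStreaming)
            session.setDelegate(self, queue: delegateQueue)
            session.addContentKeyRecipient(asset)
            currentSession = session // the previous session and its pending requests are dropped
        }

        func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) {
            guard session === currentSession else { return } // ignore callbacks from a replaced session
            // fetch the CKC and call keyRequest.processContentKeyResponse(...)
        }
    }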
Inconsistent FPS (20 FPS Issue) While Recording Video Using AVCaptureSession.
Hi, I am recording video in my app and setting up the FPS using the code below, but sometimes the video is recorded at 20 FPS. Can someone please let me know what I am doing wrong?

private func eightBitVariantOfFormat() -> AVCaptureDevice.Format? {
    let activeFormat = self.videoDeviceInput.device.activeFormat
    let fpsToBeSupported: Int = 60
    debugPrint("fpsToBeSupported - \(fpsToBeSupported)" as AnyObject)

    let allSupportedFormats = self.videoDeviceInput.device.formats
    debugPrint("all formats - \(allSupportedFormats)" as AnyObject)

    let activeDimensions = CMVideoFormatDescriptionGetDimensions(activeFormat.formatDescription)
    debugPrint("activeDimensions - \(activeDimensions)" as AnyObject)

    let filterBasedOnDimensions = allSupportedFormats.filter({
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).width == activeDimensions.width) &&
        (CMVideoFormatDescriptionGetDimensions($0.formatDescription).height == activeDimensions.height)
    })
    if filterBasedOnDimensions.isEmpty {
        // Dimension not found. Required format not found to handle.
        debugPrint("Dimension not found" as AnyObject)
        return activeFormat
    }
    debugPrint("filterBasedOnDimensions - \(filterBasedOnDimensions)" as AnyObject)

    let filterBasedOnMaxFrameRate = filterBasedOnDimensions.compactMap({ format in
        let videoSupportedFrameRateRanges = format.videoSupportedFrameRateRanges
        if !videoSupportedFrameRateRanges.isEmpty {
            let contains = videoSupportedFrameRateRanges.contains(where: { Int($0.maxFrameRate) >= fpsToBeSupported })
            if contains {
                return format
            } else {
                return nil
            }
        } else {
            return nil
        }
    })
    debugPrint("allFormatsToBeSupported - \(filterBasedOnMaxFrameRate)" as AnyObject)

    guard !filterBasedOnMaxFrameRate.isEmpty else {
        debugPrint("Taking default active format as nothing found when filtered using desired FPS" as AnyObject)
        return activeFormat
    }

    var formatToBeUsed: AVCaptureDevice.Format!
    if let four_two_zero_v = filterBasedOnMaxFrameRate.first(where: {
        CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    }) { // 'vide'/'420v'
        formatToBeUsed = four_two_zero_v
    } else {
        // Take the first one from above array.
        formatToBeUsed = filterBasedOnMaxFrameRate.first
    }

    do {
        try self.videoDeviceInput.device.lockForConfiguration()
        self.videoDeviceInput.device.activeFormat = formatToBeUsed
        self.videoDeviceInput.device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        self.videoDeviceInput.device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        if videoDeviceInput.device.isFocusModeSupported(.continuousAutoFocus) {
            self.videoDeviceInput.device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
        }
        self.videoDeviceInput.device.unlockForConfiguration()
    } catch let error {
        debugPrint("\(error)" as AnyObject)
    }
    return formatToBeUsed
}
1
0
438
Mar ’25
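One thing worth checking after this method runs: later session configuration (setting sessionPreset, adding inputs or outputs) can silently reset activeFormat and the frame durations, which would intermittently yield captures at a different rate. activeFormat is documented as key-value observable, so a small watchdog sketch can catch such resets:

    import AVFoundation

    var formatObservation: NSKeyValueObservation?

    func watchFormatResets(on device: AVCaptureDevice) {
        formatObservation = device.observe(\.activeFormat, options: [.new]) { device, _ in
            let minDur = device.activeVideoMinFrameDuration
            // If this fires after eightBitVariantOfFormat(), something reconfigured the device.
            print("activeFormat changed; min frame duration is now \(minDur.value)/\(minDur.timescale)")
        }
    }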
Issue: FPS Drops to 30 After Trimming Video in Photos App
Hi, I am recording a video at 240 FPS within my application and saving it to the Photos app. The recorded video retains 240 FPS in the Photos app. However, after trimming the video using the Photos app and importing it back into my app, the FPS is reduced to 30 FPS.

Steps to Reproduce:
Record a video inside the application at 240 FPS.
Save the recorded video to the Photos app.
Verify that the video retains 240 FPS in the Photos app.
Trim the video using the built-in Photos app editor.
Import the trimmed video back into the application.
The FPS of the imported video is now reduced to 30 FPS.

Code Used for Importing Video: I am using the following code to fetch the video from the Photos app:

let options: PHVideoRequestOptions = PHVideoRequestOptions()
options.version = .current // Using `.original` preserves FPS, but I need `.current` for other changes
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: self, options: options) { (avAsset, audioMix, info) in
    if let urlAsset = avAsset as? AVURLAsset {
        completionHandler(urlAsset.url, self)
    } else {
        self.askForOriginal(completionHandler: completionHandler)
    }
}

Observations:
The original video retains 240 FPS until it is trimmed in the Photos app.
After trimming, the FPS automatically drops to 30 FPS when imported back into the app.
If I use options.version = .original, the FPS is preserved, but I need .current to apply other modifications.

Questions:
Is this expected behavior of PHImageManager when requesting a video with options.version = .current?
Is there a way to preserve the original FPS while still using .current?
Are there any workarounds to extract the trimmed video without FPS reduction? (A fallback sketch follows this entry.)

Any insights or solutions would be greatly appreciated. Thanks in advance!
1
0
376
Mar ’25
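On the workaround question, one sketch: request .current first, inspect the returned track's nominalFrameRate, and fall back to .original when the frame rate has collapsed. The function name is hypothetical; the Photos and AVFoundation calls are real:

    import Photos
    import AVFoundation

    func requestVideoPreservingFPS(for asset: PHAsset, completion: @escaping (AVAsset?) -> Void) {
        let options = PHVideoRequestOptions()
        options.version = .current
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true

        PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
            let fps = avAsset?.tracks(withMediaType: .video).first?.nominalFrameRate ?? 0
            if fps > 30 { completion(avAsset); return }
            // The trimmed rendition came back at 30 FPS; fall back to the original.
            let fallback = PHVideoRequestOptions()
            fallback.version = .original
            PHImageManager.default().requestAVAsset(forVideo: asset, options: fallback) { original, _, _ in
                completion(original)
            }
        }
    }

The trade-off: the .original fallback loses the trim, so the app would need to re-apply the edit itself (e.g., exporting with a timeRange via AVAssetExportSession).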
Issue with AVAssetDownloadURLSession - Only One Audio Language track Downloaded
We are using AVAssetDownloadURLSession to download content with multiple audio tracks, but we are facing an issue where only one audio language is downloaded, despite explicitly requesting multiple audio languages. However, all subtitle variants are downloaded successfully.

Issue Details:
Observed Behaviour: When initiating a download using AVAssetDownloadURLSession, only one audio track (Hindi, in this case) is downloaded, even though the content contains multiple audio tracks.
Expected Behaviour: All requested audio tracks should be downloaded, similar to how subtitle variants are successfully downloaded. (A sketch using AVAggregateAssetDownloadTask follows this entry.)

Please find sample app implementation details here: https://drive.google.com/file/d/1DLcBGNnuWFYsY0cipzxpIHqZYUDJujmN/view?usp=sharing

The manifest file for the asset looks something like this:

#EXTM3U
#EXT-X-VERSION:6
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A1",NAME="Hindi",LANGUAGE="hi",URI="indexHindi/Hindi.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A2",NAME="Bengali",LANGUAGE="bn",URI="indexBengali/Bengali.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A3",NAME="Kannada",LANGUAGE="kn",URI="indexKannada/Kannada.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A4",NAME="Malayalam",LANGUAGE="ml",URI="indexMalayalam/Malayalam.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A5",NAME="Tamil",LANGUAGE="ta",URI="indexTamil/Tamil.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A6",NAME="Telugu",LANGUAGE="te",URI="indexTelugu/Telugu.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",DEFAULT=YES,AUTOSELECT=YES,FORCED=NO,LANGUAGE="en",URI="subtitle_en/sub_en_vtt.m3u8"
1
6
608
Mar ’25
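For downloading several media selections in one task, AVFoundation provides AVAggregateAssetDownloadTask, which takes an explicit list of AVMediaSelections; a plain AVAssetDownloadTask typically fetches only the default selection. A sketch (title and option handling elided):

    import AVFoundation

    func downloadAllAudioLanguages(session: AVAssetDownloadURLSession, url: URL) {
        let asset = AVURLAsset(url: url)
        guard let audioGroup = asset.mediaSelectionGroup(forMediaCharacteristic: .audible) else { return }

        // One AVMediaSelection per audio option (Hindi, Bengali, Kannada, ...).
        var selections: [AVMediaSelection] = []
        for option in audioGroup.options {
            let mutable = asset.preferredMediaSelection.mutableCopy() as! AVMutableMediaSelection
            mutable.select(option, in: audioGroup)
            selections.append(mutable)
        }

        let task = session.aggregateAssetDownloadTask(with: asset,
                                                      mediaSelections: selections,
                                                      assetTitle: "Sample",
                                                      assetArtworkData: nil,
                                                      options: nil)
        task?.resume()
    }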
Using the AVPlayer to play encrypted streams causes the system to reboot
When we use AVPlayer to play DRM-encrypted streams, playback fails under iOS 17.6.1, and there is a high probability of a system restart. This is the relevant core error log:

error 14:47:53.323369+0800 audiomxd [AirPlayError] carManager_copyProperty_block_invoke:499: got error -12784/0xFFFFCE10 kCMBaseObjectError_PropertyNotFound
error 14:47:53.323414+0800 audiomxd [SPEndpointManagerFactory] SidePlay Endpoint Manager creation failed with -72390/0xFFFEE53A
error 14:47:53.364949+0800 audiomxd [APBrowserCarSessionHelper] [0xF6AA] [Bonjour/WiFi] Unrecognized ConnectivityHelper event 101
error 14:47:53.375313+0800 audiomxd AddInstanceForFactory: No factory registered for id <CFUUID 0xa5c5118c0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
1
0
289
Mar ’25
SDAVAssetExportSession or AVAssetWriter fail to process iPhone 16 video with spatial audio tracks
I have been using SDAVAssetExportSession to compress videos in an app I am building. Everything went very smoothly until I got my new iPhone 16: on that device the Spatial Audio camera setting is turned on by default, and SDAVAssetExportSession starts to fail. I know it has something to do with the audio settings. The current setting is something like this:

exportSession.audioSettings = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100,
    AVEncoderBitRateKey: 128000
]

This is also passed to the underlying AVAssetReader and AVAssetWriter objects. I am not experienced in this area, and I have really had a hard time trying to figure it out. Does anyone know how to set up AVAssetReader or AVAssetWriter to process video with spatial audio tracks? (A channel-layout sketch follows this entry.) Thanks in advance.
1
0
323
Mar ’25
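A hedged sketch of one likely fix: when the source carries more than two channels (spatial audio), AAC writer settings generally need an explicit target channel layout so AVAssetWriter knows how to downmix. The keys below are real AVFoundation/CoreAudio symbols; whether SDAVAssetExportSession forwards them correctly is untested:

    import AVFoundation

    var stereoLayout = AudioChannelLayout()
    stereoLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo

    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44_100,
        AVEncoderBitRateKey: 128_000,
        // Describe the 2-channel destination explicitly.
        AVChannelLayoutKey: Data(bytes: &stereoLayout, count: MemoryLayout<AudioChannelLayout>.size)
    ]

On the reader side, decompressing to PCM first (an AVAssetReaderTrackOutput with kAudioFormatLinearPCM output settings) also sidesteps source formats the writer can't ingest directly.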
iPhone 15 Pro Has USB-C, but AVCaptureDevice Doesn't Support External Devices?
I'm using an iPhone 15 Pro, which has switched from Lightning to USB Type-C. My iOS version is 18.3. According to Apple's documentation, AVCaptureDevice.DeviceType should support external device types.

🔗 Apple's Official Documentation: https://developer.apple.com/documentation/avfoundation/avcapturedevice/devicetype-swift.struct/external

The documentation clearly states that iPadOS 17.0+ and iOS 17.0+ support external devices. However, in my actual tests:
On iPhone, the discoverySession does not detect any external devices.
On iPad, the discoverySession detects external devices without any issues.

My questions:
Does iPhone USB-C actually support external devices (e.g., UVC cameras)?
If not, why does Apple's documentation claim that iOS 17 supports external devices instead of specifying iPadOS 17 only?
1
0
541
Mar ’25
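For reference, a minimal discovery sketch to compare the two devices side by side; .external is the iOS 17+ device type from the documentation linked above:

    import AVFoundation

    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )

    for device in discovery.devices {
        print("External device: \(device.localizedName) (\(device.modelID))")
    }

    if discovery.devices.isEmpty {
        print("No external capture devices found") // observed on iPhone 15 Pro / iOS 18.3
    }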
AVPlayer: Significant Delays and Asset Loss When Playing Partially Downloaded HLS Content Offline
We're experiencing significant issues with AVPlayer when attempting to play partially downloaded HLS content in offline mode. Our app downloads HLS video content for offline viewing, but users encounter the following problems:

Excessive Loading Delay: When offline, AVPlayer attempts to load resources for up to 60 seconds before playing the locally available segments.
Asset Loss: Sometimes AVPlayer completely loses the asset reference and fails to play the video on subsequent attempts.
Inconsistent Behavior: The same partially downloaded asset might play immediately in one session but take 30+ seconds in another.
Network Activity Despite Offline Settings: Despite configuring options to prevent network usage, AVPlayer still appears to be attempting network connections.

These issues severely impact our offline user experience, especially for users with intermittent connectivity.

Technical Details

Implementation Context: Our app downloads HLS videos for offline viewing using AVAssetDownloadTask. We store the downloaded content locally and maintain a dictionary mapping file identifiers to local paths. When attempting to play these videos offline, we experience the described issues.

Current Implementation: Here's our current implementation for playing the videos:

- (void)presentNativeAvplayerForVideo:(Video *)video navContext:(NavContext *)context {
    NSString *localPath = video.localHlsPath;
    if (localPath) {
        NSURL *videoURL = [NSURL URLWithString:localPath];
        NSDictionary *options = @{
            AVURLAssetPreferPreciseDurationAndTimingKey: @YES,
            AVURLAssetAllowsCellularAccessKey: @NO,
            AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
            AVURLAssetAllowsConstrainedNetworkAccessKey: @NO
        };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:options];
        AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
        NSArray *keys = @[@"duration", @"tracks"];
        [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                playerViewController.player = player;
                [player play];
            });
        }];
        playerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
        [context presentViewController:playerViewController animated:YES completion:nil];
    }
}

Attempted Solutions: We've tried several approaches to mitigate these issues.

Modified asset options:

NSDictionary *options = @{
    AVURLAssetPreferPreciseDurationAndTimingKey: @NO, // Changed to NO
    AVURLAssetAllowsCellularAccessKey: @NO,
    AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
    AVURLAssetAllowsConstrainedNetworkAccessKey: @NO,
    AVAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidRemoteReferenceToLocal)
};

Skipped asynchronous key loading:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:nil];

Modified player settings:

player.automaticallyWaitsToMinimizeStalling = NO;
[playerItem setPreferredForwardBufferDuration:2.0];

Added network resource restrictions:

playerItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;

Used file URLs instead of HTTP URLs where possible.

Despite these attempts, the issues persist.

Expected vs. Actual Behavior

Expected Behavior:
AVPlayer should immediately begin playback of locally available HLS segments.
When offline, it should not attempt to load from the network for more than a few seconds.
Once an asset is successfully played, it should be reliably available for future playback.

Actual Behavior:
AVPlayer waits 10-60 seconds before playing locally available segments.
Network activity is observed despite all network-restricting options.
Sometimes the player fails completely to play a previously available asset.
Behavior is inconsistent between playback attempts with the same asset.

Questions:
What is the recommended approach for playing partially downloaded HLS content offline with minimal delay?
Is there a way to force AVPlayer to immediately use available local segments without attempting to load from the network?
Are there any known issues with AVPlayer losing references to locally stored HLS assets?
What diagnostic steps would you recommend to track down the specific cause of these delays?
Does AVFoundation have specific timeouts for offline HLS playback that could be configured?

Any guidance would be greatly appreciated, as this issue is significantly impacting our user experience.

Device Information:
iOS Versions Tested: 14.5 - 18.1
Device Models: iPhone 12, iPhone 13, iPhone 14, iPhone 15
Xcode Version: 15.3 - 16.2.1
1
0
548
Mar ’25
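Two checks that may help narrow this down. First, [NSURL URLWithString:] only yields a usable URL if localHlsPath already contains a scheme; for a bare filesystem path, [NSURL fileURLWithPath:] is required, and a malformed URL alone can produce long stalls. Second, AVFoundation can report whether it considers a downloaded package playable offline. A Swift sketch, assuming localURL is the location saved from the download delegate:

    import AVFoundation

    func canPlayOffline(localURL: URL) -> Bool {
        let asset = AVURLAsset(url: localURL)
        guard let cache = asset.assetCache else {
            return false // no asset cache: the URL likely isn't a downloaded HLS package
        }
        return cache.isPlayableOffline // false for partial downloads missing key renditions
    }

If isPlayableOffline is false for the partially downloaded assets, the 10-60 s delays are plausibly AVPlayer trying to fill the gaps over the (unavailable) network, and gating playback on this flag at least makes the behavior predictable.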
Visual isTranslatable: NO; reason: observation failure: noObservations, when trying to play custom compositor video with AVPlayer
I am trying to achieve an animated gradient effect that changes values over time based on the current seconds. I am using AVPlayer and AVMutableVideoComposition along with a custom instruction and compositor class to generate the effect. I didn't want to load any video file, but rather generate a custom video with my own set of instructions. I used Metal compute shaders to generate the effect and made the video 20 seconds long.

However, when I run the code, I get a frozen player with the gradient applied, and when I try to play the video, I get this warning in the console:

Visual isTranslatable: NO; reason: observation failure: noObservations

My entire code:

import AVFoundation
import Metal

class GradientVideoCompositorTest: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]

    var requiredPixelBufferAttributesForRenderContext: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]

    private var renderContext: AVVideoCompositionRenderContext?
    private var metalDevice: MTLDevice!
    private var metalCommandQueue: MTLCommandQueue!
    private var metalLibrary: MTLLibrary!
    private var metalPipeline: MTLComputePipelineState!

    override init() {
        super.init()
        setupMetal()
    }

    func setupMetal() {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue(),
              let library = try? device.makeDefaultLibrary(),
              let function = library.makeFunction(name: "gradientShader") else {
            fatalError("Metal setup failed")
        }
        self.metalDevice = device
        self.metalCommandQueue = queue
        self.metalLibrary = library
        self.metalPipeline = try? device.makeComputePipelineState(function: function)
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        renderContext = newRenderContext
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let outputPixelBuffer = renderContext?.newPixelBuffer(),
              let metalTexture = createMetalTexture(from: outputPixelBuffer) else {
            request.finish(with: NSError(domain: "com.example.gradient", code: -1, userInfo: nil))
            return
        }
        let time = Float(request.compositionTime.seconds)
        renderGradient(to: metalTexture, time: time)
        request.finish(withComposedVideoFrame: outputPixelBuffer)
    }

    private func createMetalTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        var texture: MTLTexture?
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(
            pixelFormat: .bgra8Unorm,
            width: width,
            height: height,
            mipmapped: false
        )
        textureDescriptor.usage = [.shaderWrite, .shaderRead]

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        if let textureCache = createTextureCache(),
           let cvTexture = createCVMetalTexture(from: pixelBuffer, cache: textureCache) {
            texture = CVMetalTextureGetTexture(cvTexture)
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
        return texture
    }

    private func renderGradient(to texture: MTLTexture, time: Float) {
        guard let commandBuffer = metalCommandQueue.makeCommandBuffer(),
              let commandEncoder = commandBuffer.makeComputeCommandEncoder() else { return }

        commandEncoder.setComputePipelineState(metalPipeline)
        commandEncoder.setTexture(texture, index: 0)
        var mutableTime = time
        commandEncoder.setBytes(&mutableTime, length: MemoryLayout<Float>.size, index: 0)

        let threadsPerGroup = MTLSize(width: 16, height: 16, depth: 1)
        let threadGroups = MTLSize(
            width: (texture.width + 15) / 16,
            height: (texture.height + 15) / 16,
            depth: 1
        )
        commandEncoder.dispatchThreadgroups(threadGroups, threadsPerThreadgroup: threadsPerGroup)
        commandEncoder.endEncoding()
        commandBuffer.commit()
    }

    private func createTextureCache() -> CVMetalTextureCache? {
        var cache: CVMetalTextureCache?
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, metalDevice, nil, &cache)
        return cache
    }

    private func createCVMetalTexture(from pixelBuffer: CVPixelBuffer, cache: CVMetalTextureCache) -> CVMetalTexture? {
        var cvTexture: CVMetalTexture?
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            cache,
            pixelBuffer,
            nil,
            .bgra8Unorm,
            width,
            height,
            0,
            &cvTexture
        )
        return cvTexture
    }
}

class GradientCompositionInstructionTest: NSObject, AVVideoCompositionInstructionProtocol {
    var timeRange: CMTimeRange
    var enablePostProcessing: Bool = true
    var containsTweening: Bool = true
    var requiredSourceTrackIDs: [NSValue]? = nil
    var passthroughTrackID: CMPersistentTrackID = kCMPersistentTrackID_Invalid

    init(timeRange: CMTimeRange) {
        self.timeRange = timeRange
    }
}

func createGradientVideoComposition(duration: CMTime, size: CGSize) -> AVMutableVideoComposition {
    let composition = AVMutableComposition()
    let instruction = GradientCompositionInstructionTest(timeRange: CMTimeRange(start: .zero, duration: duration))
    let videoComposition = AVMutableVideoComposition()
    videoComposition.customVideoCompositorClass = GradientVideoCompositorTest.self
    videoComposition.renderSize = size
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30) // 30 FPS
    videoComposition.instructions = [instruction]
    return videoComposition
}

The Metal shader:

#include <metal_stdlib>
using namespace metal;

kernel void gradientShader(texture2d<float, access::write> output [[texture(0)]],
                           constant float &time [[buffer(0)]],
                           uint2 id [[thread_position_in_grid]]) {
    float2 uv = float2(id) / float2(output.get_width(), output.get_height());

    // Animated colors based on time
    float3 color1 = float3(sin(time) * 0.8 + 0.1, 0.6, 1.0);
    float3 color2 = float3(0.12, 0.99, cos(time) * 0.9 + 0.3);

    // Linear interpolation for gradient
    float3 gradientColor = mix(color1, color2, uv.y);

    output.write(float4(gradientColor, 1.0), id);
}
1
0
374
Apr ’25
AVContentKeySession key renewal on Airplay
Our streaming app uses FairPlay-protected video streams, which previously worked fine when using AVAssetResourceLoaderDelegate to provide CKCs. Recently we migrated to AVContentKeySession, and while everything works as expected during regular playback, we encountered an issue with AirPlay.

Our CKC has a 120-second expiry, so we renew it by calling renewExpiringResponseData. This triggers the didProvideRenewingContentKeyRequest delegate callback, and we respond with an updated CKC. However, when streaming via AirPlay, both video and audio freeze after exactly 120 seconds. To validate the issue, I tested with AVAssetResourceLoaderDelegate and found that I can reproduce the same freeze if I do not renew the key. This suggests that AirPlay is not accepting the renewed CKC when using AVContentKeySession.

Additional details:
This issue occurs across different iOS versions and various AirPlay devices.
The same content plays without issues when played directly on the device.
The renewal process succeeds, and segments continue to load, but playback remains frozen.
I tried renewing the CKC a bit early (at 100 s).
I also tried setting player.usesExternalPlaybackWhileExternalScreenIsActive = true, but the issue persists.
We don't use a persistent key.

Is there anything else that needs to be considered for proper key renewal when AirPlaying? Any help on how to fix this, or confirmation that this is a known issue, would be greatly appreciated. (A condensed renewal sketch follows this entry.)
4
2
657
Mar ’25
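For comparison, a condensed sketch of the renewal flow described above, using the real AVContentKeySession APIs (the timer interval and CKC fetch are placeholders). If AirPlay still freezes with this exact flow, that supports the framework-bug theory:

    import AVFoundation

    final class KeyRenewer: NSObject, AVContentKeySessionDelegate {
        private let session: AVContentKeySession
        private var renewalTimer: Timer?

        init(session: AVContentKeySession) {
            self.session = session
            super.init()
        }

        func scheduleRenewal(for request: AVContentKeyRequest) {
            // Renew ahead of the 120 s CKC expiry noted above.
            renewalTimer = Timer.scheduledTimer(withTimeInterval: 100, repeats: true) { [weak self] _ in
                self?.session.renewExpiringResponseData(for: request)
            }
        }

        func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) {
            // Initial key delivery: fetch the CKC and respond, e.g.
            // keyRequest.processContentKeyResponse(AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc))
        }

        func contentKeySession(_ session: AVContentKeySession,
                               didProvideRenewingContentKeyRequest keyRequest: AVContentKeyRequest) {
            // Renewal: respond with a fresh CKC exactly as for initial delivery.
        }
    }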