Dive into the world of video on Apple platforms, exploring ways to integrate video functionality into your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post · Replies · Boosts · Views · Activity

CoreMediaErrorDomain -12035 error when playing a FairPlay-protected HLS stream on iOS 18+ through the Apple Lightning AV Adapter
Our iOS/Apple TV video content playback app uses AVPlayer to play HLS video streams and supports both custom and system playback UIs. The FairPlay content key is retrieved using AVContentKeySession. AirPlay is supported too. When the iPhone is connected to a TV through the Lightning Apple Digital AV Adapter (A1438), the app is mirrored as expected.

Problem: when using an iPhone or iPad on iOS 18.1.1, FairPlay-protected HLS streams are not played and a CoreMediaErrorDomain -12035 error is received by the AVPlayerItem. Also, once the issue has occurred, the mirroring freezes (the TV indefinitely displays the app playback screen) although the app keeps working fine on the iOS device. The content key retrieval works as expected (I can see that two content key requests are made by the system, probably one for local playback and one for the adapter, as when AirPlaying), and the error is thrown after providing the AVContentKeyResponse. Unfortunately, as far as I know, there is no documentation on CoreMediaErrorDomain errors, so I don't know what -12035 means.

The issue does not occur:
- on an iPhone on iOS 17.7 (even with FairPlay-protected HLS streams)
- when playing DRM-free video content (whatever the iOS version)
- when using the USB-C AV Adapter (whatever the iOS version)

Also worth noting: the issue does not occur with other video playback apps such as Apple TV or Netflix, although I don't have any details on the kind of streams these apps play or the way the FairPlay content key is retrieved (if any), so I don't know if that is relevant.
5 replies · 2 boosts · 1.3k views · Feb ’25
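For reference, a minimal sketch of the AVContentKeySession flow the post describes, assuming a FairPlay key server; `loadAppCertificate()` and `requestCKC(spc:)` are hypothetical helpers, and the SKD content identifier is illustrative:

```swift
import AVFoundation

// Minimal sketch of FairPlay key delivery via AVContentKeySession.
// The key-server helpers below are placeholders, not real endpoints.
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // With external playback (AV adapter / AirPlay) this can be called
        // more than once for the same asset, as the post observes.
        let certificate = loadAppCertificate()
        let contentID = Data("skd://example-asset-id".utf8)   // assumption: SKD URI from the HLS playlist

        keyRequest.makeStreamingContentKeyRequestData(forApp: certificate,
                                                      contentIdentifier: contentID,
                                                      options: nil) { spc, error in
            guard let spc, error == nil else { return }
            let ckc = self.requestCKC(spc: spc)                // hypothetical key-server round trip
            let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc)
            keyRequest.processContentKeyResponse(response)     // -12035 is reported after this point
        }
    }

    private func loadAppCertificate() -> Data { Data() }       // placeholder
    private func requestCKC(spc: Data) -> Data { Data() }      // placeholder
}

// Usage sketch:
// let session = AVContentKeySession(keySystem: .fairPlayStreaming)
// session.setDelegate(keyDelegate, queue: DispatchQueue(label: "cks.queue"))
// session.addContentKeyRecipient(asset)
```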
videoCaptureQueue crashes the app on iOS 18.4.1
Hi all, I have a problem on iOS 18.4.1. I have an iPhone 16 Pro and an iPad Air, both updated to iOS 18.4.1, and I tried to follow the sample code linked below. However, after the app runs for around 30 seconds to 1 minute, it crashes. When I use another iPad on iOS 17, the problem does not occur. https://developer.apple.com/documentation/createml/creating-an-action-classifier-model https://developer.apple.com/documentation/createml/detecting_human_actions_in_a_live_video_feed#overview
6 replies · 0 boosts · 108 views · May ’25
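For context, a minimal capture-pipeline sketch in the spirit of the linked Create ML sample, with frames delivered on a dedicated serial queue; the class and queue names are assumptions rather than the sample's exact code:

```swift
import AVFoundation

// Capture sketch: frames arrive on "videoCaptureQueue" and are handed to the
// action-classifier pipeline from the sample-buffer delegate callback.
final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let videoCaptureQueue = DispatchQueue(label: "videoCaptureQueue")

    func start() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true            // avoid a backlog if processing is slow
        output.setSampleBufferDelegate(self, queue: videoCaptureQueue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()

        videoCaptureQueue.async { self.session.startRunning() }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand the frame to Vision / the Create ML action classifier here.
        // Keep per-frame work bounded; long-running work on this queue can
        // stall the capture pipeline.
    }
}
```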
AV1 Hardware Decoding
Recently I've been trying to play some AV1-encoded streams on my iPhone 15 Pro Max. First, I check for hardware support:

```
VTIsHardwareDecodeSupported(kCMVideoCodecType_AV1); // YES
```

Then I need to create a CMFormatDescription in order to pass it into a VTDecompressionSession. I've tried the following:

```
{
  mediaType: 'vide'
  mediaSubType: 'av01'
  mediaSpecific: {
    codecType: 'av01'
    dimensions: 394 x 852
  }
  extensions: {{
    CVFieldCount = 1;
    CVImageBufferChromaLocationBottomField = Left;
    CVImageBufferChromaLocationTopField = Left;
    CVPixelAspectRatio = {
      HorizontalSpacing = 1;
      VerticalSpacing = 1;
    };
    FullRangeVideo = 0;
  }}
}
```

but VTDecompressionSessionCreate gives me error -8971 (codecExtensionNotFoundErr, I assume). So it has something to do with the extensions dictionary? I can't find anywhere which set of extensions is necessary for it to work 😿. VideoToolbox has convenient functions for creating descriptions of AVC and HEVC streams (CMVideoFormatDescriptionCreateFromH264ParameterSets and CMVideoFormatDescriptionCreateFromHEVCParameterSets), but not for AV1. As of today I am using Xcode 15.0 with the iOS 17.0 SDK.
8 replies · 3 boosts · 8.9k views · Nov ’24
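One approach worth trying (an assumption based on how CoreMedia carries codec configuration boxes, not something confirmed in this thread): supply the stream's 'av1C' configuration data through the sample description extension atoms when building the format description:

```swift
import CoreMedia
import VideoToolbox

// Sketch: error -8971 (codecExtensionNotFoundErr) typically points at a missing
// codec configuration box. For AV1 this is the 'av1C' box, whose bytes have to
// come from your own bitstream/container parsing (assumed to be available here).
func makeAV1FormatDescription(av1C: Data, width: Int32, height: Int32) -> CMVideoFormatDescription? {
    let extensions: [CFString: Any] = [
        kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms: ["av1C": av1C]
    ]
    var formatDesc: CMVideoFormatDescription?
    let status = CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                                codecType: kCMVideoCodecType_AV1,
                                                width: width,
                                                height: height,
                                                extensions: extensions as CFDictionary,
                                                formatDescriptionOut: &formatDesc)
    return status == noErr ? formatDesc : nil
}
```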
AVPlayerViewController not displaying playback controls in iOS 18
Hi everyone, I’ve encountered an issue with the showsPlaybackControls property in AVPlayerViewController after updating to iOS 18. Even though it’s set to true, the native playback controls (play, pause, etc.) are no longer appearing as they used to in previous iOS versions. This behavior was consistent and worked perfectly prior to iOS 18.

Additionally, I’m seeing the same problem when using the VideoPlayer in SwiftUI. The native controls that should appear by default seem to have vanished after the update. Has anyone else experienced this? Is there any workaround or additional configuration required to restore the native controls? Any help or insights would be appreciated. Thanks!

```swift
struct CustomPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func updateUIViewController(_ playerController: AVPlayerViewController, context: Context) {
        playerController.player = player
        playerController.showsPlaybackControls = true
        player.play()
    }

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        return AVPlayerViewController()
    }
}
```
8 replies · 1 boost · 1.6k views · Oct ’24
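A variant worth trying (an assumption, not a confirmed fix for the iOS 18 behavior): configure the controller once in makeUIViewController and keep updateUIViewController idempotent, so SwiftUI updates don't repeatedly reset the controller:

```swift
import SwiftUI
import AVKit

// Sketch: one-time setup in makeUIViewController, minimal work on updates.
struct CustomPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = true
        player.play()
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        // Only reconcile state that actually changed; avoid calling play() here.
        if controller.player !== player {
            controller.player = player
        }
    }
}
```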
Writing video using AVAssetWriter, AVAssetReader, and AVSpeechSynthesizer
Hello,

First, some version and software details:
- Software: iOS 18.1
- Hardware: iPhone 14 Pro Max and later
- Xcode: 16.0

Summary: AVAssetReader is not concatenating a video at the beginning of the output video. The output video should contain a scene of me introducing the content, followed by a blue screen with AVSpeechSynthesizer reading out a text that I pasted above the "Generate Video" button.

Details: Now, let's talk about the app. Basically, I’m developing an app that generates a video with the following features:
- The app creates an output video that is split into an opening scene followed by a fully blue screen.
- The opening scene is taken from a video I choose from my gallery. I read the opening video using AVAssetReader as usual.
- After the opening scene, the audio synthesized with AVSpeechSynthesizer.write() from the entered text starts playing while the blue screen is displayed.

All of this is already defined in the attached project. Each project file has a comment at the beginning introducing its content.

How to test:
- Write something in the field above the "Generate Video" button. For example, type "Hello, world!"
- Press the "Library" button and select a video from the gallery, about 30 seconds long.
- That’s it. Press the "Generate Video" button.

The result I’ve experienced is a crash or failure to generate the video.

Practical example of what I want to achieve: Suppose I record a 30-second video where I say, "I’m going to tell you the story of Snow White." Then, I paste the "Snow White" story into the field above the "Generate Video" button. The output video should contain me saying, "I’m going to tell you the story of Snow White." After that, the AVSpeechSynthesizer reads the story I pasted, while displaying a blue screen.

I look forward to a solution. Thank you very much!

convertToCMSampleBuffer.swift convertToPixelBuffer.swift createInputs.swift createVideo.swift test.swift saveVideo.swift TestApp.swift editingVideo.swift sampleReaderProvider.swift misc.swift sampleProvider.swift
8 replies · 0 boosts · 1.1k views · Nov ’24
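For reference, a minimal sketch of the synthesized-audio step described above: collecting the PCM buffers produced by AVSpeechSynthesizer.write(_:toBufferCallback:), which would then be converted to CMSampleBuffers for an AVAssetWriter audio input (presumably what convertToCMSampleBuffer.swift does); SpeechRenderer is a hypothetical wrapper:

```swift
import AVFoundation

// Sketch: render an utterance offline and collect its PCM buffers.
final class SpeechRenderer {
    // Keep the synthesizer alive for the whole (asynchronous) rendering pass.
    private let synthesizer = AVSpeechSynthesizer()

    func render(text: String, completion: @escaping ([AVAudioPCMBuffer]) -> Void) {
        let utterance = AVSpeechUtterance(string: text)
        var buffers: [AVAudioPCMBuffer] = []

        synthesizer.write(utterance) { buffer in
            guard let pcm = buffer as? AVAudioPCMBuffer, pcm.frameLength > 0 else {
                // A zero-length buffer signals the end of the utterance.
                completion(buffers)
                return
            }
            buffers.append(pcm)
        }
    }
}
```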
Crash in iOS 18 regarding [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]
There are significant crash reports coming from iOS 18 users regarding the AVKit framework, starting from this line: [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which seems to come from the internal iOS SDK. There are two kinds of crash we found:

1. UI modification on a background thread. From the stack trace it seems that when AVPictureInPictureController is being deallocated and its view is being removed from its superview, the code is somehow executed on a background thread, because the line _AssertAutoLayoutOnAllowedThreadsOnly is highlighted before the crash. But I’ve checked our code around AVPictureInPictureController, and in the locations where we deallocate the object it is always called on the main thread, inside viewDidLoad and deinit of a UIViewController subclass. From the log, it seems the crash happens when the user tries to open another content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController could be the hint to this issue; probably something broke there, since this issue happens across our app versions on iOS 18 users only. Unfortunately, I have been unable to reproduce this issue yet; one of my colleagues reproduced it once but hasn’t been able to do it again since. The reports keep rising, up to 1.3k events in the last 30 days now.

2. Over-released object. This one has fewer reports than the first, but I decided to include it since it might carry relevant information, as the starting stack trace is similar. The crash timing also seems similar to the first one: we deallocate an existing AVPictureInPictureController and later replace it with a new one. It is also found only on iOS 18 and also refers to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have also been unable to reproduce this issue so far.

Both issues happen on both iPhone and iPad. We’d appreciate any advice on what we can do to avoid this in the future, and any hint on why it could happen. I have reported this issue with bug number FB15620734. I have also attached one sample crash report for each crash: non ui thread access.crash, over release.crash.
9 replies · 13 boosts · 2.2k views · Jun ’25
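A defensive teardown sketch, offered as an assumption rather than a confirmed workaround: stop and release the old AVPictureInPictureController on the main thread before creating its replacement, so any KVO teardown inside AVPlayerController happens there:

```swift
import AVKit

// Sketch: replace a PiP controller with explicit, main-thread-only teardown.
final class PiPCoordinator {
    private var pipController: AVPictureInPictureController?

    func replacePiP(with playerLayer: AVPlayerLayer) {
        assert(Thread.isMainThread)

        // Tear down the old instance first.
        if let old = pipController {
            if old.isPictureInPictureActive {
                old.stopPictureInPicture()
            }
            old.delegate = nil
            pipController = nil
        }

        // Then create the new one.
        if AVPictureInPictureController.isPictureInPictureSupported() {
            pipController = AVPictureInPictureController(playerLayer: playerLayer)
        }
    }
}
```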
Reference White Calculation for HDR Video Rendering in Metal
Our multimedia application Boinx FotoMagico displays media files of various kinds with a Metal rendering engine. At the moment we still use the .bgra8Unorm pixel format and sRGB color space and only render in SDR, which is increasingly a problem, as much video content is HDR nowadays (e.g. videos shot on an iPhone). For that reason we would like to switch to EDR rendering with the .rgba16Float pixel format and extendedLinearDisplayP3 color space.

We have already worked out how to do this for HDR image files, but still have a technical problem when rendering HDR video files. We are using AVFoundation to get the video frames as CVPixelBuffers and convert them to MTLTextures using a CVMetalTextureCache. The MTLTextures are then further processed in various compute shaders before being rendered to screen. However, the pixel values in the texture are not what we expected. Video frames appear too bright/overexposed.

In the WWDC21 session "Explore HDR rendering with EDR", Ken Greenebaum mentioned: “AVFoundation does not presently decode HDR formats, such as HDR10, to EDR. Consequently, these need to be adapted for use with EDR rendering. This conversion is straightforward and involves two steps. First, converting to linear light by applying the inverse transfer function. And second, dividing by the medium's reference white.” https://developer.apple.com/videos/play/wwdc2021/10161?time=1498

However, the session does not explain how to get or calculate the correct value for "reference white". We could not find any relevant info on the web. This is why we need DTS assistance. We need the code that calculates the correct value for reference white for any kind of video, whether it is SDR or HDR, and regardless of codec and encoding. I assume that Ken Greenebaum is the best Apple engineer to ask in this case, because he recorded most of the EDR-related WWDC sessions in recent years?

We have written a small test app that renders a short sample video (HLG encoding). The window contains two views. The upper view uses an AVPlayerLayer and renders the video natively, just like QuickTime Player. The video content looks correct here. BTW, the window background is SDR white, so that bright EDR pixels can be clearly identified, e.g. the clouds just above the mountains in the upper left corner of the sample video. You may need to lower display brightness a bit if these clouds do not appear brighter than the white window background. The bottom view uses a CAMetalLayer and low-level Metal rendering. The CVPixelBuffers we receive from AVFoundation still need to be scaled down so that SDR reference white reaches pixel value 1.0. Entering a value of 9.0 to 10.0 for reference white in the text field makes it look about right on my Studio Display. But that is just experimental for this sample video file. We need code to calculate the correct value for reference white for any kind of video file!

We have a couple of questions:
- SDR videos should probably use 1.0 as reference white, as their encoded pixel values can already be used as is? Is this assumption correct?
- Will different video encodings of HDR video (HLG, PQ, etc.) lead to different values for reference white?
- Is the value for reference white constant throughout a video, or can it vary over time, either scene by scene or even frame by frame?
- If it can vary, does the CVPixelBuffer of the current video frame contain all the necessary metadata to calculate the correct value?
- Does the NSScreen.maximumExtendedDynamicRangeColorComponentValue also influence the reference white value?

The attached sample project is structured so that the only piece of code that needs to be modified is the ViewController.sdrReferenceWhiteValue() function. Please read the comments and the #warning in this function. This is where the code for calculating the reference white value should be inserted. Here is the download link for the sample project: https://www.dropbox.com/scl/fi/4w5gmftav5xhbixu9u6pb/HDRMetalTest.zip?rlkey=n8cm02soux3rx03vplgo6h1lm&dl=0
9 replies · 2 boosts · 629 views · Feb ’25
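A heavily hedged sketch of what ViewController.sdrReferenceWhiteValue() could look like: it reads the transfer-function attachment from the CVPixelBuffer and branches per encoding. The numeric scale factors are placeholders (the HLG value echoes the post's own experiment, the PQ value assumes a ~100-nit diffuse white) and are exactly the values the question asks to have confirmed:

```swift
import CoreVideo

// Sketch: derive a per-frame reference-white divisor from the pixel buffer's
// transfer-function attachment. The returned numbers are assumptions for
// illustration only, not values confirmed by Apple documentation.
func sdrReferenceWhiteValue(for pixelBuffer: CVPixelBuffer) -> Float {
    guard let transferFunction = CVBufferCopyAttachment(pixelBuffer,
                                                        kCVImageBufferTransferFunctionKey,
                                                        nil) as? String else {
        return 1.0   // No transfer-function attachment: treat as SDR.
    }

    if transferFunction == (kCVImageBufferTransferFunction_ITU_R_2100_HLG as String) {
        // Assumption: HLG reference white sits well below nominal peak;
        // the post found ~9-10 experimentally on a Studio Display.
        return 10.0
    } else if transferFunction == (kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ as String) {
        // Assumption: PQ is absolute (up to 10,000 nits); scale relative
        // to a ~100-nit diffuse white after linearization.
        return 100.0
    } else {
        // SDR transfer functions (e.g. ITU-R 709): encoded values used as-is.
        return 1.0
    }
}
```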