Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post

Replies

Boosts

Views

Activity

Retrieving the DRM expiration time for FairPlay offline assets on iOS
I’m implementing FairPlay offline streaming on iOS and ran into a question about DRM expiration handling. As far as I understand, when issuing a FairPlay offline license, there are typically two time windows:
1. The period during which the user can start offline playback (the longer “rental window”).
2. Once playback starts, the duration allowed to complete playback (the shorter “playback window”).
I’d like to display this information (the remaining validity or expiration time) in the app’s UI next to each downloaded asset. My question is:
👉 Is there a way to programmatically check or retrieve the expiration time for a FairPlay offline asset on the client side (via AVFoundation or AVContentKeySession)?
Any guidance or best practices for surfacing DRM expiration info in the UI would be greatly appreciated.
0
0
147
2d
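A note on the FairPlay expiration question above: as far as I know, neither AVFoundation nor AVContentKeySession exposes the rental or playback windows of a persisted key; the usual pattern is for the key server to return those windows alongside the CKC, and for the app to persist them next to the download. A minimal sketch of that app-side bookkeeping, with hypothetical field names (match whatever your license server actually returns):

import Foundation

// Hypothetical metadata stored next to each downloaded asset; both dates are
// assumed to come back from the key server together with the CKC it issues.
struct OfflineLicenseInfo: Codable {
    let assetID: String
    let rentalExpiry: Date      // last moment offline playback may start
    var playbackExpiry: Date?   // set once the first playback begins

    // Remaining validity to surface in the UI next to the download.
    var remainingValidity: TimeInterval {
        let effective = playbackExpiry.map { min($0, rentalExpiry) } ?? rentalExpiry
        return effective.timeIntervalSinceNow
    }
}

Persist one of these per asset (e.g. as JSON beside the downloaded movie) when the persistable key response arrives, and set playbackExpiry the first time the user hits play.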
Accessing External Timecode from Blackmagic ProDock in Custom App
Hi everyone, I’m exploring using the iPhone 17 Pro with the Blackmagic ProDock in a custom capture app. The genlock functionality seems accessible via AVExternalSyncDevice and related APIs, which is great. I’m specifically curious about external timecode coming in from the ProDock:
• Is there a public way to access the timecode feed in a custom app via AVFoundation or another Apple API?
• If so, what is the recommended approach to read or apply that timecode during capture?
• Are there any current limitations or entitlements required to access timecode from ProDock in a third-party app?
I’m excited to start integrating synchronized capture in my app, and any guidance or sample patterns would be greatly appreciated. Thanks in advance! — [Artem]
0
0
104
3d
Enterprise API passthrough in screen capture not working after visionOS 26 update
We have been using passthrough in screen capture via a broadcast upload extension, which was working in visionOS 2.2, but with visionOS 26 it no longer works. It fails with "Invalid Broadcast session started" a few seconds after the broadcast session starts. Is there a bug filed for this, or is it a known issue?
2
0
330
4d
"No signal" message when connecting LG tv via HDM
Hi everyone, I am currently on MacOS Tahoe (26.1), and for some weird reason my mac is not connecting via HDMI. To be accurate: it is connecting and the LG TV shows up in the Displays settings, but no image shows up in it, I have no idea why. This used to work as I've tried this cable before with the same exact tv. The cable is a basic Amazon Basics HDMI one. Allow me just to advanced this question a little: usually terminal commands are more advanced recommendations, whereas basic questions like "have you connected it right" are just a waste of time
4
0
640
6d
Video shot on an iPhone 17 Pro or 17 Pro Max turns green in the browser when sent over WebRTC
I am developing an app that sends video captured on an iPhone to a browser application and displays it on screen. When this app is used on an iPhone 17 Pro or 17 Pro Max, the video shown in the browser becomes solid green, or a colorful image that is mostly green. Looking into it, the ultra-wide and telephoto cameras on the 17 Pro and 17 Pro Max changed from 12MP to 48MP, so I suspect encoding is failing because of that. Any information at all would be appreciated.
Environment information:
WebRTC library: GoogleWebRTC version 1.1 (installed via CocoaPods)
Signaling server: AWS Kinesis Video Streams
Devices where the problem occurs:
Model: iPhone18,1, OS: 26.0
Model: iPhone18,1, OS: 26.1
Devices where the problem does not occur:
Many models up to and including iPhone17,5
Model: iPhone18,1, OS: 26.0
Model: iPhone18,3, OS: 26.0
1
0
53
1w
SBS and OU ViewPacking
SBS ViewPacking adds half a frame to the opposite eye, meaning if you look all the way right you can see an extra half frame with the left eye, and vice versa. OU doesn't work at all: the preview doesn't show a thumbnail and the video doesn't play. Any hints on how to fix this? I submitted a bug report but haven't heard anything.
0
0
220
1w
WKWebView Crashes on iOS During YouTube Playlist Playback
I’m encountering a consistent crash in WebKit when using WKWebView to play a YouTube playlist in my iOS app. Playback starts successfully, but the web process terminates during the second video in the playlist. This only occurs on physical devices, not in the simulator. Here’s a simplified Swift example of my setup:

import SwiftUI
import WebKit

struct ContentView: View {
    private let playlistID = "PLig2mjpwQBZnghraUKGhCqc9eAy0UbpDN"

    var body: some View {
        YouTubeWebView(playlistID: playlistID)
            .edgesIgnoringSafeArea(.all)
    }
}

struct YouTubeWebView: UIViewRepresentable {
    let playlistID: String

    func makeUIView(context: Context) -> WKWebView {
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true
        let webView = WKWebView(frame: .zero, configuration: config)
        webView.scrollView.isScrollEnabled = true
        let html = """
        <!doctype html>
        <html>
        <head>
            <meta name="viewport" content="initial-scale=1.0, maximum-scale=1.0">
            <style>body,html{height:100%;margin:0;background:#000}iframe{width:100%;height:100%;border:0}</style>
        </head>
        <body>
            <iframe
                src="https://www.youtube-nocookie.com/embed/videoseries?list=\(playlistID)&controls=1&rel=0&playsinline=1&iv_load_policy=3"
                frameborder="0"
                allow="encrypted-media; picture-in-picture; fullscreen"
                webkit-playsinline
                allowfullscreen
            ></iframe>
        </body>
        </html>
        """
        webView.loadHTMLString(html, baseURL: nil)
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {}
}

#Preview {
    ContentView()
}

Observed behavior:
First video plays without issue.
Web process crashes when the second video in the playlist starts.
Console logs show WebProcessProxy::didClose and repeated memory status messages.
Using ProcessAssertion or background activity does not prevent the crash.
Only occurs on physical devices; simulators do not reproduce the issue.

Questions:
Is there something I should change or add in my WKWebView setup or HTML/iframe to prevent the crash when playing the second video in a playlist on physical iOS devices?
Is there an officially supported way to limit memory or prevent WebKit from terminating the web process during multi-video playback?
Are there recommended patterns for playing YouTube playlists in a WKWebView on iOS without risking crashes?
Any tips for debugging or configuring WKWebView to make it more stable for continuous playlist playback?
Thanks in advance for any guidance!
2
0
303
1w
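On the WKWebView crash above: whatever the root cause turns out to be, WKNavigationDelegate is notified when the web content process dies, so the view can at least recover instead of staying blank. A minimal sketch, assuming a coordinator is attached as the web view's navigationDelegate in makeUIView:

import WebKit

final class PlaylistCoordinator: NSObject, WKNavigationDelegate {
    // Called when the system kills the web content process (commonly under
    // memory pressure); reload rather than leaving a dead white view.
    func webViewWebContentProcessDidTerminate(_ webView: WKWebView) {
        webView.reload()
    }
}

In the UIViewRepresentable above this means implementing makeCoordinator() to return a PlaylistCoordinator and setting webView.navigationDelegate = context.coordinator before loading the HTML.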
Help! Green Video stream from iPhone 17 Pro/Pro Max with WebRTC
I'm at my wit's end with a problem I'm facing while developing an app. The app is designed to send video captured on an iPhone to a browser application for real-time display. While it works on many older iPhone models, whenever I test it on an iPhone 17 Pro or 17 Pro Max, the video displayed in the browser becomes a solid green screen, or a colorful, garbled image that's mostly green. I've been digging into this, and my main suspicion is an encoding failure. It seems the resolution of the ultra-wide and telephoto cameras was significantly increased on the 17 Pro and Pro Max (from 12MP to 48MP), and I think this might be overwhelming the encoder. I'm really hoping someone here has encountered a similar issue or has any suggestions. I'm open to any information or ideas you might have. Please help!
Environment Information:
WebRTC Library: GoogleWebRTC Version 1.1 (via CocoaPods)
Signaling Server: AWS Kinesis Video Streams
Problem Occurs on:
Model: iPhone18,1, OS: 26.0
Model: iPhone18,1, OS: 26.1
Works Fine on:
Many models before iPhone17,5
Model: iPhone18,1, OS: 26.0
Model: iPhone18,3, OS: 26.0
0
0
60
2w
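On the green-video reports above: if the 48MP theory is right, a cheap experiment is to pin the capture to a format the encoder is known to handle. A sketch against the GoogleWebRTC pod, assuming an RTCCameraVideoCapturer and that device is the AVCaptureDevice already being captured from:

import AVFoundation
import WebRTC

// Pick the largest capture format no bigger than 1080p so the encoder never
// sees the 48MP-native formats on the iPhone 17 Pro camera modules.
func constrainedFormat(for device: AVCaptureDevice) -> AVCaptureDevice.Format? {
    RTCCameraVideoCapturer.supportedFormats(for: device)
        .filter {
            let d = CMVideoFormatDescriptionGetDimensions($0.formatDescription)
            return d.width <= 1920 && d.height <= 1080
        }
        .max {
            let a = CMVideoFormatDescriptionGetDimensions($0.formatDescription)
            let b = CMVideoFormatDescriptionGetDimensions($1.formatDescription)
            return a.width * a.height < b.width * b.height
        }
}

If the green frames disappear when starting the capturer with the returned format (capturer.startCapture(with:format:fps:)), that would point squarely at the encoder choking on the new high-resolution formats.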
Background GPU access in iOS 26 for iPhones
We build mobile apps for creators to edit their videos. After editing, the creator has to export the video so that it can be uploaded to YouTube. The export is a time-consuming and GPU-intensive process. The creator can exit the app for various reasons, like receiving a call or putting the app in the background, and this causes the export to fail :( With this limitation in mind, Apple announced that iOS 26 would start to support background GPU access. Here is the official documentation: https://developer.apple.com/documentation/BundleResources/Entitlements/com.apple.developer.background-tasks.continued-processing.gpu When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this thread (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer Forums, in which an Apple engineer appears to claim it is supported ONLY on iPadOS 26. This is a very big bummer for us: 96% of our users are on iPhone (compared to iPad), and the official documentation above claims that this feature should work on iOS 26. This feature is extremely important for the best user experience and for reducing user frustration, and it will be useful for other video editing apps too. Looking forward to a resolution.
1
0
172
3w
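For anyone comparing notes on the background-GPU entitlement above, here is a sketch of what scheduling such a task looks like as I understand the iOS 26 BackgroundTasks additions. BGContinuedProcessingTaskRequest and its initializer are from the WWDC25 material, so treat the details as assumptions; the entitlement from the linked documentation still has to be present:

import BackgroundTasks

// Assumption: "com.example.app.export" is listed under
// BGTaskSchedulerPermittedIdentifiers in Info.plist.
let identifier = "com.example.app.export"

_ = BGTaskScheduler.shared.register(forTaskWithIdentifier: identifier, using: nil) { task in
    // Drive the GPU export here and call task.setTaskCompleted(success:)
    // when it finishes or when the system expires the task.
    task.setTaskCompleted(success: true)
}

let request = BGContinuedProcessingTaskRequest(
    identifier: identifier,
    title: "Exporting video",
    subtitle: "Your export is in progress"
)
do {
    try BGTaskScheduler.shared.submit(request)
} catch {
    print("Continued processing unavailable: \(error)")
}

Whether this runs on iPhone or only on iPad under iOS/iPadOS 26 is exactly the open question in the post.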
Broadcast Upload Extension stops data transmission
I am currently using a Broadcast Upload Extension to obtain sample buffer data and send it to the app over IPC, a local Unix domain socket shared through an App Group. However, when the app goes to the background to view videos in the photo album or play other audio and video, the data transmission stops and the app cannot obtain the screen recording data. I would like to ask how to solve this problem. I suspect that the system has suspended the extension's screen recording.
0
0
104
3w
Video freezing with FairPlay streaming on iOS 18
Since iOS/iPadOS/tvOS 18 we have run into a new problem with streaming of FairPlay-encrypted video. On the affected streams the audio plays perfectly but the video freezes for periods of a few seconds: it will freeze for 5s or so, then be OK for a few seconds, then freeze again. It is entirely reproducible when all the following are true:
the video streams were produced by a particular encoder (or particular settings, not sure on that)
the video is encrypted
the device is running some variety of iOS 18 (or iPadOS or tvOS)
the device is an affected device
Known affected devices:
Apple TV 4K 2nd gen
iPad Pro 11" 1st and 2nd gen
Devices known not to show the problem:
all other Apple TV models
iPhone 13 Pro and 16 Pro
If we stream the same content unencrypted, it plays perfectly; likewise if you play the encrypted stream on, say, tvOS 17. When the freezing occurs we can see repeating blocks of lines like the following in the console logs:
default 18:08:46.578582+0000 videocodecd AppleAVD: AppleAVDDecodeFrameResponse(): Frame# 5771 DecodeFrame failed with error 0x0000013c
default 18:08:46.578756+0000 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): failed - error: 316
default 18:08:46.579018+0000 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): avdDec - Frame# 5771, DecodeFrame failed with error: 0x13c
default 18:08:46.579169+0000 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 5771 with err -12909 - internalStatus: 315
Also some more relevant-looking lines:
default 18:17:39.122019+0000 kernel AppleAVD: avdOutbox0ISR(): FRM DONE (cid: 2.0, fno: 10970, codecT: 1) FAILED!!
default 18:17:39.122155+0000 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 10970 with err -12909 - internalStatus: 315
default 18:17:39.122221+0000 kernel AppleAVD: ## client[ 2.0] @ frm 10970, errStatus: 0x10
default 18:17:39.122338+0000 kernel AppleAVD: decodeFailIdentify(): VP error bit 4 has EP3B0 error
default 18:17:39.122401+0000 kernel AppleAVD: processHWResponse(): clientID 2.0 frameNumber 10970 error 315, offsetIndex 10, isHwErr 1
So it would seem that one of the following must be happening:
When these particular HLS files are encrypted, the data is corrupted in some way that played back on iOS 17 and earlier but won't on 18+, or
There is a regression in iOS 18 that means this particular format of video data is corrupted on decryption.
If anyone has seen similar behaviour, or has any ideas how to identify which of the two scenarios it is, please say. Unfortunately we don't have control of the servers, so we can't make changes there unless we can identify that they are definitely the cause of the problem. Thanks, Simon.
2
0
678
4w
On iOS 26, in our video playback app (using AVPlayer), sound and video are out of sync when playing after seeking
Our app plays TS files on an iPhone. The app fragments the TS files, creates an M3U8 playlist, converts them to HLS (HTTP Live Streaming), and then uses AVPlayer to play the video content. On a device running iOS 26, after starting playback and seeking, restarting playback causes the video and audio to be out of sync (by about 2-3 seconds depending on the situation). This also occurs on iPadOS/macOS 26. This issue was not observed prior to iOS 18. We are trying to fix this issue on the app side, but we have the following questions:
The behavior of AVPlayer differs between iOS 26 and previous versions. Has there been any change that could explain this, or is it a bug?
We tried pausing before seeking, but it didn't seem to have any effect. Are there any APIs or workarounds that can improve this?
We would appreciate any other helpful documents or URLs.
0
0
190
4w
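One cheap client-side experiment for the seek desync above: a frame-accurate seek with zero tolerance, resuming playback only after the seek completes, so audio and video share a common starting point. A minimal sketch:

import AVFoundation

// Seek precisely (no tolerance) and resume only once the seek has landed.
func preciseSeek(_ player: AVPlayer, toSeconds seconds: Double) {
    let target = CMTime(seconds: seconds, preferredTimescale: 600)
    player.pause()
    player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
        if finished {
            player.play()
        }
    }
}

This does not explain the iOS 26 behavior change, but it rules tolerance-based seeking in or out as a factor.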
Memory leak when processing stereoscopic video frames in makeMutablePixelBuffer()
Hi, I downloaded and ran https://developer.apple.com/documentation/realitykit/rendering-stereoscopic-video-with-realitykit and noticed that memory usage grows linearly. I replaced the sample video with a different 8K side-by-side video, and the app crashed almost immediately due to a memory leak. It looks like the culprit is the makeMutablePixelBuffer() function: the allocated pixel buffers are not recycled after being used. The screenshot is from a physical device.
0
0
301
Sep ’25
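If makeMutablePixelBuffer() in the sample above really does allocate a fresh buffer per frame, the standard app-side fix is to draw from a CVPixelBufferPool so buffers are recycled once released. A small sketch (the BGRA format and explicit size are placeholders for whatever the video needs):

import CoreVideo

// Create the pool once, sized for the video, instead of allocating a new
// CVPixelBuffer on every frame.
func makePixelBufferPool(width: Int, height: Int) -> CVPixelBufferPool? {
    let attributes: [CFString: Any] = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey: width,
        kCVPixelBufferHeightKey: height,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary,
    ]
    var pool: CVPixelBufferPool?
    CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attributes as CFDictionary, &pool)
    return pool
}

// Per frame: the buffer returns to the pool when its last reference is dropped.
func nextPixelBuffer(from pool: CVPixelBufferPool) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
    return buffer
}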
CoreMediaErrorDomain error -12848
Good day. A video I created via iOS AVAssetWriter with the following settings:

let videoWriterInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 2_000_000,
            AVVideoMaxKeyFrameIntervalKey: 30
        ],
    ]
)

let audioWriterInput = AVAssetWriterInput(
    mediaType: .audio,
    outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44100,
        AVEncoderBitRateKey: 128000
    ]
)

When it is split into fMP4 HLS format using ffmpeg, the video cannot be played on iOS, failing with the following error: CoreMediaErrorDomain error -12848. However, the video plays normally in Android and browser HLS players, and also in VLC Media Player. Please assist. Thank you.
1
0
353
Sep ’25