Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post · Replies · Boosts · Views · Activity

Broadcast Upload Extension stops data transmission
I am currently using a Broadcast Upload Extension to obtain sample buffer data and transmit the screen-recording data to my app through an App Group and IPC (a local Unix domain socket). However, when the app goes to the background, for example to view videos in the Photos library or to play other audio or video, the data transmission stops and the app can no longer receive the screen-recording data. How can I solve this? I suspect the system is suspending the extension's screen recording.
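For context, a broadcast upload extension receives frames in RPBroadcastSampleHandler.processSampleBuffer(_:with:); below is a minimal skeleton of that side, assuming (as the post describes) the transport to the host app is a Unix domain socket in the App Group container, which is omitted here:

```swift
import ReplayKit

// Minimal broadcast upload extension skeleton. The actual transport (a Unix
// domain socket in the App Group container, per the post) is omitted and
// only marked with comments.
class SampleHandler: RPBroadcastSampleHandler {

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        guard sampleBufferType == .video else { return }
        // Forward the frame to the host app here (e.g. write it to the socket).
        // The extension keeps receiving frames; it is the suspended host app
        // that stops draining the socket while it is in the background.
    }

    override func broadcastFinished() {
        // Close the socket / flush pending data.
    }
}
```

The comment above is the usual explanation for this symptom: a regular app process is suspended shortly after entering the background unless it holds an appropriate background mode, so the receiving end of the IPC goes away even though the extension is still running.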
0 replies · 0 boosts · 115 views · Oct ’25
Help! Green Video stream from iPhone 17 Pro/Pro Max with WebRTC
I'm at my wit's end with a problem I'm facing while developing an app. The app is designed to send video captured on an iPhone to a browser application for real-time display. While it works on many older iPhone models, whenever I test it on an iPhone 17 Pro or 17 Pro Max, the video displayed in the browser becomes a solid green screen, or a colorful, garbled image that's mostly green.

I've been digging into this, and my main suspicion is an encoding failure. It seems the resolution of the ultra-wide and telephoto cameras was significantly increased on the 17 Pro and Pro Max (from 12 MP to 48 MP), and I think this might be overwhelming the encoder. I'm really hoping someone here has encountered a similar issue or has any suggestions. I'm open to any information or ideas you might have. Please help!

Environment information:
- WebRTC library: GoogleWebRTC 1.1 (via CocoaPods)
- Signaling server: AWS Kinesis Video Streams

Problem occurs on:
- Model iPhone18,1, OS 26.0
- Model iPhone18,1, OS 26.1

Works fine on:
- Many models before iPhone17,5
- Model iPhone18,1, OS 26.0
- Model iPhone18,3, OS 26.0
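One mitigation worth trying (my assumption, not something stated in the post): pin the capture device to a format whose dimensions the hardware encoder and the WebRTC pipeline are known to handle, so the new 48 MP sensors never feed full-resolution frames into encoding. A sketch:

```swift
import AVFoundation

// Sketch: select a 1920x1080 format on the chosen camera before capture starts,
// so frames handed to the WebRTC encoder are not at the 48 MP sensor resolution.
// `device` is assumed to be the AVCaptureDevice already selected for capture.
func pinFormatTo1080p(on device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: { format in
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        return dims.width == 1920 && dims.height == 1080
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()
}
```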
0 replies · 0 boosts · 86 views · Oct ’25
SBS and OU ViewPacking
SBS ViewPacking adds half a frame to the opposite eye, meaning if you look all the way to the right you can see an extra half frame with the left eye, and vice versa. OU doesn't work at all; the preview doesn't show a thumbnail and the video doesn't play. Any hints on how to fix this? I submitted a bug report but haven't heard anything.
0 replies · 0 boosts · 243 views · Oct ’25
Does HEVC VideoToolbox support temporal layering of streams with B-frames?
Context: "Explore low-latency video encoding with VideoToolbox" (https://developer.apple.com/videos/play/wwdc2021/10158/). That session says the HEVC VideoToolbox encoder supports SVC, i.e. temporal layering of streams, in low-latency mode with all P-frames. My question is: does the HEVC VideoToolbox encoder also support temporal layering of streams with B-frames? Thanks.
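For reference, the low-latency temporal-layering setup that the session describes is driven by a couple of compression-session properties; a sketch is below. Whether a B-frame variant exists is exactly the open question, so the sketch only shows the documented all-P-frame path.

```swift
import VideoToolbox

// Sketch: temporal layering on an HEVC VTCompressionSession created with
// kVTVideoEncoderSpecification_EnableLowLatencyRateControl (as in the WWDC21
// session). Half of the frames land in the base layer.
func enableTemporalLayering(on session: VTCompressionSession) {
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_BaseLayerFrameRateFraction,
                         value: NSNumber(value: 0.5))
    // B-frames require frame reordering, which the low-latency path disables.
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_AllowFrameReordering,
                         value: kCFBooleanFalse)
}
```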
0 replies · 0 boosts · 78 views · Oct ’25
Adding AVCaptureMovieFileOutput and AVCaptureVideoDataOutput with ProRes422
Adding both AVCaptureMovieFileOutput and AVCaptureVideoDataOutput to an AVCaptureSession is supported, as stated in the documentation (snippet copied below), but when the AVCaptureDevice is configured with the ProRes422 codec it fails unless one of the two outputs is removed from the capture session. This is very much reproducible on an iPhone 14 Pro running iOS 26.0.

"Prior to iOS 16, you can add an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput to the same session, but only one may have its connection active. If you attempt to enable both connections, the system chooses the movie file output as the active connection and disables the video data output's connection. For apps that link against iOS 16 or later, this restriction no longer exists."
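A minimal sketch of the reproduction described above (the device and format selection details are assumptions on my part, not copied from the post):

```swift
import AVFoundation

// Sketch: add both outputs to one session, then request Apple ProRes 422 for
// the movie file output. Per the post, on an iPhone 14 Pro running iOS 26.0
// this combination fails unless one of the two outputs is removed.
func configure(session: AVCaptureSession, device: AVCaptureDevice) throws {
    session.beginConfiguration()
    session.addInput(try AVCaptureDeviceInput(device: device))

    let dataOutput = AVCaptureVideoDataOutput()
    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(dataOutput) { session.addOutput(dataOutput) }
    if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

    if let connection = movieOutput.connection(with: .video),
       movieOutput.availableVideoCodecTypes.contains(.proRes422) {
        movieOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.proRes422],
                                      for: connection)
    }
    session.commitConfiguration()
}
```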
0 replies · 0 boosts · 137 views · 2w
General iOS/iPadOS 26 decoding bug: MP4 unexpectedly hangs, video image frozen, audio goes on
Playback of any kind of HD H.264 MP4 file (720p, 50 fps) can randomly cause a stalled image. Playback does not stop in such a case: the image is frozen/stalled while the audio and the timeline keep going. Tapping play/pause or scrubbing in the timeline recovers playback. It can also happen while scrubbing in the timeline, especially to areas not loaded yet (progressive MP4 download). The behaviour is always the same: the image is stalled/frozen, the audio goes on.

To reproduce, use the example project https://developer.apple.com/documentation/AVKit/playing-video-content-in-a-standard-user-interface with the example file https://www.keepinmind.info/test.mp4
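For convenience, the repro amounts to nothing more than standard AVKit playback of that file; a sketch equivalent to the linked sample project:

```swift
import AVKit
import SwiftUI

// Sketch: plain AVKit/SwiftUI playback of the test file above. No custom
// decoding code is involved, which is why this looks like a system issue.
struct ReproPlayerView: View {
    private let player = AVPlayer(url: URL(string: "https://www.keepinmind.info/test.mp4")!)

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```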
0 replies · 0 boosts · 135 views · 2w
How to dynamically update an existing AVComposition when users add a new custom video clip?
I’m building a macOS video editor that uses AVComposition and AVVideoComposition. Initially, my renderer creates a composition with some default video/audio tracks:

    @Published var composition: AVComposition?
    @Published var videoComposition: AVVideoComposition?
    @Published var playerItem: AVPlayerItem?

Then I call a buildComposition() function that inserts all the default video segments. Later in the editing workflow, the user may choose to add their own custom video clip. For this I have a function like:

    private func handlePickedVideo(_ url: URL) {
        guard url.startAccessingSecurityScopedResource() else {
            print("Failed to access security-scoped resource")
            return
        }
        let asset = AVURLAsset(url: url)
        let videoTracks = asset.tracks(withMediaType: .video)
        guard let firstVideoTrack = videoTracks.first else {
            print("No video track found")
            url.stopAccessingSecurityScopedResource()
            return
        }
        renderer.insertUserVideoTrack(from: asset, track: firstVideoTrack)
        url.stopAccessingSecurityScopedResource()
    }

What I want to achieve is the same behavior professional video editors provide: after the composition has already been initialized and built, the user should be able to add a new video track and the composition should update live, meaning the preview player immediately reflects the change without rebuilding everything from scratch. How can I structure my AVComposition / AVMutableComposition and my rendering pipeline so that adding a new clip later updates the existing composition in real time (similar to Final Cut or Adobe Premiere), instead of needing to rebuild everything from zero? You can find a playable version of this entire setup at: https://github.com/zaidbren/SimpleEditor
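For what it's worth, a common pattern here (a sketch with hypothetical names, not the poster's actual renderer API) is to keep working against a single AVMutableComposition, insert the new clip's time range into it, and then hand the player a fresh AVPlayerItem built from that same composition:

```swift
import AVFoundation

// Sketch: append a picked clip to an existing mutable composition and refresh
// the player. `composition` and `player` stand in for the renderer's state;
// the names are illustrative only.
func appendClip(from asset: AVURLAsset,
                track sourceTrack: AVAssetTrack,
                into composition: AVMutableComposition,
                player: AVPlayer) throws {
    let compositionTrack = composition.mutableTrack(compatibleWith: sourceTrack)
        ?? composition.addMutableTrack(withMediaType: .video,
                                       preferredTrackID: kCMPersistentTrackID_Invalid)

    try compositionTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                          of: sourceTrack,
                                          at: composition.duration)

    // AVPlayerItem does not track later edits to a mutable composition, so
    // build a new item after each edit rather than mutating the one playing.
    player.replaceCurrentItem(with: AVPlayerItem(asset: composition))
}
```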
0 replies · 0 boosts · 273 views · 2w
Changing Frame Rate of External Display on iPad
Hello,

As far as I know, and in all of my testing, there is no way for a user or a developer to change the frame rate of the video output on iPadOS. If you connect an iPad via a USB hub or a USB to HDMI adaptor and then connect it to an external monitor, it will output at 59.94 fps.

I have a video app where a user monitors live video at 25 fps and 30 fps. They often output to an external display, and there are times when the external display will stutter due to the mismatch in frame rate, i.e. using 25 fps and outputting at 59.94 fps.

I thought it was impossible to change the video output frame rate, but then in v3.1 of the Blackmagic Camera app I saw an interesting change in their release notes: 'Support for HDMI Monitoring at Sensor Rate and Resolution'. This means there is some way to modify it; I'm not sure if this is done via a private API that Apple has allowed Blackmagic to use. If so, how can we access this, or is there a way to enable this that is undocumented?

Thanks!
0 replies · 0 boosts · 130 views · 1d
iPhone Video Upload Error - reason unknown
Hi everyone, I'm developing a customization tool in which our customers can upload an mp3 or mp4 file that will be scannable through our AR application. On desktop and Android this works perfectly. For some reason, however, on iPhone we're unable to load most of the video files. I've checked the clips, and they are .mov/H.264 files, which are supported by iPhone. We're currently not sure how we can fix this issue to allow customers who own an iPhone to upload clips to our website. Any tips in the right direction are more than welcome. Thanks in advance!
1 reply · 0 boosts · 520 views · Dec ’24
How to implement Picture-in-Picture (PiP) in Flutter for iOS using LiveKit without a video URL?
I am building a video conferencing app using LiveKit in Flutter and want to implement Picture-in-Picture (PiP) mode on iOS. My goal is to display a view showing the speaker's initials or avatar during PiP mode. I successfully implemented this functionality on Android but am struggling to achieve it on iOS. I am using a MethodChannel to communicate with the native iOS code. Here's the Flutter-side code:

    import 'package:flutter/foundation.dart';
    import 'package:flutter/services.dart';

    class PipController {
      static const _channel = MethodChannel('pip_channel');

      static Future<void> startPiP() async {
        try {
          await _channel.invokeMethod('enterPiP');
        } catch (e) {
          if (kDebugMode) {
            print("Error starting PiP: $e");
          }
        }
      }

      static Future<void> stopPiP() async {
        try {
          await _channel.invokeMethod('exitPiP');
        } catch (e) {
          if (kDebugMode) {
            print("Error stopping PiP: $e");
          }
        }
      }
    }

On the iOS side, I am using AVPictureInPictureController. Since it requires an AVPlayerLayer, I had to include a dummy video URL to initialize the AVPlayer. However, this results in the dummy video's audio playing in the background, but no view is displayed in PiP mode. Here's my iOS code:

    import Flutter
    import UIKit
    import AVKit

    @main
    @objc class AppDelegate: FlutterAppDelegate {
        var pipController: AVPictureInPictureController?
        var playerLayer: AVPlayerLayer?

        override func application(
            _ application: UIApplication,
            didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
        ) -> Bool {
            let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
            let pipChannel = FlutterMethodChannel(name: "pip_channel", binaryMessenger: controller.binaryMessenger)
            pipChannel.setMethodCallHandler { [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) in
                if call.method == "enterPiP" {
                    self?.startPictureInPicture(result: result)
                } else if call.method == "exitPiP" {
                    self?.stopPictureInPicture(result: result)
                } else {
                    result(FlutterMethodNotImplemented)
                }
            }
            GeneratedPluginRegistrant.register(with: self)
            return super.application(application, didFinishLaunchingWithOptions: launchOptions)
        }

        private func startPictureInPicture(result: @escaping FlutterResult) {
            guard AVPictureInPictureController.isPictureInPictureSupported() else {
                result(FlutterError(code: "UNSUPPORTED", message: "PiP is not supported on this device.", details: nil))
                return
            }

            // Set up the AVPlayer
            let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
            let playerLayer = AVPlayerLayer(player: player)
            self.playerLayer = playerLayer

            // Create a dummy view
            let dummyView = UIView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
            dummyView.isHidden = true
            window?.rootViewController?.view.addSubview(dummyView)
            dummyView.layer.addSublayer(playerLayer)
            playerLayer.frame = dummyView.bounds

            // Initialize PiP Controller
            pipController = AVPictureInPictureController(playerLayer: playerLayer)
            pipController?.delegate = self

            // Start playback and PiP
            player.play()
            pipController?.startPictureInPicture()
            print("Picture-in-Picture started")
            result(nil)
        }

        private func stopPictureInPicture(result: @escaping FlutterResult) {
            guard let pipController = pipController, pipController.isPictureInPictureActive else {
                result(FlutterError(code: "NOT_ACTIVE", message: "PiP is not currently active.", details: nil))
                return
            }
            pipController.stopPictureInPicture()
            playerLayer = nil
            self.pipController = nil
            result(nil)
        }
    }

    extension AppDelegate: AVPictureInPictureControllerDelegate {
        func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
            print("PiP started")
        }

        func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
            print("PiP stopped")
        }
    }

Questions:
- How can I implement PiP mode on iOS without using a video URL (or AVPlayerLayer)?
- Is there a way to display a custom UIView (like a speaker's initials or an avatar) in PiP mode instead of requiring a video?
- Why does PiP not display any view, even though the dummy video URL is playing in the background?

I am new to iOS development and would greatly appreciate any guidance or alternative approaches to achieve this functionality. Thank you!
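On iOS 15 and later there is a PiP path that does not require an AVPlayerLayer at all: AVPictureInPictureController.ContentSource with an AVPictureInPictureVideoCallViewController, which lets you place an arbitrary UIView (such as an avatar or the speaker's initials) in the PiP window. A minimal sketch, with illustrative view names that are not from the post:

```swift
import AVKit
import UIKit

// Sketch: "video call" style PiP without an AVPlayer. `sourceView` is the view
// in the normal UI that PiP detaches from; `avatarView` is any custom view you
// want shown while PiP is active.
func makeVideoCallPiP(sourceView: UIView, avatarView: UIView) -> AVPictureInPictureController? {
    guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }

    let callViewController = AVPictureInPictureVideoCallViewController()
    callViewController.preferredContentSize = CGSize(width: 120, height: 90)
    avatarView.frame = callViewController.view.bounds
    avatarView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    callViewController.view.addSubview(avatarView)

    let contentSource = AVPictureInPictureController.ContentSource(
        activeVideoCallSourceView: sourceView,
        contentViewController: callViewController)

    return AVPictureInPictureController(contentSource: contentSource)
}
```

Apple positions this content-source API at video-calling apps, so the usual video-call background-mode requirements may apply; treat the sketch as a starting point rather than a drop-in replacement for the AVPlayerLayer approach above.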
1 reply · 0 boosts · 751 views · Dec ’24
Does tvOS Support the Multiview Feature?
I've seen the Multiview feature on tvOS that displays a small grid icon when available. However, I only see this functionality in visionOS using AVMultiviewManager. Does this feature go by a different name on tvOS? Relevant links: https://www.reddit.com/r/appletv/comments/12opy5f/handson_with_the_new_multiview_split_screen/ and https://www.pocket-lint.com/how-to-use-multiview-apple-tv/#:~:text=You'll%20see%20a%20grid,running%20at%20the%20same%20time.
1 reply · 0 boosts · 556 views · Dec ’24
Is Apple Log open to developers for 3rd party apps?
Hello! I am building a video camera app and trying to implement Apple Log for the iPhone 15 Pro and 16 Pro. I am not seeing much documentation on it, and I notice the number of apps that use it is rather limited, fewer than 5, to be exact. Is Apple Log recording a feature that is accessible to developers? Here is a link to the documentation: https://developer.apple.com/documentation/avfoundation/avcapturecolorspace/applelog
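For what it's worth, Apple Log is exposed as a public AVCaptureColorSpace case; a sketch of opting into it is below (the format-selection details are my assumption, not from the post):

```swift
import AVFoundation

// Sketch: enable Apple Log on a device whose formats support it (e.g. the
// iPhone 15 Pro / 16 Pro rear cameras the poster targets). If the capture
// session auto-configures color spaces, also set
// session.automaticallyConfiguresCaptureDeviceForWideColor = false first.
func enableAppleLog(on device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: {
        $0.supportedColorSpaces.contains(.appleLog)
    }) else {
        print("No format on this device supports Apple Log")
        return
    }
    try device.lockForConfiguration()
    device.activeFormat = format
    device.activeColorSpace = .appleLog
    device.unlockForConfiguration()
}
```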
1 reply · 0 boosts · 485 views · Jan ’25
AVAssetWriter append audio/video streams concurrently in Real time recording setup
I see in most of the older sample code from Apple that, when using AVAssetWriter to append audio, video, and metadata samples in a real-time camera recording setup, calls to .append(sampleBuffer) are either synchronised using an NSLock or all samples are sent to the asset writer on the same dispatch queue, thereby preventing concurrent writes. However, I can't find any documentation stating that calls to assetWriterInput.append(sampleBuffer) for different media samples, such as audio and video, must not be made concurrently. Is it not valid for these methods to be executed in parallel, for instance:

    videoSamplesAssetWriterInput.append(videoSampleBuffer)  // from DispatchQueue 1
    audioSamplesAssetWriterInput.append(audioSampleBuffer)  // from DispatchQueue 2
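For reference, the conservative pattern from those samples, funnelling every append through one serial queue, looks roughly like this (a sketch with hypothetical input names; it does not answer whether parallel appends are officially supported):

```swift
import AVFoundation

// Sketch: serialize all appends on one queue so audio and video samples never
// reach the writer concurrently. The inputs are assumed to be attached to a
// started AVAssetWriter elsewhere.
final class SampleWriter {
    private let queue = DispatchQueue(label: "asset-writer.append")
    private let videoInput: AVAssetWriterInput
    private let audioInput: AVAssetWriterInput

    init(videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
        self.videoInput = videoInput
        self.audioInput = audioInput
    }

    func appendVideo(_ sampleBuffer: CMSampleBuffer) {
        queue.async {
            if self.videoInput.isReadyForMoreMediaData {
                self.videoInput.append(sampleBuffer)
            }
        }
    }

    func appendAudio(_ sampleBuffer: CMSampleBuffer) {
        queue.async {
            if self.audioInput.isReadyForMoreMediaData {
                self.audioInput.append(sampleBuffer)
            }
        }
    }
}
```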
1 reply · 0 boosts · 630 views · Jan ’25
Exporting Higher Resolution Spatial Videos in Final Cut
How do we export a spatial video at a higher resolution than 2200 x 2200? For example, I want to export a video that I edited in a spatial timeline as 4096 x 4096 resolution with spatial metadata to output as an MV-HEVC file. I looked under both Final Cut and Compressor export settings, but couldn't find a way to do this. Thanks for your help!
1 reply · 0 boosts · 473 views · Jan ’25
Help diagnosing crash with only system code in its stack trace
Hello,

I work on a video streaming app, and I have been working on this crash that we are seeing quite frequently (it is our #2 crasher at the moment). The stack trace indicates that an AVPictureInPictureController is being deallocated on a background thread. This leads to dangling AutoLayout constraints getting cleaned up, further resulting in an exception being thrown from the framework about the layout engine being accessed from a background thread.

We have internal analytics which indicate that the crash occurs after the user comes back to the app after putting it in the background, and switches playback from Picture in Picture mode back to the app's regular playback UI. What has me puzzled here is that, as I'm sure you know, we have no control over the PiP UI. It is entirely system-provided, and there is none of our app's code in the stack trace. The whole process is even initiated by a KVO on an AVPlayerController property, and our app doesn't use that class directly anywhere. So how did we manage to cause this process to happen on a background thread?

Add to this the fact that the bug only appeared once we switched to compiling with Xcode 16, and is overwhelmingly present only on devices running iOS 18. These factors lead me to believe that this is probably an OS issue. But before I go to file a feedback, I thought I would post here in case anyone has any ideas.

I have attached an instance of the crash log to this post: 2025-01-08_17-32-45.7003_-0800-ea8d5c3323e0f1fc059cf83f6ec86377bdae1788.crash
1 reply · 0 boosts · 429 views · Jan ’25