Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post

Replies

Boosts

Views

Activity

Bluetooth headphones
Has anyone experienced this bug: when playing a video, the Bluetooth headphones lose audio? My workaround is to select one of the other audio outputs, then go back and select the affected headphones again (a programmatic version of this workaround is sketched below).
1
0
90
Aug ’25
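A minimal Swift sketch of that manual workaround, assuming an AVAudioSession configured with the .playAndRecord category (the .speaker override is ignored otherwise); this is a guess at a mitigation, not a confirmed fix:

    import AVFoundation

    // Hypothetical mitigation: force the route away from the headphones and
    // back, mirroring the manual "pick another output, then re-select" step.
    func reselectAudioRoute() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.overrideOutputAudioPort(.speaker) // route to built-in speaker
            try session.overrideOutputAudioPort(.none)    // let the system re-pick Bluetooth
        } catch {
            print("Audio route reset failed: \(error)")
        }
    }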
High shutter speed with low frame rate and auto exposure
Hi there, I want to set the iPhone camera to "S mode" (shutter priority, in camera terminology), which is a semi-automatic exposure mode where the shutter speed is fixed or set manually. However, setting the shutter speed manually disables auto exposure. Is there a way to keep auto exposure on while constraining the shutter speed? I would also like to keep a low frame rate, e.g. 30 fps. Would I be able to set the shutter speed independently of the frame rate? (An approximation is sketched below.) Here's the code for setting up the camera: … Best,
1
0
487
2w
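AVFoundation has no built-in shutter-priority mode, but a rough approximation can be sketched: fix the exposure duration, pin the frame rate via frame durations (frame rate is independent of shutter speed as long as the shutter is no longer than one frame), and re-adjust ISO yourself. A minimal sketch, assuming a valid, lockable AVCaptureDevice whose active format supports 30 fps:

    import AVFoundation

    func configureShutterPriority(device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        // Frame rate is controlled by frame duration, separately from shutter speed.
        let frameDuration = CMTime(value: 1, timescale: 30) // 30 fps
        device.activeVideoMinFrameDuration = frameDuration
        device.activeVideoMaxFrameDuration = frameDuration

        // Fix the shutter at 1/500 s. currentISO keeps today's ISO, but in
        // .custom mode it no longer floats, so this is not true auto exposure.
        device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 500),
                                     iso: AVCaptureDevice.currentISO,
                                     completionHandler: nil)

        // To emulate "S mode", observe exposureTargetOffset via KVO and call
        // setExposureModeCustom again with a corrected ISO when it drifts.
    }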
Threading guarantees with AVCaptureVideoDataOutput
I'm writing some camera functionality that uses AVCaptureVideoDataOutput. I've set it up so that it calls my AVCaptureVideoDataOutputSampleBufferDelegate on a background thread, by making my own dispatch_queue and configuring the AVCaptureVideoDataOutput. My question: if I configure my AVCaptureSession differently, or even stop it altogether, is this guaranteed to flush all pending jobs on my background thread? I have a more practical example below, showing how I am accessing something owned by the foreground thread from the background thread, but I wonder when/how it's safe to clean up that resource (see the drain sketch below). I have a setup similar to the following:

    // Foreground thread logic
    dispatch_queue_t queue = dispatch_queue_create("avf_camera_queue", nullptr);

    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    setupInputDevice(captureSession); // Connects the AVCaptureDevice...

    // Some arbitrary data to be attached to each frame, owned by the foreground thread
    FrameMetaData frameMetaData = ...;

    MySampleBufferDelegate *sampleBufferDelegate = [[MySampleBufferDelegate alloc] init];

    // Capture frameMetaData by reference in the lambda
    [sampleBufferDelegate setFrameMetaDataGetter:[&frameMetaData]() { return &frameMetaData; }];

    AVCaptureVideoDataOutput *captureVideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureVideoDataOutput setSampleBufferDelegate:sampleBufferDelegate queue:queue];
    [captureSession addOutput:captureVideoDataOutput];

    [captureSession startRunning];
    [captureSession stopRunning];
    // Is it now safe to destroy frameMetaData, or do we need a manual barrier?

And then in MySampleBufferDelegate:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
        // Invokes the getter set above
        FrameMetaData *frameMetaData = frameMetaDataGetter();
        emitSampleBuffer(sampleBuffer, frameMetaData);
    }
0
0
312
Sep ’25
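For the cleanup question, a common conservative pattern (a sketch, not documented AVFoundation behavior) is to synchronously drain the delegate queue after stopping the session, since stopRunning by itself does not promise that callbacks already enqueued on your queue have finished. In Swift:

    import AVFoundation

    func tearDown(session: AVCaptureSession, delegateQueue: DispatchQueue) {
        session.stopRunning()
        // Any didOutputSampleBuffer callback enqueued before this point runs
        // first; once sync returns, nothing is left in flight on the queue.
        delegateQueue.sync {}
        // Only now is it safe to destroy foreground-owned state like frameMetaData.
    }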
AVPlayer loading performance problem in iOS 26
Hi, I have an app that displays tens of short (<1 MB) MP4 videos, stored on a remote server, in a vertical UICollectionView with horizontally scrollable sections. I'm caching all MP4 files on disk after downloading, and I also have an in-memory cache that holds a limited number (around 30) of players. The players I'm using are simple views that wrap an AVPlayerLayer and its AVPlayerItem, along with a few additional UI components. Scrolling performance was good before iOS 26, but with the release of iOS 26 I noticed significant stuttering during scrolling while creating players with a file URL. It happens even if I use the same video file cached on disk for every cell for testing. I also started getting log messages like these after the players are deinitialized:

    <<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1107
    <<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
    <<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095

There's also another log message that I see occasionally, but I don't know what triggers it:

    << FigXPC >> signalled err=-16152 at <>:1683

Has anyone else experienced this kind of problem with the latest release? I'm also wondering about the best way to resolve the issue (one idea is sketched below). I could increase the size of the memory cache to something large like 100, but I'm not sure that is an acceptable solution because:
1. There would be 100 player instances in memory at all times.
2. There would still be stuttering during the initial loading of the videos from the web.
Any help is appreciated!
0
0
146
2w
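One direction worth trying (a sketch built on the assumption that the stutter comes from synchronous work during AVPlayerItem creation; not a confirmed iOS 26 fix): preload each asset's keys off the main thread, and only hand a fully loaded asset to a player while scrolling.

    import AVFoundation

    // Preload the asset before its cell needs it; creating an AVPlayerItem
    // from an already-loaded asset does less work on the main thread.
    func makePreloadedItem(for fileURL: URL) async throws -> AVPlayerItem {
        let asset = AVURLAsset(url: fileURL)
        _ = try await asset.load(.isPlayable, .duration)
        return AVPlayerItem(asset: asset)
    }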
CoreMediaErrorDomain error -12848
Good day. I created a video via iOS AVAssetWriter with the following settings:

    let videoWriterInput = AVAssetWriterInput(
        mediaType: .video,
        outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.hevc,
            AVVideoWidthKey: 1080,
            AVVideoHeightKey: 1920,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 2_000_000,
                AVVideoMaxKeyFrameIntervalKey: 30
            ],
        ]
    )

    let audioWriterInput = AVAssetWriterInput(
        mediaType: .audio,
        outputSettings: [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVNumberOfChannelsKey: 2,
            AVSampleRateKey: 44100,
            AVEncoderBitRateKey: 128000
        ]
    )

When it is split into fMP4 HLS format using ffmpeg, the video cannot be played on iOS, failing with CoreMediaErrorDomain error -12848. However, the video plays normally in Android and browser HLS players, and also in VLC Media Player. Please assist (a possible alternative is sketched below). Thank you.
1
0
339
2w
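One way to rule out the ffmpeg re-packaging step (a sketch, assuming iOS 14+; input setup and playlist generation are omitted) is to have AVAssetWriter emit Apple-compatible fMP4 segments directly via its delegate:

    import AVFoundation
    import UniformTypeIdentifiers

    final class SegmentWriter: NSObject, AVAssetWriterDelegate {
        let writer: AVAssetWriter

        override init() {
            writer = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
            writer.outputFileTypeProfile = .mpeg4AppleHLS // fMP4 for HLS
            writer.preferredOutputSegmentInterval = CMTime(seconds: 6, preferredTimescale: 1)
            writer.initialSegmentStartTime = .zero
            super.init()
            writer.delegate = self
            // ...add the same video/audio inputs as in the post...
        }

        func assetWriter(_ writer: AVAssetWriter,
                         didOutputSegmentData segmentData: Data,
                         segmentType: AVAssetSegmentType,
                         segmentReport: AVAssetSegmentReport?) {
            // Persist segmentData as the init/media segment and reference it
            // from the .m3u8 playlist (playlist generation not shown).
        }
    }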
Memory leak when processing stereoscopic video frames (makeMutablePixelBuffer())
Hi, I downloaded and ran https://developer.apple.com/documentation/realitykit/rendering-stereoscopic-video-with-realitykit and noticed that memory usage grows linearly. I replaced the sample video with a different 8K side-by-side video, and the app crashed almost immediately due to the memory leak. It looks like the culprit is the makeMutablePixelBuffer() function: the allocated pixel buffers are not recycled after being used (a pooling sketch is below). The screenshot is from a physical device.
0
0
291
2w
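A minimal sketch of the pooling idea, assuming the fix is to allocate from a CVPixelBufferPool rather than creating a fresh buffer per frame (the function name and attributes here are illustrative, not the sample's actual code):

    import CoreVideo

    func makePooledPixelBuffer(pool: inout CVPixelBufferPool?,
                               width: Int, height: Int) -> CVPixelBuffer? {
        if pool == nil {
            let attrs: [CFString: Any] = [
                kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey: width,
                kCVPixelBufferHeightKey: height,
                kCVPixelBufferIOSurfacePropertiesKey: [String: Any]()
            ]
            CVPixelBufferPoolCreate(nil, nil, attrs as CFDictionary, &pool)
        }
        var buffer: CVPixelBuffer?
        // Buffers are returned to the pool when released, capping memory use.
        if let pool { CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer) }
        return buffer
    }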
On iOS 26, in our video playback app (using AVPlayer), sound and video are out of sync when playing after seeking
Our app plays TS files on an iPhone. The app fragments the TS files, creates an M3U8 playlist, converts them to HLS (HTTP Live Streaming), and then uses AVPlayer to play the video content. On a device running iOS 26, after starting playback and seeking, restarting playback causes the video and audio to be out of sync (by about 2-3 seconds, depending on the situation). This also occurs on iPadOS/macOS 26. This issue was not observed prior to iOS 18. We are trying to fix this issue on the app side, and we have the following questions: The behavior of AVPlayer differs between iOS 26 and previous versions; has there been a change that could explain this, or is it a bug? We tried pausing before seeking, but it didn't seem to have any effect. Are there any APIs or workarounds that can improve this (one candidate is sketched below)? We would appreciate any other helpful documents or URLs.
0
0
107
1w
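A candidate workaround to try (a sketch under the assumption that imprecise seeks leave the audio and video renderers at slightly different positions; not a confirmed fix): seek with zero tolerance and resume only after the seek completes.

    import AVFoundation

    func accurateSeek(_ player: AVPlayer, to seconds: Double) {
        player.pause()
        let target = CMTime(seconds: seconds, preferredTimescale: 600)
        // Zero tolerance forces a frame-accurate seek; resuming in the
        // completion handler restarts audio and video from the same point.
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
            if finished { player.play() }
        }
    }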
Broadcast Upload Extension stops transmitting data
Currently, I am using a Broadcast Upload Extension to obtain sample buffer data, and I transmit the screen-recording data to the app through an App Group and IPC (based on a local Unix domain socket). However, when the app goes to the background, to view videos in the photo album or other audio and video, the data transmission stops and the app can no longer receive the screen-recording data. I would like to ask how to solve this problem. I suspect that the system has suspended the extension's screen recording (the extension side of this pipeline is sketched below).
0
0
91
3d
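For reference, a minimal sketch of the extension side of such a pipeline (GroupSocket is a hypothetical stand-in for the post's Unix-domain-socket IPC, not a real API). Broadcast extensions generally keep receiving frames while the broadcast runs; it is typically the receiving app that gets suspended in the background and stops draining the socket:

    import ReplayKit
    import CoreMedia

    // Hypothetical stand-in for the Unix-domain-socket IPC from the post.
    enum GroupSocket {
        static func send(_ buffer: CMSampleBuffer) {
            // Serialize and write to the socket shared via the App Group (omitted).
        }
    }

    final class SampleHandler: RPBroadcastSampleHandler {
        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                          with sampleBufferType: RPSampleBufferType) {
            guard sampleBufferType == .video else { return }
            // Frames continue to arrive here even while the host app is backgrounded.
            GroupSocket.send(sampleBuffer)
        }
    }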