I am doing something similar to this post
Within an AVCaptureDataOutputSynchronizerDelegate method, I create a pixelBuffer using CVPixelBufferCreate with the following attributes:
kCVPixelBufferIOSurfacePropertiesKey as String: true,
kCVPixelBufferIOSurfaceOpenGLESTextureCompatibilityKey as String: true
When I copy the data from the vImage buffer "rotatedImageBuffer", I get the following error:
Thread 10: EXC_BAD_ACCESS (code=1, address=0x14caa8000)
I get the same error with memcpy and data.copyBytes (not running them at the same time obviously).
If I use CVPixelBufferCreateWithBytes, I do not get this error. However, CVPixelBufferCreateWithBytes does not let you include attributes (see linked post above).
I am using vImage because I need the original CVPixelBuffer from the camera output and a rotated version with a different color scheme.
// Copy to pixel buffer
let attributes: NSDictionary = [
    kCVPixelBufferIOSurfacePropertiesKey as String: true,
    kCVPixelBufferIOSurfaceOpenGLESTextureCompatibilityKey as String: true,
]
var colorBuffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(rotatedImageBuffer.width), Int(rotatedImageBuffer.height), kCVPixelFormatType_32BGRA, attributes, &colorBuffer)
//let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(rotatedImageBuffer.width), Int(rotatedImageBuffer.height), kCVPixelFormatType_32BGRA, rotatedImageBuffer.data, rotatedImageBuffer.rowBytes, nil, nil, attributes as CFDictionary, &colorBuffer) // does NOT produce the error, but also does not apply the attributes
guard status == kCVReturnSuccess, let colorBuffer = colorBuffer else {
    print("Failed to create buffer")
    return
}
let lockFlags = CVPixelBufferLockFlags(rawValue: 0)
guard kCVReturnSuccess == CVPixelBufferLockBaseAddress(colorBuffer, lockFlags) else {
    print("Failed to lock base address")
    return
}
let colorBufferMemory = CVPixelBufferGetBaseAddress(colorBuffer)!
let data = Data(bytes: rotatedImageBuffer.data, count: rotatedImageBuffer.rowBytes * Int(rotatedImageBuffer.height))
data.copyBytes(to: colorBufferMemory.assumingMemoryBound(to: UInt8.self), count: data.count) // Fails here
//memcpy(colorBufferMemory, rotatedImageBuffer.data, rotatedImageBuffer.rowBytes * Int(rotatedImageBuffer.height)) // Also produces the same error
CVPixelBufferUnlockBaseAddress(colorBuffer, lockFlags)
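For completeness, a stride-aware copy in place of the single memcpy/copyBytes call (between the lock and unlock above) would look roughly like this; I mention it because CVPixelBufferCreate picks its own bytes-per-row, which does not have to match rotatedImageBuffer.rowBytes:
// Copy row by row so a difference between the vImage rowBytes and the
// CVPixelBuffer's bytesPerRow cannot run past the end of either buffer.
let srcRowBytes = rotatedImageBuffer.rowBytes
let dstRowBytes = CVPixelBufferGetBytesPerRow(colorBuffer)
let rowLength = min(srcRowBytes, dstRowBytes)
var srcRow = rotatedImageBuffer.data!
var dstRow = colorBufferMemory
for _ in 0..<Int(rotatedImageBuffer.height) {
    memcpy(dstRow, srcRow, rowLength)
    srcRow = srcRow.advanced(by: srcRowBytes)
    dstRow = dstRow.advanced(by: dstRowBytes)
}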
I have added some custom views to my PiP (Picture in Picture) window. These controls disappeared after opening the camera when building with Xcode 16 on iOS 18; it turns out the custom views were not removed but appear to be obscured. They were displayed normally when building with Xcode 15.4. I would like to ask how to make my custom views display normally.
I’m using ScreenCaptureKit on macOS to grab frames and measure end-to-end latency (capture → my delegate callback). For each CMSampleBuffer I read:
let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
to get the “capture” timestamp, and I also extract the mach-absolute display time:
let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: false) as? [[SCStreamFrameInfo: Any]]
let displayMach = attachments?.first?[.displayTime] as? UInt64
// convert mach ticks to seconds...
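The conversion itself is the standard mach_timebase_info scaling, roughly:
import Darwin

func seconds(fromMachTicks ticks: UInt64) -> Double {
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)
    let nanoseconds = Double(ticks) * Double(timebase.numer) / Double(timebase.denom)
    return nanoseconds / 1_000_000_000
}

// CACurrentMediaTime() is itself mach_absolute_time() converted to seconds,
// so this should put displayTime into the same unit for the comparison.
let displayTimeSeconds = seconds(fromMachTicks: displayMach ?? 0)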
Then I compare both against the current time:
let now = CACurrentMediaTime()
let latencyFromPTS = now - pts
let latencyFromDisplay = now - displayTimeSeconds
But I consistently see negative values for both calculations—i.e. the PTS or displayTime often end up numerically larger than now. This suggests that the “presentation timestamp” and the mach-absolute display time are coming from a different epoch or clock domain than CACurrentMediaTime().
Questions:
Which clocks/epochs does ScreenCaptureKit use for PTS and for .displayTime?
How can I align these timestamps with CACurrentMediaTime() so that now - pts and now - displayTime reliably yield non-negative real-world latencies?
Any pointers on the correct clock conversions or APIs to use would be greatly appreciated.
When I play an HDR video in the iPhone Photos app, I can see the HDR effect obviously. But if this HDR video is played continuously for more than 30-40 minutes, the HDR effect will disappear and the brightness will be compressed to the SDR range. This issue will appear on any iPhone.
Depending on the phone, it may be 20-30 minutes, or 30-40 minutes, or even a few minutes, such as iPhone 12 mini.
Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect will disappear and the screen brightness will dim. The currentEDRHeadroom will also gradually decrease to 1.
Note, test it with an HDR video longer than 1 hour, and if the video is short, please loop it.
My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render any HDR video.
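To make the symptom measurable, a simple way to watch the headroom collapse is to poll UIScreen's EDR properties while the video plays (a minimal sketch, assuming iOS 16 or later):
import UIKit

// Log the screen's EDR headroom once per second; when the system clamps HDR,
// currentEDRHeadroom falls back toward 1.0 even though potentialEDRHeadroom stays high.
let headroomTimer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
    let screen = UIScreen.main
    print("currentEDRHeadroom:", screen.currentEDRHeadroom,
          "potentialEDRHeadroom:", screen.potentialEDRHeadroom)
}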
I use ReplayKit for system-level screen recording. I want to determine whether the screen is in landscape mode from the CMSampleBuffer callback, but the CMSampleBuffer does not seem to come with this information. The other APIs for obtaining the screen orientation are restricted in the background. I want to know whether the screen's rotation can be obtained in real time while in the background.
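One caveat to the above: I have not verified whether the RPVideoSampleOrientationKey attachment is populated in my case (this is an assumption, for a broadcast upload extension); if it is, something like this rough sketch would avoid the background-restricted APIs:
import ReplayKit
import CoreMedia
import ImageIO

// Reads the orientation value ReplayKit attaches to broadcast sample buffers, if present.
func orientation(of sampleBuffer: CMSampleBuffer) -> CGImagePropertyOrientation? {
    guard let raw = CMGetAttachment(sampleBuffer,
                                    key: RPVideoSampleOrientationKey as CFString,
                                    attachmentModeOut: nil) as? NSNumber else {
        return nil
    }
    return CGImagePropertyOrientation(rawValue: raw.uint32Value)
}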
Hi All. I'm working on a Single Sign-On feature in my application to let customers sign in to their TV provider. I need to add the Video Subscriber SSO entitlement (com.apple.developer.video-subscriber-single-sign-on) to the app, but I found out that it's a special entitlement and I need to contact Apple to enable it for my Apple account. On https://developer.apple.com/account I navigated to Support -> Contact Us -> Development and Technical -> Entitlements and asked about the missing entitlement (ticket ID 102478794279). The support team couldn't help me; they redirected me to the operations team. I've been waiting for a few months now, but they keep telling me to keep waiting.
Is there a better way to contact Apple and get Video Subscriber SSO entitlement in an efficient way?
I have been taking images from the iOS video camera feed and have encountered an issue: when you take images from the wideCamera stream, this consumes about half the phone's CPU. The same is not the case when you take images from the telephotoCamera video stream.
Is there a way of disabling the extra processing that is being done?
Hello,
I'm Soonwon.
We’re currently developing a UVC camera device and trying to stream MJPEG video via AVFoundation on macOS. However, we’re running into a problem with custom resolutions.
When we try to use AVFoundation on macOS to capture MJPEG video at 1000x6000, the stream is not accepted or simply doesn’t work. Lower resolutions work fine.
(Interestingly, using the same device on iPadOS, we can capture the 1000x6000 MJPEG stream successfully by using AVCaptureSessionPresetInputPriority.)
Is there any way to receive custom-resolution MJPEG streams (like 1000x6000) from a UVC device using AVFoundation on macOS?
Are there specific session presets, entitlements, or known limitations that affect MJPEG handling at custom resolutions on macOS?
Does macOS handle MJPEG differently from iPadOS in AVFoundation?
Any insight or guidance would be greatly appreciated. Thank you!
NSError *error = nil;
if ([selectedDevice lockForConfiguration:&error]) {
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    BOOL foundFormat = NO;
    for (AVCaptureDeviceFormat *format in selectedDevice.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription); // kept for inspection while debugging
        if (dims.width == 1000 && dims.height == 6000) {
            selectedDevice.activeFormat = format;
            foundFormat = YES;
            break;
        }
    }
    if (!foundFormat) {
        NSLog(@"Failed to find a 1000x6000 format");
        [session commitConfiguration];
        [selectedDevice unlockForConfiguration];
        return false;
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:selectedDevice error:&error];
    if (error || ![session canAddInput:input]) {
        NSLog(@"Failed to add video input: %@", error.localizedDescription);
        [session commitConfiguration];
        [selectedDevice unlockForConfiguration];
        return false;
    }
    [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;
    output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
    [output setSampleBufferDelegate:delegate queue:queue];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }

    [session commitConfiguration];
    [selectedDevice unlockForConfiguration];
} else {
    NSLog(@"Failed to lock device for configuration: %@", error.localizedDescription);
}
// start~
I would like to play WebM videos on my iPhone. I understand that it is basically impossible to play WebM videos on an iPhone, but is there any way to display them? I would like to know if there is an official Swift SDK or development kit released by Apple, or, if not, whether there are any third-party products.
Recurring crash on install of any app with the new sourceVideoTrackProvider.next()
dyld[41966]: Symbol not found: _$sSo19AVAssetReaderOutputC12AVFoundationE8ProviderC4nextxSgyYaKFTjTu
Referenced from: <79AA2BE0-A6B4-32F5-A804-E84BBE5D1AEA> /Users/<username>/Library/Developer/Xcode/DerivedData/TrackProviderCrash-bbbhjptcxnmfdcackxtpucnunxyc/Build/Products/Debug-maccatalyst/TrackProviderCrash.app/Contents/MacOS/TrackProviderCrash.debug.dylib
Expected in: <1B847AF9-7973-3B28-95C2-09E73F6DD50B> /usr/lib/swift/libswiftAVFoundation.dylib
Can be reproduced with the current Xcode Beta 4 by running on Mac Catalyst or macOS.
https://developer.apple.com/documentation/AVFoundation/converting-projected-video-to-apple-projected-media-profile
The crash goes away if you comment out lines 154-158 and 164-170, which are the while let sampleBuffer = try await sourceVideoTrackProvider.next() { /* other code */ } loop.
It can also be reproduced by adding the code below to a Mac Catalyst project:
import AVKit
let asset: AVURLAsset = .init(url: Bundle.main.url(forResource: "SomeVideo.mp4", withExtension: nil)!)
let videoReader = try! AVAssetReader(asset: asset)
let videoTracks = try! await asset.loadTracks(withMediaCharacteristic: .visual)
// Get the side-by-side video track.
let videoTrack = videoTracks.first!
let videoInputTrack = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
let sourceVideoTrackProvider: AVAssetReaderOutput.Provider<CMReadySampleBuffer<CMSampleBuffer.DynamicContent>> = videoReader.outputProvider(for: videoInputTrack)
//Comment out this
while let sb = try! await sourceVideoTrackProvider.next() {
}
One thing I've noticed on tvOS 26 is that if you try to set the AVPlayerViewController customInfoViewControllers property while the Content Tabs are on screen, your app will crash.
*** Terminating app due to uncaught exception 'UIViewControllerHierarchyInconsistency', reason: 'trying to add child view controller that is already presented: <AVInfoPanelViewController: 0x1030cdc00>'
*** First throw call stack:
(0x18a7167bc 0x189a77510 0x18a7166a8 0x1ab425658 0x1b2ee9d54 0x1b2efcd60 0x1b2eaf3f0 0x1080f744c 0x107e021a8 0x107e01b3c 0x18de41c14 0x18de41ba8 0x18de48d28 0x18ad9e358 0x101fac5f0 0x101fc6228 0x101fe7278 0x101fbc6fc 0x101fbc63c 0x18a67a2e0 0x18a679418 0x18a673b34 0x1937e4d5c 0x1abb36588 0x1abb3ae80 0x1aae9dec4 0x108610174 0x1086100e4 0x108615140 0x189abd4d0)
I've logged a feedback (FB19554461) but it's getting awfully late in the dev cycle. So I've been trying to think of a workaround.
The problem is that customInfoViewControllers is pretty declarative in nature. There are no properties or delegate methods I am aware of that let me know when they are displaying or not.
One trick I came up with was checking whether my custom info view controller's view is "visible" or not. I put that in quotes because it turns out it can be visible even when I think it's not: when the transport bar is scrolled to the top, my custom VC still has its top pixels showing, so it gets a viewDidAppear call. So instead I check whether my view controller's view is completely visible, i.e., based on the result of CGRect's contains method, as in the sketch below. And that works! But the problem is it only accounts for my own custom info view controllers, not the standard one that Apple provides, and I can't think of any way to know whether that one is showing.
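Roughly, the check looks like this (simplified; myInfoViewController stands in for my actual custom info view controller):
// True only when the custom info view controller's view is fully on screen,
// not just peeking in at the top while the transport bar is at the top.
func isCustomInfoPanelFullyVisible() -> Bool {
    guard let view = myInfoViewController.viewIfLoaded,
          let window = view.window else {
        return false
    }
    let frameInWindow = view.convert(view.bounds, to: window)
    return window.bounds.contains(frameInWindow)
}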
Any ideas?
Has anyone experienced this bug: when playing a video, Bluetooth headphones lose audio? My workaround is to select one of the other audio outputs, then go back and select the affected headphones again.
Hi there,
I want to set the iPhone camera to "S mode" (Shutter Priority, in camera terminology), which is a semi-automatic exposure mode where the shutter speed is fixed or set manually.
However, setting the shutter speed manually disables auto exposure.
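For concreteness, by "setting the shutter speed manually" I mean the custom exposure mode, roughly like this (a minimal sketch; switching to custom exposure is exactly what turns auto exposure off):
import AVFoundation

// Fixing the shutter speed this way puts the device into custom exposure mode,
// so ISO/brightness are no longer adjusted automatically.
func lockShutterSpeed(on device: AVCaptureDevice, seconds: Double) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let duration = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
    device.setExposureModeCustom(duration: duration,
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)
}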
So is there a way to keep auto exposure on while restricting the shutter speed?
Also, I would like to keep a low frame rate, e.g. 30 fps. Would I be able to set the shutter speed independently of the frame rate?
Here's the code for setting up the camera
Best,
Hi,
I have an app that displays tens of short (<1mb) mp4 videos stored in a remote server in a vertical UICollectionView that has horizontally scrollable sections.
I'm caching all mp4 files on disk after downloading, and I also have an in-memory cache that holds a limited number (around 30) of players. The players I'm using are simple views that wrap an AVPlayerLayer and its AVPlayerItem, along with a few additional UI components.
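For context, the setup is essentially the following (a simplified sketch; PlayerCache and PlayerView are illustrative stand-ins for my actual classes):
import AVFoundation
import UIKit

// A view that wraps an AVPlayerLayer and its AVPlayerItem, as described above.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    private var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    init(url: URL) {
        super.init(frame: .zero)
        playerLayer.player = AVPlayer(playerItem: AVPlayerItem(url: url))
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

// In-memory cache holding a limited number of players, keyed by the video's file URL.
final class PlayerCache {
    static let shared = PlayerCache()
    private let cache = NSCache<NSURL, PlayerView>()

    private init() { cache.countLimit = 30 }

    func playerView(for url: URL) -> PlayerView {
        if let cached = cache.object(forKey: url as NSURL) { return cached }
        let view = PlayerView(url: url)
        cache.setObject(view, forKey: url as NSURL)
        return view
    }
}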
The scrolling performance was good before iOS 26, but with the release of iOS 26 I noticed significant stuttering during scrolling while creating players with a fileUrl. It happens even if I use the same video file cached on disk for every cell for testing.
I also started getting this kind of log messages after the players are deinitialized:
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1107
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
There's also another log message that I see occasionally, but I don't know what triggers it.
<< FigXPC >> signalled err=-16152 at <>:1683
Is there anyone else that experienced this kind of problem with the latest release?
Also, I'm wondering what the best way to resolve the issue is. I could increase the size of the memory cache to something large like 100, but I'm not sure that's an acceptable solution because:
1. There would be 100 player instances in memory at all times.
2. There would still be stuttering during the initial loading of the videos from the web.
Any help is appreciated!
Good day.
A video I created via iOS AVAssetWriter with the following settings:
let videoWriterInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 2_000_000,
            AVVideoMaxKeyFrameIntervalKey: 30
        ],
    ]
)
let audioWriterInput = AVAssetWriterInput(
    mediaType: .audio,
    outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44100,
        AVEncoderBitRateKey: 128000
    ]
)
When it is split into fMP4 HLS format using ffmpeg, the video cannot be played on iOS, failing with the following error:
CoreMediaErrorDomain error -12848
However, the video plays normally in Android players, browser-based HLS players, and VLC Media Player.
Please assist. Thank you.
We build mobile apps for creators to edit their videos. After editing, the creator has to export the video so it can be uploaded to YouTube. The export is a time-consuming and GPU-intensive process. The creator can exit the app for various reasons, such as receiving a call or putting the app in the background, and this causes the export to fail :(
With this limitation in mind, Apple announced that iOS 26 would start to support background GPU access. Here is the official documentation: https://developer.apple.com/documentation/BundleResources/Entitlements/com.apple.developer.background-tasks.continued-processing.gpu
When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this ticket (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer Forums, in which what appears to be an Apple engineer says it is supported ONLY on iPadOS 26. This is a very big bummer for us.
96% of our users are on iPhone (compared to iPad), and the official documentation above states that this feature should work on iOS 26.
This feature is extremely important for having the best user experience and reducing user frustration and will be useful for other video editing apps.
Looking forward to a resolution.
Is there sample code on how to use VTFrameProcessorOpticalFlow?
How do I use the iPhone 16 Pro to capture HDR Dolby Vision and use VideoToolbox to encode the bitstream with Dolby Vision metadata?
I know the iOS system can achieve this, but I would like to see source code or sample code showing how to do it.
I’m implementing FairPlay offline streaming on iOS and ran into a question about DRM expiration handling.
As far as I understand, when issuing a FairPlay offline license, there are typically two time windows:
1. The period during which the user can start offline playback (the longer “rental window”).
2. Once playback starts, the duration allowed to complete playback (the shorter “playback window”).
I’d like to display this information (the remaining validity or expiration time) in the app’s UI next to each downloaded asset.
My question is:
👉 Is there a way to programmatically check or retrieve the expiration time for a FairPlay offline asset on the client side (via AVFoundation or AVContentKeySession)?
Any guidance or best practices for surfacing DRM expiration info in the UI would be greatly appreciated.
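In case it helps frame the question: the fallback I'm considering (an assumption on my part, since AVFoundation/AVContentKeySession does not seem to expose this directly) is to have the key server return the two windows out-of-band when the persistable key is issued, persist them next to the downloaded asset, and compute the remaining time locally:
import Foundation

// Illustrative bookkeeping only; the two windows come from the key server, not from AVFoundation.
struct OfflineLicenseInfo: Codable {
    let issuedAt: Date
    let rentalWindow: TimeInterval     // time allowed to start playback
    let playbackWindow: TimeInterval   // time allowed to finish playback once started
    var firstPlayedAt: Date?

    var expirationDate: Date {
        let rentalExpiry = issuedAt.addingTimeInterval(rentalWindow)
        guard let started = firstPlayedAt else { return rentalExpiry }
        return min(started.addingTimeInterval(playbackWindow), rentalExpiry)
    }

    var remainingTime: TimeInterval { max(0, expirationDate.timeIntervalSinceNow) }
}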
It seems that VideoToolbox on iOS 26 is having trouble decoding H.264 streams. I have filed a bug report: https://feedbackassistant.apple.com/feedback/20741484