I am working on a macOS project where I take an AVCaptureSession's CVPixelBuffer and need to convert it into an MTLTexture for rendering. On macOS the pixel format is 2vuy, and there does not seem to be an obvious format mapping when creating a Metal texture. I have been able to convert it to a texture, but the color space seems to be off: it renders distorted colors with a double image.
I believe 2vuy is a single-plane format and I have tried to account for that, but I cannot tell what is wrong.
I have attached the CVPixelBuffer, the distorted MTLTexture, and a laundry list of errors.
On iOS my conversions are fine; it is only the macOS 2vuy pixel format that has issues.
My code for the conversion is also attached.
Any suggestions or guidance on how to properly convert a 2vuy CVPixelBuffer to an MTLTexture would be greatly appreciated.
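To illustrate, here is a minimal version of the kind of conversion I am attempting (a sketch, assuming a CVMetalTextureCache has already been created for the device and the pixel buffer is Metal compatible; as far as I can tell, .bgrg8Unorm_422 is Metal's packed equivalent of 2vuy, but that mapping is exactly the kind of thing I'd like confirmed):

import CoreVideo
import Metal

// Sketch: wrap a '2vuy' (kCVPixelFormatType_422YpCbCr8) CVPixelBuffer as a
// Metal texture via a CVMetalTextureCache (`cache` is assumed to exist).
func makeTexture(from pixelBuffer: CVPixelBuffer,
                 cache: CVMetalTextureCache) -> MTLTexture? {
    // Full pixel width: the packed 422 formats store two pixels per 32 bits,
    // but the texture is still declared at full width. Using width / 2 (or an
    // RGBA format at full width) produces exactly the kind of doubled,
    // distorted image described above.
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        .bgrg8Unorm_422, width, height, 0, &cvTexture)

    guard status == kCVReturnSuccess,
          let cvTexture,
          let texture = CVMetalTextureGetTexture(cvTexture) else { return nil }
    // Sampling yields Y in .g and Cb/Cr in .b/.r; the fragment shader must
    // still apply the video-range YCbCr -> RGB conversion matrix itself.
    return texture
}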
Many Thanks
Conversion_Logs.txt
ConversionCode.swift
I noticed that AVSampleBufferDisplayLayerContentLayer is not released when the AVSampleBufferDisplayLayer is removed and released.
The issue can be reproduced with this simple code:
import AVFoundation
import UIKit

class ViewController: UIViewController {
    var displayBufferLayer: AVSampleBufferDisplayLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let displayBufferLayer = AVSampleBufferDisplayLayer()
        displayBufferLayer.videoGravity = .resizeAspectFill
        displayBufferLayer.frame = view.bounds
        view.layer.insertSublayer(displayBufferLayer, at: 0)
        self.displayBufferLayer = displayBufferLayer

        DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
            self.displayBufferLayer?.flush()
            self.displayBufferLayer?.removeFromSuperlayer()
            self.displayBufferLayer = nil
        }
    }
}
In my real project I have multiple AVSampleBufferDisplayLayers created and removed across different view controllers, which is problematic because the number of leaked AVSampleBufferDisplayLayerContentLayer instances keeps increasing.
I wonder whether I should use a pool of AVSampleBufferDisplayLayers and reuse them (sketched below), though I'm slightly afraid that this could also lead to strange bugs.
Edit: It doesn't leak on an iOS 18 device, but it does leak on an iPad Pro running iOS 17.5.1.
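If I do go the reuse route, this is the kind of minimal pool I have in mind (a hypothetical sketch only, untested against the leak):

import AVFoundation

// Sketch of a trivial reuse pool for AVSampleBufferDisplayLayer, so layers
// are recycled instead of repeatedly created and destroyed.
final class DisplayLayerPool {
    private var idle: [AVSampleBufferDisplayLayer] = []

    func acquire() -> AVSampleBufferDisplayLayer {
        idle.popLast() ?? AVSampleBufferDisplayLayer()
    }

    func release(_ layer: AVSampleBufferDisplayLayer) {
        layer.flushAndRemoveImage()   // drop queued frames and the current image
        layer.removeFromSuperlayer()
        idle.append(layer)
    }
}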
I have an app that has a WKWebView for watching YouTube videos. When the videos are windowed the audio seems fine, positionally as well. All perfectly.
When I fullscreen the video and it goes into the native visionOS video player the audio messes up.
It will suddenly sound like it is inside your ears, or even in just one ear channel, or the position will be wrong. It might be fine for a moment, but the second I touch the controls or move the window, the sound jumps across the room, away from the window, or switches to stereo.
Sometimes, after exiting the window entirely, you can still hear the videos playing. If you then open the window back up, go to another screen, and open another video, you hear two videos playing at the same time with no way to stop the first one in the background, requiring a force restart of the app.
It is all sorts of glitchy. I haven't the slightest clue what is happening here. I strongly suspect this is a visionOS bug.
I tried using AVAudioSession to change some of the sound settings, and that makes zero difference in behavior.
Multiple testers have also reported this behavior and it has been seen on both visionOS 2.3 and 2.4 betas.
Thanks for the help! This is driving me mad! It is extremely consistent behavior!
I'm working on an application that uses the iPhone camera for scientific purposes, and as a result I would like to receive video in as unprocessed a format as possible.
In particular, I'm interested in getting pixel buffers that contain pretty much the Bayer data as the sensor sees it, with the minimum color processing possible.
Currently we configure the AVCaptureDevice to fix the focus and exposure, use a low ISO with no gain, and set the white balance gains to 1 (sketched below). AVCaptureVideoDataOutput is using 32BGRA.
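Roughly, the locking code looks like this (a simplified sketch of that configuration; error handling omitted):

import AVFoundation

// Sketch: fixed focus, custom exposure at minimum ISO, unit white-balance
// gains. Assumes the device supports custom exposure and locked white balance.
func lockDown(_ device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    if device.isFocusModeSupported(.locked) {
        device.focusMode = .locked
    }
    // Keep the current exposure duration, but pin ISO to the format's minimum.
    device.setExposureModeCustom(duration: AVCaptureDevice.currentExposureDuration,
                                 iso: device.activeFormat.minISO,
                                 completionHandler: nil)
    // Unit gains, i.e. no per-channel white-balance scaling.
    let gains = AVCaptureDevice.WhiteBalanceGains(redGain: 1.0,
                                                  greenGain: 1.0,
                                                  blueGain: 1.0)
    device.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
}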
What I'd like to do is remove any additional color and brightness processing such that the data is effectively processed with a linear transfer function (i.e. gamma function is 1).
I thought this might come down to the AVCaptureDevice activeColorSpace; we currently use P3_D65 for this. But there seem to be only a few choices (e.g. sRGB, HLG_BT2020), all of which I think affect the gamma.
So:
is it possible to control or specify the gamma / transfer function when using CaptureVideoDelegate?
if not, does one of the color space settings have a defined gamma function, so that I can effectively reverse it from the pixel data without losing too much information?
or is there a better way to capture video-ish speed images (15-30fps) from the camera sensor that skips processing like this?
Many thanks for any suggestions.
I'm using an iPhone 15 Pro, which has switched from Lightning to USB Type-C. My iOS version is 18.3. According to Apple's documentation, AVCaptureDevice.DeviceType should support external device types.
🔗 Apple's Official Documentation:
https://developer.apple.com/documentation/avfoundation/avcapturedevice/devicetype-swift.struct/external
The documentation clearly states that iPadOS 17.0+ and iOS 17.0+ support external devices. However, in my actual tests:
On iPhone, discoverySession does not detect any external devices.
On iPad, discoverySession can detect external devices without any issues.
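The discovery code is essentially the stock pattern (sketch):

import AVFoundation

// Sketch of the external-device discovery tested on both devices.
let discoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],   // external devices such as UVC cameras, 17.0+
    mediaType: .video,
    position: .unspecified)

print(discoverySession.devices) // empty on iPhone, populated on iPad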
My Question:
Does iPhone USB-C actually support external devices (e.g., UVC cameras)?
If not, why does Apple's documentation claim that iOS 17 supports external devices instead of specifying iPadOS 17 only?
Are serialized parameters already available inside -pluginInstanceAddedToDocument via FxParameterRetrievalAPI, or are they read later?
Our streaming app uses FairPlay-protected video streams, which previously worked fine when using AVAssetResourceLoaderDelegate to provide CKCs.
Recently, we migrated to AVContentKeySession, and while everything works as expected during regular playback, we encountered an issue with AirPlay.
Our CKC has a 120-second expiry, so we renew it by calling renewExpiringResponseData.
This triggers the didProvideRenewingContentKeyRequest delegate callback, and we respond with the updated CKC.
However, when streaming via AirPlay, both video and audio freeze exactly after 120 seconds.
To validate the issue, I tested with AVAssetResourceLoaderDelegate and found that I can reproduce the same freeze if I do not renew the key. This suggests that AirPlay is not accepting the renewed CKC when using AVContentKeySession.
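For clarity, the renewal flow is essentially this (a simplified sketch; fetchCKC, appCertificate, and contentID are placeholders for our real key-server round trip and identifiers):

import AVFoundation

let appCertificate = Data()                    // placeholder FPS certificate
let contentID = "skd://example".data(using: .utf8)
func fetchCKC(_ spc: Data) -> Data { Data() }  // placeholder key-server call

final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        respond(to: keyRequest)
    }

    // Invoked after we call session.renewExpiringResponseData(for:).
    func contentKeySession(_ session: AVContentKeySession,
                           didProvideRenewingContentKeyRequest keyRequest: AVContentKeyRequest) {
        respond(to: keyRequest)                // works locally, freezes over AirPlay
    }

    private func respond(to keyRequest: AVContentKeyRequest) {
        keyRequest.makeStreamingContentKeyRequestData(forApp: appCertificate,
                                                      contentIdentifier: contentID,
                                                      options: nil) { spc, error in
            guard let spc else { return }
            let ckc = fetchCKC(spc)
            keyRequest.processContentKeyResponse(
                AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckc))
        }
    }
}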
Additional Details:
This issue occurs across different iOS versions and various AirPlay devices.
The same content plays without issues when played directly on the device.
The renewal process is successful, and segments continue to load, but playback remains frozen.
Tried renewing the CKC a bit early (at 100 seconds).
I also tried setting player.usesExternalPlaybackWhileExternalScreenIsActive = true, but the issue persists.
We don't use persistentKey.
Is there anything else that needs to be considered for proper key renewal when AirPlaying?
Any help on how to fix this or confirmation if this is a known issue would be greatly appreciated.
Hello,
Is there a way to handle a 403 error returned by the server, e.g. when a token has expired?
I cannot find any information about this, and nothing I tried worked (addObserver, NotificationCenter with .AVPlayerItemNewErrorLogEntry, .AVPlayerItemPlaybackStalled, ...).
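The closest thing I have found is inspecting the item's error log when a new entry is posted, along these lines (a sketch; I have not confirmed that a 403 actually surfaces here, and playerItem stands in for the real item):

import AVFoundation

// Sketch: look for HTTP status codes in the error log whenever a new
// error log entry notification arrives.
let observer = NotificationCenter.default.addObserver(
    forName: AVPlayerItem.newErrorLogEntryNotification,
    object: playerItem,
    queue: .main) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let events = item.errorLog()?.events else { return }
    for event in events where event.errorStatusCode == 403 {
        // e.g. refresh the token and replace the current item
        print("403 from server:", event.errorComment ?? "no comment")
    }
}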
Thank you very much.
No video is playing and the same error keeps appearing. I have tried everything, including resetting. Please provide an update as soon as possible.
Error: The operation couldn't be completed. (CoreMediaErrorDomain error -42709.)
When we use AVPlayer to play DRM-encrypted streams on iOS 17.6.1, they do not play normally, and there is a high probability of a system restart. These are the relevant core error logs:
error 14:47:53.323369+0800 audiomxd [AirPlayError] carManager_copyProperty_block_invoke:499: got error -12784/0xFFFFCE10 kCMBaseObjectError_PropertyNotFound
error 14:47:53.323414+0800 audiomxd [SPEndpointManagerFactory] SidePlay Endpoint Manager creation failed with -72390/0xFFFEE53A
error 14:47:53.364949+0800 audiomxd [APBrowserCarSessionHelper] [0xF6AA] [Bonjour/WiFi] Unrecognized ConnectivityHelper event 101
error 14:47:53.375313+0800 audiomxd AddInstanceForFactory: No factory registered for id <CFUUID 0xa5c5118c0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
We're experiencing significant issues with AVPlayer when attempting to play partially downloaded HLS content in offline mode. Our app downloads HLS video content for offline viewing, but users encounter the following problems:
Excessive Loading Delay: When offline, AVPlayer attempts to load resources for up to 60 seconds before playing the locally available segments
Asset Loss: Sometimes AVPlayer completely loses the asset reference and fails to play the video on subsequent attempts
Inconsistent Behavior: The same partially downloaded asset might play immediately in one session but take 30+ seconds in another
Network Activity Despite Offline Settings: Despite configuring options to prevent network usage, AVPlayer still appears to be attempting network connections
These issues severely impact our offline user experience, especially for users with intermittent connectivity.
Technical Details
Implementation Context
Our app downloads HLS videos for offline viewing using AVAssetDownloadTask. We store the downloaded content locally and maintain a dictionary mapping of file identifiers to local paths. When attempting to play these videos offline, we experience the described issues.
Current Implementation
Here's our current implementation for playing the videos:
- (void)presentNativeAvplayerForVideo:(Video *)video navContext:(NavContext *)context {
    NSString *localPath = video.localHlsPath;
    if (localPath) {
        NSURL *videoURL = [NSURL URLWithString:localPath];
        NSDictionary *options = @{
            AVURLAssetPreferPreciseDurationAndTimingKey: @YES,
            AVURLAssetAllowsCellularAccessKey: @NO,
            AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
            AVURLAssetAllowsConstrainedNetworkAccessKey: @NO
        };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:options];
        AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
        NSArray *keys = @[@"duration", @"tracks"];
        [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
                AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
                playerViewController.player = player;
                [player play];
            });
        }];
        playerViewController.modalPresentationStyle = UIModalPresentationFullScreen;
        [context presentViewController:playerViewController animated:YES completion:nil];
    }
}
Attempted Solutions
We've tried several approaches to mitigate these issues:
Modified Asset Options:
NSDictionary *options = @{
    AVURLAssetPreferPreciseDurationAndTimingKey: @NO, // Changed to NO
    AVURLAssetAllowsCellularAccessKey: @NO,
    AVURLAssetAllowsExpensiveNetworkAccessKey: @NO,
    AVURLAssetAllowsConstrainedNetworkAccessKey: @NO,
    AVAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidRemoteReferenceToLocal)
};
Skipped Asynchronous Key Loading:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:nil];
Modified Player Settings:
player.automaticallyWaitsToMinimizeStalling = NO;
[playerItem setPreferredForwardBufferDuration:2.0];
Added Network Resource Restrictions:
playerItem.canUseNetworkResourcesForLiveStreamingWhilePaused = NO;
Used File URLs Instead of HTTP URLs where possible
Despite these attempts, the issues persist.
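One additional diagnostic we plan to add (a sketch in Swift for brevity; not yet in the shipping code) is checking what the system itself believes about the downloaded asset before handing it to the player:

import AVFoundation

// Sketch: inspect the asset cache before building the player item.
// `localURL` stands in for the stored AVAssetDownloadTask location.
let asset = AVURLAsset(url: localURL)
if let cache = asset.assetCache {
    print("playable offline:", cache.isPlayableOffline)
} else {
    // No asset cache: the URL is probably not the download-task location,
    // so AVPlayer will treat this as a plain (networked) HLS asset.
    print("no asset cache for this URL")
}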
Expected vs. Actual Behavior
Expected Behavior:
AVPlayer should immediately begin playback of locally available HLS segments
When offline, it should not attempt to load from network for more than a few seconds
Once an asset is successfully played, it should be reliably available for future playback
Actual Behavior:
AVPlayer waits 10-60 seconds before playing locally available segments
Network activity is observed despite all network-restricting options
Sometimes the player fails completely to play a previously available asset
Behavior is inconsistent between playback attempts with the same asset
Questions:
What is the recommended approach for playing partially downloaded HLS content offline with minimal delay?
Is there a way to force AVPlayer to immediately use available local segments without attempting to load from the network?
Are there any known issues with AVPlayer losing references to locally stored HLS assets?
What diagnostic steps would you recommend to track down the specific cause of these delays?
Does AVFoundation have specific timeouts for offline HLS playback that could be configured?
Any guidance would be greatly appreciated as this issue is significantly impacting our user experience.
Device Information
iOS Versions Tested: 14.5 - 18.1
Device Models: iPhone 12, iPhone 13, iPhone 14, iPhone 15
Xcode Version: 15.3-16.2.1
It's 2025, and trends in video storage and streaming have changed significantly. Nowadays, a CDN combined with domain-based video protection seems to be the most popular solution.
Does anyone have more insights into this technology or real-world experience with it?
Hi,
I am recording video using my app and setting the FPS with the code below, but sometimes the video is recorded at 20 FPS. Can someone please let me know what I am doing wrong?
private func eightBitVariantOfFormat() -> AVCaptureDevice.Format? {
    let activeFormat = self.videoDeviceInput.device.activeFormat
    let fpsToBeSupported: Int = 60
    debugPrint("fpsToBeSupported - \(fpsToBeSupported)" as AnyObject)

    let allSupportedFormats = self.videoDeviceInput.device.formats
    debugPrint("all formats - \(allSupportedFormats)" as AnyObject)

    let activeDimensions = CMVideoFormatDescriptionGetDimensions(activeFormat.formatDescription)
    debugPrint("activeDimensions - \(activeDimensions)" as AnyObject)

    let filterBasedOnDimensions = allSupportedFormats.filter({ (CMVideoFormatDescriptionGetDimensions($0.formatDescription).width == activeDimensions.width) && (CMVideoFormatDescriptionGetDimensions($0.formatDescription).height == activeDimensions.height) })
    if filterBasedOnDimensions.isEmpty {
        // Dimension not found. Required format not found to handle.
        debugPrint("Dimension not found" as AnyObject)
        return activeFormat
    }
    debugPrint("filterBasedOnDimensions - \(filterBasedOnDimensions)" as AnyObject)

    let filterBasedOnMaxFrameRate = filterBasedOnDimensions.compactMap({ format in
        let videoSupportedFrameRateRanges = format.videoSupportedFrameRateRanges
        if !videoSupportedFrameRateRanges.isEmpty {
            let contains = videoSupportedFrameRateRanges.contains(where: { Int($0.maxFrameRate) >= fpsToBeSupported })
            if contains {
                return format
            } else {
                return nil
            }
        } else {
            return nil
        }
    })
    debugPrint("allFormatsToBeSupported - \(filterBasedOnMaxFrameRate)" as AnyObject)

    guard !filterBasedOnMaxFrameRate.isEmpty else {
        debugPrint("Taking default active format as nothing found when filtered using desired FPS" as AnyObject)
        return activeFormat
    }

    var formatToBeUsed: AVCaptureDevice.Format!
    if let four_two_zero_v = filterBasedOnMaxFrameRate.first(where: { CMFormatDescriptionGetMediaSubType($0.formatDescription) == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange }) {
        // 'vide'/'420v'
        formatToBeUsed = four_two_zero_v
    } else {
        // Take the first one from above array.
        formatToBeUsed = filterBasedOnMaxFrameRate.first
    }

    do {
        try self.videoDeviceInput.device.lockForConfiguration()
        self.videoDeviceInput.device.activeFormat = formatToBeUsed
        self.videoDeviceInput.device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        self.videoDeviceInput.device.activeVideoMaxFrameDuration = CMTimeMake(value: 1, timescale: Int32(fpsToBeSupported))
        if videoDeviceInput.device.isFocusModeSupported(.continuousAutoFocus) {
            self.videoDeviceInput.device.focusMode = AVCaptureDevice.FocusMode.continuousAutoFocus
        }
        self.videoDeviceInput.device.unlockForConfiguration()
    } catch let error {
        debugPrint("\(error)" as AnyObject)
    }

    return formatToBeUsed
}
We are using AVAssetDownloadURLSession to download content with multiple audio tracks, but only one audio language is being downloaded, despite explicitly requesting multiple audio languages. All subtitle variants, however, are downloaded successfully.
Issue Details:
Observed Behaviour: When initiating a download using AVAssetDownloadURLSession, only one audio track (Hindi, in this case) is downloaded, even though the content contains multiple audio tracks.
Expected Behaviour: All requested audio tracks should be downloaded, similar to how subtitle variants are successfully downloaded.
Please find sample app implementation details: https://drive.google.com/file/d/1DLcBGNnuWFYsY0cipzxpIHqZYUDJujmN/view?usp=sharing
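For reference, the multi-selection pattern we expected to work looks like this (a sketch; downloadSession and manifestURL stand in for our configured AVAssetDownloadURLSession and the asset URL):

import AVFoundation

// Sketch: explicitly request every audible option via an aggregate task.
let asset = AVURLAsset(url: manifestURL)
var selections: [AVMediaSelection] = []
if let group = asset.mediaSelectionGroup(forMediaCharacteristic: .audible) {
    for option in group.options {
        let selection = asset.preferredMediaSelection.mutableCopy() as! AVMutableMediaSelection
        selection.select(option, in: group)
        selections.append(selection)
    }
}
let task = downloadSession.aggregateAssetDownloadTask(with: asset,
                                                      mediaSelections: selections,
                                                      assetTitle: "Sample",
                                                      assetArtworkData: nil,
                                                      options: nil)
task?.resume()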
The manifest file for the asset looks like this:
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A1",NAME="Hindi",LANGUAGE="hi",URI="indexHindi/Hindi.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A2",NAME="Bengali",LANGUAGE="bn",URI="indexBengali/Bengali.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A3",NAME="Kannada",LANGUAGE="kn",URI="indexKannada/Kannada.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A4",NAME="Malayalam",LANGUAGE="ml",URI="indexMalayalam/Malayalam.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A5",NAME="Tamil",LANGUAGE="ta",URI="indexTamil/Tamil.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="A6",NAME="Telugu",LANGUAGE="te",URI="indexTelugu/Telugu.m3u8",AUTOSELECT=YES,DEFAULT=YES
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=832196,AVERAGE-BANDWIDTH=432950,RESOLUTION=640x360,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k360p/360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1317051,AVERAGE-BANDWIDTH=607343,RESOLUTION=854x480,FRAME-RATE=25.0,CODECS="hvc1.2.4.L93.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k480p/480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A1",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A2",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A3",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A4",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A5",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1715498,AVERAGE-BANDWIDTH=717018,RESOLUTION=1024x576,FRAME-RATE=25.0,CODECS="hvc1.2.4.L123.b0,mp4a.40.2",AUDIO="A6",SUBTITLES="subs"
index-4k576p/576p.m3u8
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",DEFAULT=YES,AUTOSELECT=YES,FORCED=NO,LANGUAGE="en",URI="subtitle_en/sub_en_vtt.m3u8"
I have been using SDAVAssetExportSession to compress videos in an app I am building. Everything went smoothly until I got my new iPhone 16: on that device, the spatial audio camera setting is turned on by default, and SDAVAssetExportSession starts to fail. I know it has something to do with the audio settings; the current setting is something like this:
exportSession.audioSettings = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100,
    AVEncoderBitRateKey: 128000
]
These settings are also passed to the underlying AVAssetReader and AVAssetWriter objects. I am not experienced in this area, and I have had a really hard time trying to figure it out.
Does anyone know how to set up AVAssetReader or AVAssetWriter to process video with spatial audio tracks? Thanks in advance.
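For what it's worth, this is the reader-side shape I have been experimenting with, to force the (possibly multi-channel) spatial audio down to stereo PCM before it ever reaches the AAC writer input (a sketch; I don't know yet whether this is the right fix):

import AVFoundation

// Sketch: read all audio tracks through an audio-mix output that decodes
// to 2-channel PCM, so the AAC-encoding writer input only sees stereo.
func makeAudioOutput(for asset: AVAsset) -> AVAssetReaderAudioMixOutput {
    let audioTracks = asset.tracks(withMediaType: .audio)
    let pcmSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVNumberOfChannelsKey: 2,    // downmix spatial/multichannel to stereo
        AVSampleRateKey: 44100
    ]
    return AVAssetReaderAudioMixOutput(audioTracks: audioTracks,
                                       audioSettings: pcmSettings)
}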
Hi,
I am recording a video at 240 FPS within my application and saving it to the Photos app. The recorded video retains 240 FPS in the Photos app. However, after trimming the video using the Photos app and importing it back into my app, the FPS is reduced to 30 FPS.
Steps to Reproduce:
Record a video inside the application at 240 FPS.
Save the recorded video to the Photos app.
Verify that the video retains 240 FPS in the Photos app.
Trim the video using the built-in Photos app editor.
Import the trimmed video back into the application.
The FPS of the imported video is now reduced to 30 FPS.
Code Used for Importing Video:
I am using the following code to fetch the video from the Photos app:
let options: PHVideoRequestOptions = PHVideoRequestOptions()
options.version = .current // Using `.original` preserves FPS, but I need `.current` for other changes
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestAVAsset(forVideo: self, options: options) { (avAsset, audioMix, info) in
    if let urlAsset = avAsset as? AVURLAsset {
        completionHandler(urlAsset.url, self)
    } else {
        self.askForOriginal(completionHandler: completionHandler)
    }
}
Observations:
The original video retains 240 FPS until it is trimmed in the Photos app.
After trimming, the FPS automatically drops to 30 FPS when imported back into the app.
If I use options.version = .original, the FPS is preserved, but I need .current to apply other modifications.
Questions:
Is this an expected behavior of PHImageManager when requesting a video with options.version = .current?
Is there a way to preserve the original FPS while still using .current?
Are there any workarounds to extract the trimmed video without FPS reduction?
Any insights or solutions would be greatly appreciated. Thanks in advance!
The operation couldn't be completed. (CoreMediaErrorDomain error -42709.)
This error started showing after the update and has been happening for the last 10 days. Please correct it as soon as possible.
Our iOS/AppleTV video content playback app uses AVPlayer to play HLS video streams and supports both custom and system playback UIs. The Fairplay content key is retrieved using AVContentKeySession. AirPlay is supported too.
When the iPhone is connected to a TV through the lightning Apple Digital AV Adapter (A1438), the app is mirrored as expected.
Problem: when using an iPhone or iPad on iOS 18.1.1, FairPlay-protected HLS streams do not play, and a CoreMediaErrorDomain -12035 error is received by the AVPlayerItem. Also, once the issue has occurred, the mirroring freezes (the TV indefinitely displays the app's playback screen), although the app continues to work fine on the iOS device.
The content key retrieval works as expected (I can see that 2 content key requests are made by the system by the way, probably one for the local playback and one for the adapter, as when AirPlaying) and the error is thrown after providing the AVContentKeyResponse.
Unfortunately, and as far as I know, there is no documentation on CoreMediaErrorDomain errors, so I don't know what -12035 means.
The issue does not occur:
on an iPhone on iOS 17.7 (even with FairPlay-protected HLS streams)
when playing DRM-free video content (whatever the iOS version)
when using the USB-C AV Adapter (whatever the iOS version)
Also worth noting: the issue does not occur with other video playback apps such as Apple TV or Netflix although I don't have any details on the kind of streams these apps play and the way the FairPlay content key is retrieved (if any) so I don't know if it is relevant.
Context
We develop an iOS/Apple TV app that plays HLS+FP live streams (custom playback UI), some of which use the same FairPlay content key id. All FairPlay content keys are requested from the same content key server.
Implementation
Despite Apple's documentation warning not to reuse AVContentKeySessions, we use only one AVContentKeySession for all channels, which allows the system to reuse a content key when a content key id is met again. As seen in another thread, people seem to think this is OK. The per-tune recipient swap is sketched below.
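(A simplified sketch of the tune-in path; previousAsset and newAsset are placeholders:)

import AVFoundation

// Sketch: one shared AVContentKeySession, recipients swapped per stream.
func tune(to newAsset: AVURLAsset,
          replacing previousAsset: AVURLAsset?,
          on session: AVContentKeySession) {
    if let previousAsset {
        session.removeContentKeyRecipient(previousAsset)  // old recipient first
    }
    session.addContentKeyRecipient(newAsset)
    // ... then build the AVPlayerItem / AVPlayer for newAsset ...
}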
Issue
When reusing the AVContentKeySession and the user tunes channels quickly several times (up to 2 or 3 times per second using gestures), an inconsistency may occur where a content key request for a previous stream is delivered to the delegate after a new stream is already being prepared and its AVURLAsset has already been added as the content key session's AVContentKeyRecipient. Note that the previous content key recipient is removed before the new one is added.
We have also received crash reports (though I haven't experienced a crash myself) when performing multiple channel tunings, which makes us think that the AVContentKeySession should definitely not be reused.
Note: On the other hand, if a new AVContentKeySession is used for each stream, the system systematically requests a content key even if previous streams have used the same content key id. In this case, neither the crash nor the inconsistency issue is observed, but it dramatically increases the number of calls to the content key server.
Questions
Should AVContentKeySessions definitely not be reused? If they can be reused, how should the inconsistency issue described above be handled?
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734).
On iOS 17 this gave the warning that the completion handler was not run on the main thread.
I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231
but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()

        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
I'm wondering if there is an alternative method to set the now playing artwork?