What options do I have if I don't want to use Blackmagic's Camera ProDock as the external Sync Hardware, but instead I want to create my own USB-C hardware accessory which would show up as an AVExternalSyncDevice on the iPhone 17 Pro?
Which protocol does my USB-C device have to implement to show up as an eligible clock device in AVExternalSyncDevice.DiscoverySession?
I tested the accuracy of the depth map on the iPhone 12, 13, 14, 15, and 16, and found that the depth-map variance on every model after the iPhone 12 is significantly greater than on the iPhone 12 itself.
Enabling depth filtering will cause the depth data to be affected by the previous frame, adding more unnecessary noise, especially when the phone is moving.
This is a problem for high-precision reconstruction. I tried adding depth-map smoothing in post-processing to compensate for the large depth deviations, but the results are still poor.
Are there any depth-map smoothing solutions already announced by Apple?
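For context, the filtering toggle in question is the isFilteringEnabled flag on AVCaptureDepthDataOutput. A minimal sketch of delivering unfiltered, per-frame depth (session wiring reduced to the essentials; the delegate class name is illustrative):

```swift
import AVFoundation

final class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each AVDepthData is now a single-frame measurement,
        // independent of the previous frame.
    }
}

let session = AVCaptureSession()
let receiver = DepthReceiver()
let depthOutput = AVCaptureDepthDataOutput()
depthOutput.isFilteringEnabled = false // no frame-to-frame temporal smoothing

if session.canAddOutput(depthOutput) {
    session.addOutput(depthOutput)
}
depthOutput.setDelegate(receiver, callbackQueue: .main)
```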
In iOS 26 (Developer Beta), the AVCaptureMetadataOutputObjectsDelegate no longer receives callbacks when metadataOutput.metadataObjectTypes = [.face] is set. On earlier iOS versions the issue does not occur. Interestingly, face detection works if I set the sessionPreset to .medium, but not with .high — except on the iPhone 16 Pro Max, where it works regardless.
I'm creating an app that uses AVCaptureSession to feed camera input into an AVCaptureMetadataOutput configured with [metaout setMetadataObjectTypes:@[AVMetadataObjectTypeFace]] to scan for faces.
After updating to iPadOS 26 Beta 2 and iOS 26 Beta 2, an issue has occurred where the delegate method of AVCaptureMetadataOutputObjectsDelegate is not called on some devices.
The following devices are experiencing this issue:
iPad (9th Gen)
iPad Air (4th Gen)
iPhone 15
The issue does not occur on any of the other devices I have.
I tried running the AVFoundation sample code from the Apple Developer site on the devices above, and the same problem occurs. [https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces]
Are any additional settings required on the iPadOS 26 and iOS 26 betas, or is there some problem on the OS side?
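For anyone trying to reproduce this, here is a minimal Swift equivalent of the configuration described above (a sketch; permissions and error handling omitted):

```swift
import AVFoundation

final class FaceScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func configure() throws {
        session.sessionPreset = .high // the failing preset; .medium reportedly still works

        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureMetadataOutput()
        if session.canAddOutput(output) { session.addOutput(output) }
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.face] // set after adding the output to the session

        session.startRunning()
    }

    // On the affected devices this callback reportedly never fires under iOS 26 beta.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        print("Faces detected:", metadataObjects.count)
    }
}
```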
I tried to modify the AVCam sample code by copying in the smart framing monitor code from https://developer.apple.com/documentation/avfoundation/adopting-smart-framing-in-your-camera-app#Configure-the-smart-framing-monitor
I can confirm the activeFormat supports smart framing, but the supported frames in the monitor are always nil.
In another project the monitor does have supported values, but the observation is never triggered; I then tried repeatedly printing the recommended frame, and it is always nil.
Could the engineers embed the code into AVCam rather than posting a few code fragments?
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image is output in landscape orientation instead of portrait.
Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image still comes out in landscape.
On iPhone 16 and earlier, the image is output in portrait orientation.
But on iPhone 17 and later, the image is output in landscape orientation.
Please help.
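One thing worth checking (a sketch, not a confirmed fix for iPhone 17): since iOS 17, buffer orientation is controlled per connection via videoRotationAngle, and the sensor's native orientation can differ between device generations. Here `videoDataOutput` stands in for the output feeding Vision:

```swift
import AVFoundation

// Explicitly request portrait-oriented buffers instead of relying on the
// device's default sensor orientation.
if let connection = videoDataOutput.connection(with: .video),
   connection.isVideoRotationAngleSupported(90) {
    connection.videoRotationAngle = 90
}
```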
Hello everyone,
I'm working on a feature where I need to capture the highest possible quality photo (e.g., 24MP on supported devices) and upload it to our server. I don't need the photos to appear in the user's Photos app, so I thought I could store them in the app's private directory using FileManager until they are uploaded. This wouldn't require requesting Photo Library permission, maximizing user privacy.
The documentation on AVCapturePhotoOutput states that "the 24MP setting (5712, 4284) is only serviced as 24MP when opted-in to autoDeferredPhotoDeliveryEnabled"
```swift
/**
 @property maxPhotoDimensions
 @abstract
    Indicates the maximum resolution of the requested photo.

 @discussion
    Set this property to enable requesting of images up to as large as the specified dimensions. Images returned by AVCapturePhotoOutput may be smaller than these dimensions but will never be larger. Once set, images can be requested with any valid maximum photo dimensions by setting AVCapturePhotoSettings.maxPhotoDimensions on a per photo basis. The dimensions set must match one of the dimensions returned by AVCaptureDeviceFormat.supportedMaxPhotoDimensions for the current active format. Changing this property may trigger a lengthy reconfiguration of the capture render pipeline so it is recommended that this is set before calling -[AVCaptureSession startRunning].

    Note: When supported, the 24MP setting (5712, 4284) is only serviced as 24MP when opted-in to autoDeferredPhotoDeliveryEnabled.
 */
@available(iOS 16.0, *)
open var maxPhotoDimensions: CMVideoDimensions
```
(btw. this note is not present in the docs https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/maxphotodimensions)
Enabling autoDeferredPhotoDeliveryEnabled means that for a 24MP capture, the system will call the photoOutput(_:didFinishCapturingDeferredPhotoProxy:error:) delegate method, providing a proxy object instead of the final image data.
According to the WWDC23 session "Create a more responsive camera experience," this AVCaptureDeferredPhotoProxy must be saved to the PHPhotoLibrary using a PHAssetCreationRequest with the resource type .photoProxy. The system then handles the final processing in the background within the library.
To use deferred photo processing, you'll need to have write permission to the photo library to store the proxy photo, and read permission if your app needs to show the final photo or wants to modify it in any way.
https://developer.apple.com/videos/play/wwdc2023/10105/?time=799
This seems to create a hard dependency on the Photo Library for accessing 24MP images.
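For context, the deferred path looks roughly like this (a sketch based on the WWDC23 session; as far as I can tell, the proxy itself only yields proxy-quality data directly, which is exactly the limitation in question):

```swift
import AVFoundation
import Photos

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?,
                 error: Error?) {
    guard error == nil, let proxy = deferredPhotoProxy,
          let proxyData = proxy.fileDataRepresentation() else { return }

    // The only documented way to obtain the final 24MP render: hand the proxy
    // to the photo library, which finishes processing in the background.
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photoProxy, data: proxyData, options: nil)
    }, completionHandler: nil)
}
```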
My question is:
Is there any way to receive the final, processed 24MP image data directly in the app after a deferred capture, without using PHPhotoLibrary as the processing intermediary?
For example, is there a delegate callback or a mechanism I'm missing that provides the final data for a deferred photo, allowing an app to handle it in-memory or in its own private sandbox, completely bypassing the user's Photo Library?
Our goal is to follow Apple's privacy-first principles by avoiding requesting a PHPhotoLibrary authorization when our app's core function doesn't require access to the user's photo collection.
Thank you for your time and any clarification you can provide.
I want to use both the front ultra-wide and TrueDepth cameras on an iPad that has a front ultra-wide camera.
First, I used only the front builtInDualCamera via AVFoundation and tried every format it offers, but no format could capture ultra-wide.
Second, I tried using both the front builtInDualCamera and builtInUltraWideCamera together, but there was no supported combination of the two.
Is there any way?
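In case it helps others investigating, a discovery sketch that lists which front cameras the hardware exposes and which pairings are allowed to run simultaneously (whether a front ultra-wide plus TrueDepth set appears is exactly the open question):

```swift
import AVFoundation

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInUltraWideCamera, .builtInDualCamera,
                  .builtInTrueDepthCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .front)

for device in discovery.devices {
    print(device.localizedName, "-", device.deviceType.rawValue)
}

// The combinations usable in a single AVCaptureMultiCamSession:
for deviceSet in discovery.supportedMultiCamDeviceSets {
    print(deviceSet.map(\.localizedName))
}
```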
I'm experiencing an issue with my app when saving images to the camera roll. This is intermittent, but it happens several times a day. The error I receive is the following:
Connection to assetsd was interrupted - assetsd exited, died, or closed the photo library
Error getting remote object proxy for -[PLNonBindingAssetsdPhotoKitClient sendChangesRequest:reply:]_block_invoke: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.photos.service" UserInfo={NSDebugDescription=connection to service named com.apple.photos.service}
PhotoKit XPC proxy is invalid. Dropping request on the floor and returning an error: Error Domain=PHPhotosErrorDomain Code=3301 "(null)" (underlying error Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.photos.service" UserInfo={NSDebugDescription=connection to service named com.apple.photos.service})
CoreData: error: XPC: synchronousRemoteObjectProxyWithErrorHandler: store 'file:///var/mobile/Media/PhotoData/Photos.sqlite' encountered error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003.}
CoreData: error: XPC: synchronousRemoteObjectProxyWithErrorHandler: store 'file:///var/mobile/Media/PhotoData/Photos.sqlite' encountered error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003.}
My code is unchanged from when I used the app daily on an iPhone 16 Pro with iOS 26, and I never saw the issue on that device.
Here is an excerpt from my code for saving the image:
```swift
var localIdentifier = String()
PHPhotoLibrary.shared().performChanges({
    let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
    let assetCreationRequest = PHAssetCreationRequest.forAsset()
    let options = PHAssetResourceCreationOptions()
    assetCreationRequest.addResource(with: .photo, data: imageData, options: options)
    assetCreationRequest.creationDate = Date.now
    // Avoid force-unwrapping: the placeholder can be nil if the request fails.
    if let placeholder = assetCreationRequest.placeholderForCreatedAsset {
        albumChangeRequest?.addAssets([placeholder] as NSArray)
        localIdentifier = placeholder.localIdentifier
    }
}) { didSucceed, error in
    OperationQueue.main.addOperation {
        didSucceed ? success(localIdentifier) : failure(error)
    }
}
```
I'm not sure why this would be device specific but I have had users with iPhone 17 Pro and iPhone Air reporting the issue.
Alex
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios, and the failure also seems to be related to iCloud.
Here is the setup:
The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding.
The image is loaded from the item provider like this:
```swift
if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage as String) { url, error in
        // work
    }
}
```
This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:
[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}
Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}
We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail.
We think it is related to iCloud Photos: On the device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process is doing some processing (?) of the image that will later enable the picker to load it successfully, but that's just guessing.
Additional observations:
This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the Picker is able to load the image.
When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.
The picker also can't load RAW images that were AirDropped from another device, unless they synced to iCloud first.
This is reproducible using the Selecting Photos and Videos in iOS sample code project.
We observed this happening in other apps that use the PHPicker, not just ours.
Is this a bug, or is there something that we are missing?
Hi all,
I'm working on a custom Metal-based video pipeline using AVCaptureVideoDataOutput, and I've run into an unexpected issue related to exposure.
Setup:
I'm capturing video frames using kCVPixelFormatType_420YpCbCr8BiPlanarFullRange.
In my Metal shader, I:
Convert YCbCr (full range Rec.709) to linear Rec.709 RGB.
Apply Rec.709 → sRGB gamma encoding.
Output to .bgra8Unorm_srgb via MTKView.
Everything renders correctly in terms of colorspace math, but the image appears significantly brighter (~+3 stops EV) compared to AVCaptureVideoPreviewLayer and the native iOS Camera app under the same camera exposure settings.
What I’ve verified:
The color transforms are correct: YCbCr709 to RGB, then linear to sRGB.
I'm not applying any tone mapping or aggressive look LUTs yet.
Camera exposure is locked using:
device.setExposureModeCustom(duration: ..., iso: ...)
The same EV (e.g., ISO 50, 1/125s, f/5.6) on my iPhone appears visually 3 stops brighter than on my digital cameras (Sony/Canon etc).
To match the look of the preview layer or camera app, I have to simulate a ~–3 EV shift in my custom pipeline.
Questions:
Is AVCaptureVideoPreviewLayer applying extra tone mapping, digital gain, or contrast shaping (like OOTF etc)?
Does the camera ISP expose "hotter" (i.e., with more light) internally for the preview layer than what we get in video frame buffers?
Is there a standard way to compensate for this ISP behavior in custom pipelines using AVCaptureVideoDataOutput?
Can this be accounted for using metadata (e.g., exposure bias, gain, gamma curve)?
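On the last question, each sample buffer does carry per-frame metadata as an EXIF dictionary attachment; a sketch of reading it inside the AVCaptureVideoDataOutput callback (the specific keys present should be verified on device):

```swift
import AVFoundation
import ImageIO

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    if let exif = CMGetAttachment(sampleBuffer,
                                  key: kCGImagePropertyExifDictionary,
                                  attachmentModeOut: nil) as? [CFString: Any] {
        // BrightnessValue and ExposureTime can be compared frame-by-frame
        // against what the preview layer renders.
        print("BrightnessValue:", exif[kCGImagePropertyExifBrightnessValue] ?? "n/a",
              "ExposureTime:", exif[kCGImagePropertyExifExposureTime] ?? "n/a")
    }
}
```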
Error capturing ProRAW using the iPhone 17 Pro Telephoto with photoQualityPrioritization set to .quality
Hey,
I'm having a very strange issue on my iPhone 17 Pro. I'm trying to capture a 12MP ProRAW image using the Telephoto lens with photoQualityPrioritization set to .quality. Unfortunately, I receive this error when trying to capture the image:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x134f7a1f0 {Error Domain=NSOSStatusErrorDomain Code=-16802 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-16802), AVErrorRecordingFailureDomainKey=4, NSLocalizedDescription=The operation could not be completed}
The photo captures correctly at 7.9x zoom; the problem only appears once the zoom goes over 8x.
Also, only this particular combination of settings causes the issue. I'm able to capture an image if I do any of the following (see the sketch after this list):
Set quality to ".balanced"
Set max dimensions to 48MP
Capture a JPEG image instead of a ProRAW image
Use the TripleCamera fusion lens
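For reference, this is roughly the failing configuration (a sketch; `photoOutput` and `captureDelegate` stand in for my actual capture setup, with Apple ProRAW enabled on the output):

```swift
import AVFoundation

if let proRAWType = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) }) {
    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWType)
    settings.photoQualityPrioritization = .quality // fails; .balanced works
    settings.maxPhotoDimensions = CMVideoDimensions(width: 4032, height: 3024) // 12MP fails; 48MP works
    photoOutput.capturePhoto(with: settings, delegate: captureDelegate)
}
```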
Any help would be greatly appreciated.
Alex
I am able to capture 48MP photos using .builtInWideAngleCamera, but it seems like .builtInTripleCamera is capped at 12MP?
Is there a way to capture 48MP photos using .builtInTripleCamera? It provides a smooth transition between cameras during zooming, and I'd like to keep this behavior.
The new iPhone 17 Pro has all of its cameras at 48MP. Is there a chance that its .builtInTripleCamera can capture 48MP, or is this an API limitation?
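This can be checked empirically on a device; a sketch that dumps which triple-camera formats advertise a 48MP still dimension:

```swift
import AVFoundation

if let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back) {
    for format in device.formats {
        for dims in format.supportedMaxPhotoDimensions
        where Int(dims.width) * Int(dims.height) >= 48_000_000 {
            print(format.formatDescription, "supports \(dims.width)x\(dims.height)")
        }
    }
}
```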
When attempting to access a PHAsset that is in the Hidden album on iOS 26, the PHFetchResult always returns no items, even when the user has granted full access to Photos and even when includeHiddenAssets is true.
This is the code suggested by ChatGPT; it always fails:
```swift
public func fetchAsset(withLocalIdentifier identifier: String) -> PSSPHAssetImplementing? {
    // First try the direct fetch by identifier (fast path)
    let directResult = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
    if let asset = directResult.firstObject {
        return build(from: asset)
    }

    // Fallback: fetch all assets including hidden, then filter manually
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let allAssets = PHAsset.fetchAssets(with: options)
    for index in 0..<allAssets.count {
        let asset = allAssets.object(at: index)
        if asset.localIdentifier == identifier {
            return build(from: asset)
        }
    }
    return nil
}
```
Is it no longer possible to retrieve a hidden photo in iOS 26?
I'm working on a photo app and I want to allow the user to display, edit and delete photos. I can fetch all photos using PHAsset.fetchAssets(with: options). This works as intended.
However, I can't seem to find a way to prevent the user from seeing photos from a Shared Library. PHAssetSourceType only offers typeCloudShared, which appears to cover items from shared albums, not the shared library.
How can I filter by iCloud Shared Library?
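One setting worth trying (a sketch; whether this excludes Shared Library items, as opposed to shared-album items, is exactly what's unclear to me):

```swift
import Photos

// Restrict fetches to assets that originate in the user's own library.
let options = PHFetchOptions()
options.includeAssetSourceTypes = [.typeUserLibrary]
let personalAssets = PHAsset.fetchAssets(with: options)
print("Fetched \(personalAssets.count) non-shared assets")
```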
Area
ImageCaptureCore / ICDeviceBrowser
Description
On iOS 26.1 beta, calling requestControlAuthorization() or requestContentsAuthorization() always returns .notDetermined and never transitions to .authorized or .denied.
This prevents apps from properly accessing device control or contents authorization. The issue occurs regardless of device state or prior requests.
Steps to Reproduce
1. Create and start an ICDeviceBrowser instance.
2. Call requestControlAuthorization() or requestContentsAuthorization().
3. Inspect the returned ICAuthorizationStatus.
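A minimal repro sketch of those steps (delegate wiring omitted):

```swift
import ImageCaptureCore

let browser = ICDeviceBrowser()
browser.start()

browser.requestControlAuthorization { status in
    // Expected: .authorized or .denied after a prompt.
    // Observed on iOS 26.1 beta: always .notDetermined, no prompt.
    print("Control authorization:", status.rawValue)
}
browser.requestContentsAuthorization { status in
    print("Contents authorization:", status.rawValue)
}
```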
Expected Result
• The system should prompt the user if necessary.
• A final status of either .authorized or .denied should be returned.
Actual Result
• The completion handler always reports .notDetermined.
• No user prompt appears and the status does not change.
Version / Build
• iOS 26.1 beta
• Xcode
Hardware
• iPhone 15 Pro, iPad Pro (M2)
Impact
This regression blocks development and testing of features relying on ImageCaptureCore. Applications depending on device browsing and content access cannot proceed, which significantly affects workflows involving external device integration.
Notes
This appears to be a regression compared to earlier iOS releases.
I want to fully support the new iPhone models in my app, and ideally I need to know the available lenses. However, I can't find this information on the web, and it's not reported in the simulators. The closest thing I found is this page, but it's very out of date: https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html
My only other option is to buy each device, which isn't really feasible, or to log the data from real users via an analytics tool which isn't ideal either.
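For what it's worth, the runtime-logging approach only needs a few lines per launch; a sketch that enumerates every camera on the current device:

```swift
import AVFoundation

let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInUltraWideCamera,
                  .builtInTelephotoCamera, .builtInDualCamera,
                  .builtInDualWideCamera, .builtInTripleCamera,
                  .builtInTrueDepthCamera],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    print(device.deviceType.rawValue, device.position == .front ? "(front)" : "(back)")
}
```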
Thanks,
Alex
I discovered that when editing photos with the PhotoKit API, PHContentEditingOutput's renderedContentURL is a file in the app container's tmp directory with a filename that seems to follow the format render.<uuid>.JPG, and that file does not get deleted if the edit does not complete successfully (the user cancels the edit request, an error occurs, the app crashes, etc.). I understand the system is supposed to delete tmp files automatically every once in a while, but some users are noticing that my app's Documents & Data size inflates, so I'm considering deleting these render files each time the app is launched. But I don't want to delete everything in the tmp directory, as there could be other data in there.
What's the best way to remove those temporary files? Does the filename always start with render. no matter the device language? I thought I'd delete files in NSTemporaryDirectory() with that prefix, but then I discovered that in Mac Catalyst the files are not in the tmp directory directly; they're in tmp/TemporaryItems/<bundleid>.
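The cleanup I have in mind looks like this (a sketch; the render. prefix is observed behavior rather than documented, so this is best-effort):

```swift
import Foundation

// Best-effort removal of leftover PhotoKit render files at launch.
// On Mac Catalyst the files may live under tmp/TemporaryItems/<bundleid>,
// so the starting directory would need adjusting there.
let tmp = FileManager.default.temporaryDirectory
if let files = try? FileManager.default.contentsOfDirectory(
        at: tmp, includingPropertiesForKeys: nil) {
    for url in files where url.lastPathComponent.hasPrefix("render.") {
        try? FileManager.default.removeItem(at: url)
    }
}
```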
Thanks!
Hi everyone,
I’m running into an issue with PHPickerFilter when using PHPickerViewController.
When I configure the picker with a .videos and .livePhotos filter, it seems to work correctly in the Photos tab. However, when I switch to the Collections tab, the filter doesn’t always apply — users can still see and select static image assets in certain collections (e.g. from one of the People & Pets sections).
Here’s a simplified snippet of my setup:
```swift
var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.selectionLimit = 1

var filters = [PHPickerFilter]()
filters.append(.videos)
filters.append(.livePhotos)
configuration.filter = PHPickerFilter.any(of: filters)
configuration.preferredAssetRepresentationMode = .current

let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
present(picker, animated: true)
```
Expected behavior:
The picker should consistently respect the filter across both Photos and Collections tabs, only showing assets that match the filter.
Actual behavior:
The filter seems to apply correctly in the Photos tab, but in the Collections tab, other asset types are still visible/selectable.
Has anyone else encountered this behavior? Is this expected or a known issue, or am I missing something in the configuration?
Thanks in advance!
Just downloaded iOS 26.1 and my phone keeps ringing after the call has been answered. Any fixes for this?