Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

Post · Replies · Boosts · Views · Activity

How to get a callback once a requested frameDuration change has been applied?
When changing a camera's exposure, AVFoundation provides a callback that reports the timestamp of the first frame captured with the new exposure duration: AVCaptureDevice.setExposureModeCustom(duration:iso:completionHandler:). I want a similar callback when changing the frame duration. After setting AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration to a new value, how can I compute the index or the timestamp of the first camera frame captured using the newly set frame duration?
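Absent a dedicated callback, one workaround is to watch the sample buffers themselves. Below is a minimal sketch, assuming an AVCaptureVideoDataOutput is attached to the session: the observer records the requested duration and reports the first buffer whose duration matches it (synchronization between the configuration and delegate queues is omitted for brevity; type and method names are illustrative).

```swift
import AVFoundation

final class FrameDurationObserver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var targetDuration: CMTime = .invalid

    // Apply a new fixed frame duration and remember it so the delegate can
    // report the first buffer captured at the new rate.
    func setFrameDuration(_ duration: CMTime, on device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration
        device.unlockForConfiguration()
        targetDuration = duration
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard targetDuration.isValid else { return }
        // The buffer's duration reflects the frame timing it was captured with;
        // once it equals the requested duration, the change has taken effect.
        let duration = CMSampleBufferGetDuration(sampleBuffer)
        if duration == targetDuration {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            print("First frame at new duration, PTS = \(CMTimeGetSeconds(pts)) s")
            targetDuration = .invalid
        }
    }
}
```

Comparing the deltas between successive presentation timestamps would work as well if the duration field is not populated for a given output.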
0
0
503
Aug ’25
iPhone 17 smart framing api not working
I tried to modify the AVCam sample code by copying the code from https://developer.apple.com/documentation/avfoundation/adopting-smart-framing-in-your-camera-app#Configure-the-smart-framing-monitor (Configure the smart framing monitor). I can confirm the activeFormat supports smart framing, but the monitor's supported frames are always nil. In another project of mine the value is populated, but the observation never fires; when I keep printing the recommended frame, it is always nil as well. Could the engineers embed this code into the AVCam sample rather than posting a few code snippets?
0
0
139
Sep ’25
Orientation does not work on iPhone 17 and above.
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image is output in landscape orientation instead of portrait. Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image is still output in landscape. On iPhone 16 and earlier, the image is output in portrait orientation, but on iPhone 17 and later it is output in landscape orientation. Please help.
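A possible angle to check, offered as a sketch rather than a confirmed fix for the iPhone 17 behavior described above: on iOS 17 and later the delivered buffer orientation is governed by AVCaptureConnection.videoRotationAngle, and Vision can also be told the buffer orientation explicitly. The 90° value assumes a portrait back-camera setup.

```swift
import AVFoundation
import Vision

// Ask the capture output to deliver portrait-oriented buffers (iOS 17+).
func configurePortraitDelivery(for output: AVCaptureOutput) {
    guard let connection = output.connection(with: .video) else { return }
    if connection.isVideoRotationAngleSupported(90) {   // 90° = portrait for the back cameras
        connection.videoRotationAngle = 90
    }
}

// Alternatively, leave the buffers alone and tell Vision how they are oriented.
func makeRequestHandler(for pixelBuffer: CVPixelBuffer) -> VNImageRequestHandler {
    VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right, options: [:])
}
```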
0
0
90
Sep ’25
How to use Front UW and TrueDepth in iPad
I want to use both the front ultra-wide (UW) and TrueDepth cameras on an iPad that has a front UW camera. First, I used only the front builtInDualCamera with AVFoundation and tried every format available for builtInDualCamera, but there was no format that could capture UW. Second, I tried using both the front builtInDualCamera and builtInUltraWideCamera, but there was no combination that allowed builtInUltraWideCamera and builtInDualCamera to be used together. Is there any way to do this?
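One way to answer the "is there any combination" part empirically is to ask AVFoundation which front-camera sets can run together in an AVCaptureMultiCamSession. A minimal sketch; whether any iPad set pairs the ultra-wide and TrueDepth cameras is exactly what this would reveal.

```swift
import AVFoundation

// Print every combination of front cameras that can run simultaneously in an AVCaptureMultiCamSession.
func printSupportedFrontMultiCamSets() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera, .builtInUltraWideCamera, .builtInTrueDepthCamera],
        mediaType: .video,
        position: .front)

    for (index, deviceSet) in discovery.supportedMultiCamDeviceSets.enumerated() {
        let names = deviceSet.map(\.localizedName).sorted()
        print("Supported set \(index): \(names.joined(separator: " + "))")
    }
}
```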
0
0
142
Sep ’25
Error saving image to Camera Roll on iPhone 17 Pro
I'm experiencing an issue with my app when saving images to the camera roll. This is intermittent, but it happens several times a day. The error I receive is the following:

Connection to assetsd was interrupted - assetsd exited, died, or closed the photo library
Error getting remote object proxy for -[PLNonBindingAssetsdPhotoKitClient sendChangesRequest:reply:]_block_invoke: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.photos.service" UserInfo={NSDebugDescription=connection to service named com.apple.photos.service}
PhotoKit XPC proxy is invalid. Dropping request on the floor and returning an error: Error Domain=PHPhotosErrorDomain Code=3301 "(null)" (underlying error Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.photos.service" UserInfo={NSDebugDescription=connection to service named com.apple.photos.service})
CoreData: error: XPC: synchronousRemoteObjectProxyWithErrorHandler: store 'file:///var/mobile/Media/PhotoData/Photos.sqlite' encountered error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003.}
CoreData: error: XPC: synchronousRemoteObjectProxyWithErrorHandler: store 'file:///var/mobile/Media/PhotoData/Photos.sqlite' encountered error: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated: failed to check-in, peer may have been unloaded: mach_error=10000003.}

My code is unchanged from using my app daily on an iPhone 16 Pro with iOS 26. I never saw the issue on this device. Here is an excerpt from my code for saving the image:

    var localIdentifier = String()
    PHPhotoLibrary.shared().performChanges({
        let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
        let assetCreationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        assetCreationRequest.addResource(with: .photo, data: imageData, options: options)
        assetCreationRequest.creationDate = Date.now
        let placeHolder = assetCreationRequest.placeholderForCreatedAsset
        albumChangeRequest?.addAssets([placeHolder!] as NSArray)
        if placeHolder != nil {
            localIdentifier = (placeHolder?.localIdentifier)!
        }
    }) { (didSucceed, error) in
        OperationQueue.main.addOperation({
            didSucceed ? success(localIdentifier) : failure(error)
        })
    }

I'm not sure why this would be device specific, but I have had users with iPhone 17 Pro and iPhone Air reporting the issue.

Alex
0
0
83
Sep ’25
Clean up render files saved to PHContentEditingOutput.renderedContentURL
I discovered that when editing photos with the PhotoKit API, PHContentEditingOutput's renderedContentURL is a file in the app container's tmp directory with a filename that seems to follow the format render.<uuid>.JPG, and that file does not get deleted if the edit does not complete successfully (the user cancels the edit request, an error occurs, the app crashes, etc.). I understand the system is supposed to delete tmp files automatically every once in a while, but some users are noticing that my app's Documents & Data inflates, so I'm considering deleting these render files each time the app is launched. I don't want to delete everything in the tmp directory, though, as there could be other data in there. What's the best way to remove those temporary files? Does the filename always start with render. no matter the device language? I thought I'd delete files in NSTemporaryDirectory() with that prefix, but then I discovered that in Mac Catalyst the location is not the tmp directory directly; they're in tmp/TemporaryItems/<bundleid>. Thanks!
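A minimal sketch of the cleanup idea, under the assumption (unconfirmed in the question) that the render. filename prefix is stable across locales; it only touches files with that prefix and leaves the rest of tmp alone. On Mac Catalyst the directory would need to be adjusted per the observation above.

```swift
import Foundation

// Remove leftover PhotoKit render files (render.<uuid>.JPG) from the app's tmp directory.
func cleanUpAbandonedRenderFiles() {
    let fileManager = FileManager.default
    let tmpDirectory = fileManager.temporaryDirectory
    guard let contents = try? fileManager.contentsOfDirectory(
        at: tmpDirectory, includingPropertiesForKeys: nil) else { return }

    for fileURL in contents where fileURL.lastPathComponent.hasPrefix("render.") {
        try? fileManager.removeItem(at: fileURL)
    }
}
```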
0
0
76
Oct ’25
iOS 26 AVCaptureDevice continuousAutoFocus not working
Device: iPhone 16 Pro Max
OS: iOS 26.0.1
AVCaptureDevice.DeviceType: builtInUltraWideCamera
activeFormat: <AVCaptureDeviceFormat: 0x10ffb9ac0 'vide'/'420v' 1440x1080, { 1- 60 fps}, photo dims:{1440x1080,2016x1512}, fov:101.022, gdc fov:103.625, binned, max zoom:94.50 (upscales @1.40), system zoom range:1.0-3.0, AF System:1, ISO:15.0-3600.0, SS:0.000023-1.000000, system exposure bias range:-2.0-2.0, supports multicam, supports CS RoI, supports Smart Style, supports Smudge Detection>
API: device.isFocusModeSupported(.continuousAutoFocus) == true
Setting: device.focusMode = .continuousAutoFocus
The setting succeeds, but continuous autofocus does not actually work.
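For completeness, a sketch of the configuration sequence the report implies (lock, check support, optionally set a point of interest, set the mode); the post indicates the mode is already applied successfully, so this is included only to make the setup reproducible, not as a fix.

```swift
import AVFoundation
import CoreGraphics

// Typical sequence for enabling continuous autofocus on a capture device.
func enableContinuousAutoFocus(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    guard device.isFocusModeSupported(.continuousAutoFocus) else { return }
    if device.isFocusPointOfInterestSupported {
        device.focusPointOfInterest = CGPoint(x: 0.5, y: 0.5) // center of the frame
    }
    device.focusMode = .continuousAutoFocus
}
```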
0
0
64
Oct ’25
New API to control front camera orientation like native Camera app on iPhone 17 with iOS 26?
The iPhone 17's front camera, with its new 18MP square sensor and iOS 26's Center Stage feature, can auto-rotate between portrait and landscape like the native iOS Camera app. Is there a Swift or AVFoundation API that lets developers manually control the front camera orientation the way the native Camera app does, or is this auto-rotation handled strictly by the system, without public API access?
0
0
170
Oct ’25
Uploading PhotoKit Resources in the Background
The introduction of PHBackgroundResourceUploadExtension in iOS 26.1 is a welcome addition. However, I wonder how to attach a debugger and actually get the system to call the extension's process() method. I tried running the extension both from the Photos app (and also from the main app for testing), but when I take a photo or add photos to the library (saving), the process() method never gets called. Any hints on how to debug a PHBackgroundResourceUploadExtension during development would be appreciated.
0
1
119
3w
PhotoKit Background Upload Extension not working on iOS 26.2 iPhone 17 Simulator
Hi, I'm trying to implement the new PhotoKit PHBackgroundResourceUploadExtension. I created the extension, enabled full photo library access in the host app, and registered the extension point using the string com.apple.photos.background-upload. However, when I attempted to enable the extension with try library.setUploadJobExtensionEnabled(true), I received the following error: Error Domain=PHPhotosErrorDomain Code=-1 "(null)". This happens when running the app with Xcode 26.1 and 26.2 Beta, using the iPhone 17 Pro Max simulator (iOS 26.1 and 26.2). My questions: Is this extension supported on the simulator? I'm asking because at the moment it's difficult for me to test on a physical device. Also, what does this error mean? Thanks.
0
1
199
2w
Logged error/warning in FigCaptureSourceRemote when capturing a photo
I'm using this library to capture photos: https://github.com/Yummypets/YPImagePicker. I've modified it slightly, and I'm using an older version. When testing on my iPhone 16e on iOS 26, whenever I take a photo I get the following two error messages:

<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)

These messages appear, but as far as I can tell the photo comes through OK and I can save the data without a problem. I've even removed all of my handling code to see whether it was something I was doing. I don't want to ship with these errors showing, but I have no idea what could be causing them; ChatGPT was not helpful in diagnosing this. Does anyone know what can cause this error? Is there a way I can see the source code to figure out whether there's something I'm doing wrong here? It really seems like an internal Apple error, or else I would have expected more detail relating the error to the code I've written. Any clues would be appreciated!
0
0
326
6d
AVCaptureVideoPreviewLayer layoutSublayers invoked on background thread
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here. We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread. After debugging the issue a bit, we found that all calls that modified the AVCaptureSession or the preview layer were indeed made on the main thread. It would be useful to see what results in AVCaptureVideoPreviewLayer.updateFormatDescription being called. I've attached the crash log below. Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
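For reference, a minimal sketch of the threading pattern this kind of crash report is usually checked against: session configuration and start on a dedicated serial queue, and everything that touches the preview layer (a CALayer) on the main thread. This is a general pattern, not a confirmed explanation of the internal updateFormatDescription call; the queue label is illustrative.

```swift
import AVFoundation

final class CameraController {
    let session = AVCaptureSession()
    let previewLayer = AVCaptureVideoPreviewLayer()
    private let sessionQueue = DispatchQueue(label: "camera.session.queue") // illustrative label

    func start() {
        // Attach the session to the layer on the main thread, since this can
        // trigger immediate layer updates such as layoutSublayers.
        DispatchQueue.main.async {
            self.previewLayer.session = self.session
        }
        // Configure and start the session off the main thread.
        sessionQueue.async {
            self.session.beginConfiguration()
            // ... add inputs and outputs here ...
            self.session.commitConfiguration()
            self.session.startRunning()
        }
    }
}
```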
1
1
730
Jun ’25
IPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g. using guvcview) allows enabling automatic exposure for this camera. The iOS API isExposureModeSupported unfortunately returns false for every exposure mode. Is this a bug? Or does AVFoundation not support UVC exposure control yet?
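A sketch of how the support check might be written against external (UVC) cameras, which iPadOS 17 exposes through the .external device type; it simply logs what each device claims to support, which per the post is false for every mode.

```swift
import AVFoundation

// Enumerate external (UVC) cameras and report which exposure modes each claims to support.
func logExternalCameraExposureSupport() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified)

    let modes: [(String, AVCaptureDevice.ExposureMode)] = [
        ("locked", .locked),
        ("autoExpose", .autoExpose),
        ("continuousAutoExposure", .continuousAutoExposure),
        ("custom", .custom),
    ]

    for device in discovery.devices {
        for (name, mode) in modes {
            print("\(device.localizedName) supports \(name): \(device.isExposureModeSupported(mode))")
        }
    }
}
```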
1
2
585
Jul ’25
Generating Live Photo from JPG and MOV fails
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file to a Live Photo. I am utilizing the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to download the Live Photo to the gallery and generate the Live Photo. Here is the relevant code and the errors I am encountering.

Console prints:

Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo

I have verified that the app has the necessary permissions to access the Photo Library. The JPEG and MOV files are successfully downloaded and can be displayed in the app. The issue seems to occur when generating the Live Photo from the downloaded files.

```swift
struct WallpaperDetailView: View {
    var wallpaper: Wallpaper
    @State private var isLoading = false
    @State private var isImageSaved = false
    @State private var imageURL: URL?
    @State private var livePhotoVideoURL: URL?
    @State private var player: AVPlayer?
    @State private var playerViewController: AVPlayerViewController?
    @State private var isVideoReady = false
    @State private var showBuffering = false

    var body: some View {
        ZStack {
            if let imageURL = imageURL {
                GeometryReader { geometry in
                    KFImage(imageURL)
                        .resizable()
                        ...
                }
            }
            if let playerViewController = playerViewController {
                VideoPlayerViewController(playerViewController: playerViewController)
                    .frame(maxWidth: .infinity, maxHeight: .infinity)
                    .clipped()
                    .edgesIgnoringSafeArea(.all)
            }
        }
        .onAppear {
            PHPhotoLibrary.requestAuthorization { status in
                if status == .authorized {
                    loadImage()
                } else {
                    print("User denied access to photo library")
                }
            }
        }
    }

    private func loadImage() {
        isLoading = true
        if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
            self.imageURL = imageURL
            if imageURL.scheme == "file" {
                self.isLoading = false
                print("Local image URL set: \(imageURL)")
            } else {
                fetchDownloadURL(from: imageURLString) { url in
                    self.imageURL = url
                    self.isLoading = false
                    print("Image URL fetched and set: \(String(describing: url))")
                }
            }
        }
        if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
            self.livePhotoVideoURL = livePhotoVideoURL
            preloadAndPlayVideo(from: livePhotoVideoURL)
        } else {
            self.isLoading = false
            print("No valid image or video URL")
        }
    }

    private func preloadAndPlayVideo(from url: URL) {
        self.player = AVPlayer(url: url)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = self.player
        self.playerViewController = playerViewController
        let playerItem = AVPlayerItem(url: url)
        playerItem.preferredForwardBufferDuration = 1.0
        self.player?.replaceCurrentItem(with: playerItem)
        ...
        print("Live Photo Video URL set: \(url)")
    }

    private func saveWallpaperToPhotos() {
        if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
            saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
        } else if let imageURL = imageURL {
            saveImageToPhotos(url: imageURL)
        }
    }

    private func saveImageToPhotos(url: URL) {
        ...
    }

    private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
        isLoading = true
        downloadVideo(from: videoURL) { localVideoURL in
            guard let localVideoURL = localVideoURL else {
                print("Failed to download video for Live Photo")
                DispatchQueue.main.async { self.isLoading = false }
                return
            }
            print("Video downloaded to: \(localVideoURL)")
            self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
        }
    }

    private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
        LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
            print("Progress: \(percent)")
        }, completion: { livePhoto, resources in
            guard let resources = resources else {
                print("Failed to generate Live Photo")
                DispatchQueue.main.async { self.isLoading = false }
                return
            }
            print("Live Photo generated with resources: \(resources)")
            self.saveLivePhotoToLibrary(resources: resources)
        })
    }

    private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
        LivePhoto.saveToLibrary(resources) { success in
            DispatchQueue.main.async {
                if success {
                    self.isImageSaved = true
                    print("Live Photo saved successfully")
                } else {
                    print("Failed to save Live Photo")
                }
                self.isLoading = false
            }
        }
    }

    private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
        let storageRef = Storage.storage().reference(forURL: gsURL)
        storageRef.downloadURL { url, error in
            if let error = error {
                print("Failed to fetch image URL: \(error)")
                completion(nil)
            } else {
                completion(url)
            }
        }
    }

    private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
        let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
            guard let localURL = localURL, error == nil else {
                print("Failed to download video: \(String(describing: error))")
                completion(nil)
                return
            }
            completion(localURL)
        }
        task.resume()
    }
}
```
1
1
785
Mar ’25
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}} I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically, not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
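For reproducibility, a minimal sketch of the failing call, assuming a PHContentEditingInput has already been obtained for the Live Photo asset; the frame processor is an identity pass-through, and the adjustment-data identifier is a hypothetical placeholder.

```swift
import Photos

// Re-save a Live Photo through an editing context; saveLivePhoto(to:options:) is the call
// that reportedly fails with AVFoundation error -11800 (-12815).
func resaveLivePhoto(input: PHContentEditingInput, completion: @escaping (Error?) -> Void) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else {
        completion(nil)
        return
    }
    // Identity frame processor: return every frame's image unchanged.
    context.frameProcessor = { frame, _ in frame.image }

    let output = PHContentEditingOutput(contentEditingInput: input)
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.livephoto-edit", // hypothetical
                                             formatVersion: "1.0",
                                             data: Data("identity".utf8))

    context.saveLivePhoto(to: output, options: nil) { success, error in
        completion(success ? nil : error)
    }
}
```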
1
0
569
Jun ’25
com.Metal.CompletionQueueDispatch crash in Swift 6
I have a photo editing app that uses a simple Metal renderer to display CIFilter output images. It works just fine in Swift 5, but in Swift 6 it crashes on starting the Metal command buffer with an error in the queue com.Metal.CompletionQueueDispatch (serial). The crash occurs before I can debug. I changed the command buffer to use an MTLCommandBufferDescriptor with errorOptions = .encoderExecutionStatus, but had no luck getting insight into the source of the crash. Likewise, the error happens before any of the usual Metal debug tools are enabled. The Metal renderer works just fine in Swift 5 and also works fine with almost all of the Swift compiler upcoming feature flags set to Yes. (The "Default Internal Imports" flag is still No; the number of compile errors with that setting is absolutely scary, but that's another topic.) Do you have any suggestions for debugging, or ideas on why the Metal library is crashing in Swift 6? Everything is on current release versions and hardware.
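For reference, a sketch of the descriptor-based setup described above, plus a completed handler that hops back to the main actor. Under Swift 6 strict concurrency the handler runs on a Metal-owned queue (the com.Metal.CompletionQueueDispatch label in the crash), so any main-actor state it touches has to be isolated explicitly; this is a general pattern to rule out, not a confirmed diagnosis.

```swift
import Metal

@MainActor
final class Renderer {
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!

    func render() {
        // Ask Metal to record detailed encoder execution status for error reporting.
        let descriptor = MTLCommandBufferDescriptor()
        descriptor.errorOptions = .encoderExecutionStatus

        guard let commandBuffer = commandQueue.makeCommandBuffer(descriptor: descriptor) else { return }

        commandBuffer.addCompletedHandler { buffer in
            // This closure runs on Metal's completion queue, not the main thread.
            let status = buffer.status
            let error = buffer.error
            Task { @MainActor in
                // Only touch main-actor state after hopping back to the main actor.
                print("Command buffer finished with status \(status.rawValue), error: \(String(describing: error))")
            }
        }

        // ... encode render work here ...
        commandBuffer.commit()
    }
}
```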
1
0
639
Dec ’24
Extrinsic matrix
Hi everyone, I am working on a 3D reconstruction project. Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone. One consideration is that I want this app to run even if there is no LiDAR, as long as there are at least two cameras on the back; if there is a LiDAR, that is something I plan to work on later in the course of the project. I am using an AVCaptureSession with two AVCaptureDevice cameras: builtInWideAngleCamera and builtInUltraWideCamera. The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g. builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:

Extrinsic Matrix (Ultra-Wide to Wide):
[0.9999968, 0.0008149305, -0.0023960583, 0.0]
[-0.0008256607, 0.9999896, -0.0044807075, 0.0]
[0.002392382, 0.0044826716, 0.99998707, 0.0]
[-14.277955, -8.135408e-10, -0.3359985, 0.0]

The extrinsic matrix is of the form [R | t], and the rotational part seems to be correct, but the translation vector is ALL ZEROS, which would suggest that the cameras are physically overlapping; the last element is also not 1 (homogeneous coordinates). Has anyone encountered this 'issue' before? Is there a flaw in my reasoning or something I might be missing? Any comments are very much appreciated.
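A sketch of retrieving and decoding the extrinsics directly, assuming both devices are available. AVCaptureDevice.extrinsicMatrix(from:to:) returns Data wrapping a simd matrix_float4x3, and simd matrices are stored column-major with the translation (in millimeters) in the fourth column; whether that layout accounts for where the translation values appear in the printout above is offered only as a possibility to check, not a confirmed answer.

```swift
import AVFoundation
import simd

// Fetch the pose of one camera relative to another and decode the packed matrix.
func extrinsics(from source: AVCaptureDevice, to reference: AVCaptureDevice) -> matrix_float4x3? {
    guard let data = AVCaptureDevice.extrinsicMatrix(from: source, to: reference) else { return nil }
    // The Data wraps a matrix_float4x3 in simd's column-major layout:
    // columns 0-2 hold the rotation, column 3 holds the translation in millimeters.
    return data.withUnsafeBytes { $0.loadUnaligned(as: matrix_float4x3.self) }
}

// Example: print rotation and translation separately.
func logExtrinsics(wide: AVCaptureDevice, ultraWide: AVCaptureDevice) {
    guard let m = extrinsics(from: wide, to: ultraWide) else { return }
    let rotation = matrix_float3x3([m.columns.0, m.columns.1, m.columns.2])
    let translation = m.columns.3
    print("R = \(rotation)")
    print("t (mm) = \(translation)")
}
```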
1
0
619
Dec ’24
Message from com.apple.photos.backend (PhotoKit) in log
Hello. Our application backs up the user's photos to a back end. When retrieving asset data from the photo library, we set the isNetworkAccessAllowed flag to true so we can get assets that might be optimized in iCloud. In the application logs, we see the message below, which shows as coming from com.apple.photos.backend (PhotoKit):

Missing prefetched properties for PHAssetAdjustmentProperties on <PHAsset: 0x160b1ec00> BCF5688F-F7A7-4196-AFC7-A84E8BD95F3E/L0/001 mediaType=1/0, sourceType=1, (5601x3734), creationDate=2022-01-24 23:36:05 +0000, location=0, hidden=0, favorite=0, adjusted=0 . Fetching on demand on the main queue, which may degrade performance.

In particular, the message says 'Fetching on demand on the main queue', but I'm not sure whether that means PhotoKit will fetch on the main queue or that our application is requesting the data on the main queue. Could anyone clarify? Thanks.
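For context, a sketch of pulling the original resource data off the main queue with network access allowed, using PHAssetResourceManager (the Swift property is isNetworkAccessAllowed). Whether this sidesteps the logged on-demand property fetch is an assumption, since the message comes from PhotoKit's own prefetching; the resource selection is simplified to the first resource.

```swift
import Photos

// Fetch an asset's original data for backup, allowing iCloud downloads,
// while keeping the request off the main queue.
func fetchOriginalData(for asset: PHAsset, completion: @escaping (Data?, Error?) -> Void) {
    DispatchQueue.global(qos: .utility).async {
        // Real code would pick the .photo / .video resource explicitly.
        guard let resource = PHAssetResource.assetResources(for: asset).first else {
            completion(nil, nil)
            return
        }
        let options = PHAssetResourceRequestOptions()
        options.isNetworkAccessAllowed = true  // allow fetching iCloud-optimized originals

        var buffer = Data()
        PHAssetResourceManager.default().requestData(for: resource,
                                                     options: options,
                                                     dataReceivedHandler: { chunk in
            buffer.append(chunk)
        }, completionHandler: { error in
            completion(error == nil ? buffer : nil, error)
        })
    }
}
```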
1
0
776
Dec ’24