Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

Orientation does not work on iPhone 17 and above.
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image comes out in landscape orientation instead of portrait. Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image is still output in landscape. On iPhone 16 and earlier, the image is output in portrait orientation, but on iPhone 17 and later it is output in landscape. Please help.
Replies: 1 · Boosts: 1 · Views: 349 · Activity: Dec ’25
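One direction worth checking for the orientation issue above: newer devices may report a different default rotation on the video data output connection, so explicitly setting the connection's rotation angle (iOS 17+) and/or passing an explicit orientation to Vision can make the behavior device-independent. A minimal sketch, assuming an already-configured session with a video data output; the names and the chosen orientation are illustrative, not the poster's code.

```swift
import AVFoundation
import Vision

// Force the video data output connection to portrait (90°) where supported,
// so downstream buffers arrive already rotated the way the app expects.
func configurePortraitRotation(for videoOutput: AVCaptureVideoDataOutput) {
    guard let connection = videoOutput.connection(with: .video) else { return }
    if connection.isVideoRotationAngleSupported(90) {
        connection.videoRotationAngle = 90 // portrait, iOS 17+
    }
}

// Alternatively, leave the buffers alone and tell Vision how they are oriented.
// .right is the usual value for unrotated back-camera buffers with the device
// held in portrait; adjust for your camera position and rotation setup.
func detectFaces(in pixelBuffer: CVPixelBuffer) throws -> [VNFaceObservation] {
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .right,
                                        options: [:])
    let request = VNDetectFaceRectanglesRequest()
    try handler.perform([request])
    return request.results ?? []
}
```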
Background Upload Extension
Hello, we are trying to use the new Background Upload Extension to improve background uploads of assets (photos, Live Photos, videos) in our application.
1. The assets have finished uploading, but I'm unable to retrieve successful records using PHAssetResourceUploadJob.fetchJobs(action: .acknowledge, options: nil). When will the successful records be returned?
2. How can we retrieve the system's pending tasks? We want to cancel tasks handed over to the system when returning to the main app, to avoid duplicate resource uploads.
3. When we set UploadJobExtensionEnabled = true, will tasks handed over to the system still execute after returning to the main app? Do we need to set UploadJobExtensionEnabled = false upon returning to the main app? If we do, will previously submitted upload tasks be cleared?
Replies: 0 · Boosts: 0 · Views: 107 · Activity: Dec ’25
AVExternalStorageDeviceDiscoverySession.isSupported returns true only on iPhone Pro models
I am developing an iOS camera app that can record video directly to external storage connected to an iPhone. To detect whether an external USB storage device is connected and to obtain its URL, I am considering using AVExternalStorageDeviceDiscoverySession. However, when checking support using AVExternalStorageDeviceDiscoverySession.isSupported, I observe that it returns true only on Pro-model iPhones and false on non-Pro models in my environment. I have reviewed Apple's official documentation, but I could not find any clear description of the supported devices or requirements (for example, whether this API is limited to Pro models or requires specific hardware capabilities). I would appreciate any information regarding the following points:
1. The actual requirements for AVExternalStorageDeviceDiscoverySession to be supported: device limitations (Pro vs. non-Pro models), hardware requirements (USB controller, external recording capability, etc.), and iOS version dependencies.
2. Whether support for non-Pro models is planned in the future.
Tested environments:
iPhone 16 Pro (iOS 18.7.1) → isSupported == true
iPhone 16e (iOS 26.2) → isSupported == false
iPhone 17 (iOS 26.2) → isSupported == false
iPhone Air (iOS 26.2) → isSupported == false
If anyone has observed similar behavior or has official information from Apple regarding this API, I would greatly appreciate your insights.
Replies: 0 · Boosts: 0 · Views: 108 · Activity: Dec ’25
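For readers following the thread above, a minimal sketch of how the discovery session is typically queried; it only demonstrates the isSupported check and device enumeration, and does not answer which hardware supports the API (that is exactly the open question).

```swift
import AVFoundation

// Guard on isSupported before touching the shared session; on unsupported
// hardware the shared session is simply unavailable.
func logExternalStorageDevices() {
    guard AVExternalStorageDeviceDiscoverySession.isSupported,
          let session = AVExternalStorageDeviceDiscoverySession.shared else {
        print("External storage discovery is not supported on this device")
        return
    }
    for device in session.externalStorageDevices {
        // Individual properties (name, capacity, connection state) are available
        // on AVExternalStorageDevice; printing the object keeps this sketch conservative.
        print("Found external storage device: \(device)")
    }
}
```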
Recurring FigXPCUtilities / FigCaptureSourceRemote err=-17281 logs when using AVCaptureVideoDataOutput on iOS 26.x
Hi everyone, I'm seeing recurring internal AVFoundation camera logs on iOS 26.2 and I'm trying to understand whether this is expected behavior or a regression in the capture pipeline. These logs appear shortly after starting an AVCaptureSession, while video frames are being delivered, and also when the camera is stopped or the capture session is torn down:
<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)
The exact same logic did not produce these logs on iOS 18.x. To rule out issues caused by my own code, I had GPT generate a minimal SwiftUI example from scratch; even in this clean, minimal setup, the same logs appear on iOS 26.2. My primary interest is to perform real-time processing on the video frames delivered by the camera (via AVCaptureVideoDataOutput), for tasks such as analysis, computer vision, or custom frame handling, while simultaneously displaying the live preview. Thanks in advance for any insight. Example Code
Replies: 0 · Boosts: 0 · Views: 357 · Activity: Dec ’25
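For context on the post above, a sketch of the kind of minimal AVCaptureVideoDataOutput pipeline being described (session setup plus a sample-buffer delegate). This is an illustration of the general pattern, not the poster's attached example code, and it does not by itself explain the Fig* log lines.

```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames") // hypothetical label

    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Per-frame processing (analysis, computer vision, custom handling) goes here.
    }
}
```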
Photos & Imaging Regression - Photos app folders no longer function as folders in iOS 26
I’m writing to report a serious usability regression in the iOS 26 Photos app. Folders can still be created and albums can still be assigned to them, but folders can no longer be opened to view the albums they contain. A container that cannot be opened is not a container, and this breaks a fundamental information architecture model that has existed in Photos for well over a decade. This change disproportionately harms users who maintain large, intentional photo libraries—travel archives, projects, professional work, or long-term personal documentation—where hierarchy and ordering are essential. Search and automated surfacing are not substitutes for deliberate structure. Removing the ability to browse folder → album hierarchy on iOS strips users of control while still exposing the UI for folder creation, which is internally inconsistent. If this behavior is intentional, it should be clearly documented and the folder UI removed to avoid misleading users. If it is not intentional, it needs urgent correction. At minimum, iOS should retain parity with macOS Photos for basic navigation of folders and albums. This is not a niche request; it is a regression in core functionality.
Replies: 1 · Boosts: 0 · Views: 581 · Activity: 3w
CIRAWFilter.outputImage first-time cost is huge (~3s), subsequent calls are ~3ms. Any official way to pre-initialize RAW pipeline (without taking a real photo)?
Hi Apple Developer Forums, I'm developing an iOS camera app that processes RAW captures using Core Image. I'm seeing a large "first use" performance penalty specifically when creating the CIImage from CIRAWFilter.outputImage.
What's slow (important detail): I'm measuring the time for:
let rawFilter = CIRAWFilter(imageData: rawData, identifierHint: hint)
let ciImage = rawFilter.outputImage
This is not CIContext.render(...) / createCGImage(...). It's just the time to access outputImage (i.e., building the Core Image graph and setting up the RAW pipeline).
Observed behavior: the first access to CIRAWFilter.outputImage takes ~3 seconds; the second access (same app session, similar RAW) takes ~3 milliseconds. So something heavy is happening only on first use (decoder initialization, pipeline setup, shader/library compilation, caching, etc.). Using Metal System Trace, I also noticed that during the slow first call there are many "Create MTLLibrary" events, while the second call doesn't show this pattern.
Warm-up attempts using a bundled DNG: I tried to "warm up" early (e.g., on camera-screen entry) by loading a bundled DNG and accessing CIRAWFilter.outputImage before taking a photo. Warm-up with a ~247 KB DNG → the first real RAW outputImage cost drops to ~1.42 s. Warm-up with a ~25 MB DNG → the first real RAW outputImage cost drops to ~843 ms. This helps, but it's still far from the steady-state ~3 ms.
Warm-up by capturing a real RAW (works, but concerns): the only method that fully eliminates the delay is to trigger a real RAW capture programmatically before the user's first photo, then use that captured rawData to warm up the CIRAWFilter.outputImage path. This brings the first user-facing capture close to the steady-state timing. However, in some regions the camera shutter sound cannot be suppressed, so a hidden warm-up capture is unacceptable UX. I'm also unsure whether triggering a real capture without an explicit user action could raise compliance/privacy concerns, even if the image is immediately discarded and never saved or uploaded.
Questions:
1. Is the large first-time cost of CIRAWFilter.outputImage expected (RAW pipeline initialization / shader compilation)?
2. Is there an Apple-recommended way to pre-initialize the Core Image RAW pipeline / Metal resources so the first outputImage is fast, without taking a real photo?
3. Are there any best practices (e.g., CIContext creation timing, prepareRender(...), specific options) that reliably reduce this first-use overhead for CIRAWFilter?
Attachments:
Figure 1: First RAW capture with no warm-up (~3 s outputImage time)
Figure 2: First RAW capture after warm-up with bundled DNG (improved, but still hundreds of ms)
Thanks for any guidance or experience sharing!
Replies: 3 · Boosts: 2 · Views: 525 · Activity: 2w
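A minimal sketch of the bundled-DNG warm-up the poster describes, assuming a small DNG named "warmup.dng" shipped in the app bundle (a hypothetical resource name). As noted in the post, this reduces but does not fully eliminate the first-use cost.

```swift
import CoreImage

// Try to pre-build the Core Image RAW pipeline at camera-screen entry by
// decoding a bundled DNG, before the user's first real capture.
func warmUpRAWPipeline() {
    guard let url = Bundle.main.url(forResource: "warmup", withExtension: "dng"),
          let data = try? Data(contentsOf: url),
          let rawFilter = CIRAWFilter(imageData: data, identifierHint: "com.adobe.raw-image") else {
        return
    }
    // Accessing outputImage is where the post measures the first-use cost.
    guard let image = rawFilter.outputImage else { return }
    // Optionally force a tiny render so Metal libraries get compiled as well.
    let context = CIContext()
    _ = context.createCGImage(image, from: CGRect(x: 0, y: 0, width: 8, height: 8))
}
```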
AVCaptureDevice.isFaceDrivenAutoFocusEnabled not working with continuousAutoFocus
Hi everyone, I'm developing a camera application that requires precise, predictable control over the focus system. I'm encountering unexpected behavior with face-driven autofocus in continuous autofocus mode.
Issue: when using AVCaptureDevice.FocusMode.continuousAutoFocus, the system continues to prioritize faces for focus even after attempting to disable face-driven autofocus with:
device.automaticallyAdjustsFaceDrivenAutoFocusEnabled = false
device.isFaceDrivenAutoFocusEnabled = false
Observations: the behavior is inconsistent across different scenes. In well-lit, properly exposed scenes, focus persistently locks onto faces, ignoring my configuration. In underexposed scenes, the intended focus behavior is more consistently respected.
Replies: 0 · Boosts: 0 · Views: 291 · Activity: 2w
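For the post above, the usual pattern is to wrap these property changes in lockForConfiguration and to disable the automatic adjustment before clearing the manual flag. A minimal sketch of that ordering; it does not explain the scene-dependent behavior the poster reports.

```swift
import AVFoundation

func disableFaceDrivenAutoFocus(on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        // Order matters: stop the system from re-enabling the flag automatically,
        // then turn face-driven autofocus off explicitly.
        device.automaticallyAdjustsFaceDrivenAutoFocusEnabled = false
        device.isFaceDrivenAutoFocusEnabled = false
        if device.isFocusModeSupported(.continuousAutoFocus) {
            device.focusMode = .continuousAutoFocus
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```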
ProRAW: Demystify 48MP vs 12MP binning based on lighting?
Hi everyone, does anybody have any resources I could check out regarding the 48MP → 12MP binning behavior on supported sensors? I know the 48MP sensor on iPhone can automatically bin pixels for better low-light performance, but I'm not sure how to reliably make this happen in practice. On iPhone 14 Pro and later with a 48MP sensor, I want the best of both worlds for ProRAW: in bright light, 48MP full resolution; in low light, 12MP pixel-binned for better noise.
photoOutput.maxPhotoDimensions = CMVideoDimensions(width: 8064, height: 6048)
let settings = AVCapturePhotoSettings(rawPixelFormatType: proRawFormat, processedFormat: [...])
settings.photoQualityPrioritization = .quality
// NOT setting settings.maxPhotoDimensions — always get 12MP
When I omit maxPhotoDimensions, iOS always returns 12MP regardless of lighting. When I set it to 48MP, I always get 48MP. Is there an API to let iOS automatically choose the optimal resolution based on conditions, or should I detect low light myself (via device.iso / exposureDuration) and set maxPhotoDimensions accordingly? Any help or direction would be much appreciated!
Replies: 0 · Boosts: 0 · Views: 249 · Activity: 2w
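On the last question in the post above: absent an automatic API, one workable approach is the poster's own suggestion of inspecting the current exposure and picking dimensions per shot. A rough sketch, where the ISO and shutter thresholds are arbitrary examples, not recommended values.

```swift
import AVFoundation

// Pick full-resolution 48MP in bright scenes and the binned 12MP size in low light.
func preferredPhotoDimensions(for device: AVCaptureDevice) -> CMVideoDimensions {
    let lowLight = device.iso > 800 || device.exposureDuration > CMTime(value: 1, timescale: 30)
    return lowLight
        ? CMVideoDimensions(width: 4032, height: 3024)  // binned 12MP
        : CMVideoDimensions(width: 8064, height: 6048)  // full-resolution 48MP
}

// Usage at capture time (settings creation as in the post):
// settings.maxPhotoDimensions = preferredPhotoDimensions(for: videoDevice)
```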
VisionKit - crash after photo taken in VNDocumentCameraViewController in iOS 26 when Liquid Glass enabled
I'm adopting Liquid Glass in iOS 26. When I test VNDocumentCameraViewController with document scanning after enabling Liquid Glass, there's a crash right after a photo is taken in VNDocumentCameraViewController; here's the screenshot when it crashed. The exception output in the Xcode console is:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Layout requested for visible navigation bar, <UINavigationBar: 0x1240bde00; frame = (0 117; 390 54); opaque = NO; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c21e60>> standardAppearance=0x12407b900 scrollEdgeAppearance=0x12407bb80 compactAppearance=0x12407b880 no-scroll-edge-support, when the top item belongs to a different navigation bar. topItem = <UINavigationItem: 0x1240bd800> style=navigator leftBarButtonItems=0x123d4e5f0 rightBarButtonItems=0x123d4d5a0, navigation bar = <UINavigationBar: 0x107b9ad00; frame = (0 47; 390 54); opaque = NO; autoresize = W; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c20150>> delegate=0x10a805200 standardAppearance=0x107b2c300 scrollEdgeAppearance=0x107b2c280 compactAppearance=0x107b2c100, possibly from a client attempt to nest wrapped navigation controllers.'
*** First throw call stack: (0x18e1db994 0x18b0f5814 0x18c092aa0 0x193b18660 0x193a7d540 0x193a7e020 0x1953ec4a0 0x1943b7d78 0x18ed83420 0x18ed82f74 0x18eb83134 0x18eb44c10 0x18eb70bc4 0x18eb7e74c 0x193ac8cd0 0x193ac8c04 0x193ad6afc 0x193ad5f8c 0x27b456560 0x18e12c4cc 0x18e15c0b0 0x18e15bfd8 0x18e133c1c 0x18e132a6c 0x22ed54498 0x193af6ba4 0x193a9fa78 0x193bcb68c 0x102cc2718 0x102cc2688 0x102cc2794 0x18b14ae28)
libc++abi: terminating due to uncaught exception of type NSException
Replies: 1 · Boosts: 1 · Views: 360 · Activity: 2w
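The exception above mentions nested wrapped navigation controllers, so one thing to verify is that the scanner is presented modally rather than pushed onto (or wrapped in) an existing navigation stack. A minimal presentation sketch, not a confirmed fix for the Liquid Glass crash:

```swift
import UIKit
import VisionKit

final class ScanHostViewController: UIViewController, VNDocumentCameraViewControllerDelegate {
    func startScanning() {
        guard VNDocumentCameraViewController.isSupported else { return }
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        // Present modally; the scanner manages its own navigation bar internally.
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        controller.dismiss(animated: true)
        // scan.imageOfPage(at:) returns each captured page.
    }

    func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
        controller.dismiss(animated: true)
    }
}
```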
Camera Permissions Popup
We have a very strange issue that I am trying to solve, or at least find the best practice for. We have a SwiftUI view that uses the camera for a preview. As suggested in Apple's docs, we check the authorization status and then, if it's not determined, we request authorization. We also have the privacy entry in the Info.plist.
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { accessStatusAuthorised in
        if !accessStatusAuthorised {
            self.cameraStatus = .notAuthorised
        } else {
            self.isAuthorized = true
            self.cameraStatus = .authorised
            self.startCameraSession(cameraPosition: cameraPosition)
        }
    }
case .restricted:
    cameraStatus = .notAuthorised
    isAuthorized = false
case .denied:
    cameraStatus = .notAuthorised
    isAuthorized = false
case .authorized:
    cameraStatus = .authorised
    isAuthorized = true
    startCameraSession(cameraPosition: cameraPosition)
@unknown default:
    isAuthorized = true
    cameraStatus = .notAuthorised
}
However, when we call this code it freezes the camera feed, even when Allow has been tapped. And this is the confusing part: if we do not call the code above, we still get the camera-access permission pop-up, and the camera works fine after allowing. What I'm concerned about is changing the code to rely on this behavior and it turning out to be an Apple bug that later gets fixed, at which point the app's camera function stops working. I cannot see anywhere that the process has changed for iOS 26 / Xcode 26. Can anyone shed any light on this, or has anyone had a similar experience?
Replies: 1 · Boosts: 0 · Views: 57 · Activity: 1w
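Not an answer to the freeze itself, but for comparison, the async/await form of the same authorization check keeps the flow out of the completion-handler path. A minimal sketch, with startCameraSession standing in for the poster's own method (a hypothetical placeholder):

```swift
import AVFoundation

@MainActor
func prepareCamera() async {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        startCameraSession()
    case .notDetermined:
        // Suspends until the user answers the system prompt.
        let granted = await AVCaptureDevice.requestAccess(for: .video)
        if granted { startCameraSession() }
    case .denied, .restricted:
        break // surface a "camera not authorized" state in the UI
    @unknown default:
        break
    }
}

// Placeholder for the poster's session start; illustrative only.
func startCameraSession() { /* configure and start the AVCaptureSession */ }
```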
Any way to trigger cameraLensSmudgeDetectionStatus to change?
Looking to implement UI in our app that tells the user to clean their lens. I implemented KVO for cameraLensSmudgeDetectionStatus, but I'm having trouble reliably triggering it, both in our app and in the built-in Camera app. I tried to get inventive by putting Tupperware over the lens, but I think the model driving this (or the LiDAR sensor) might be smart enough to detect that there is something close to the lens. Is there any way to trigger this change, similar to how we can trigger thermal changes in debug? Thanks.
Replies: 1 · Boosts: 0 · Views: 57 · Activity: 1w
iOS 26: Underexposed image in exposure bracket appears clamped vs single capture
Hello, I have an iOS camera app that captures exposure brackets and performs custom HDR processing. On iOS 26, I'm observing a visual difference between a single photo captured at –2 EV and the –2 EV frame from an exposure bracket (–2 / 0 / +2 EV).
On iOS 26: the single –2 EV image looks natural and consistent, while the –2 EV image from the bracket appears clamped / distorted, most noticeably in high-dynamic-range scenes (highlight compression and loss of detail).
On iOS 18: both approaches produce visually identical and correct –2 EV images. The issue only appears for bracketed captures on iOS 26.
Attachments (examples):
iOS 26 — single capture –2 EV (JPEG): /Users/danilobudimir/Downloads/ios26SingleImage/JPEG image-4006-8B77-51-0.jpeg
iOS 26 — single capture –2 EV, capture report (dumped settings): /Users/danilobudimir/Downloads/ios26SingleImage/UnderExposureDebug_CaptureReport_2026-01-09T15-59-20Z.md
iOS 26 — bracket capture –2 EV frame (JPEG): /Users/danilobudimir/Downloads/bracket_iOS26/JPEG image-45CE-9793-A5-0.jpeg
iOS 26 — bracket capture, capture report (dumped settings): /Users/danilobudimir/Downloads/bracket_iOS26/UnderExposureDebug_CaptureReport_2026-01-09T15-55-42Z.md
iOS 18 — single capture –2 EV (JPEG): /Users/danilobudimir/Downloads/ios18SingleImage/JPEG image-47FD-AF73-28-0.jpeg
iOS 18 — single capture –2 EV, capture report: /Users/danilobudimir/Downloads/ios18SingleImage/UnderExposureDebug_CaptureReport_2026-01-09T16-25-27Z.md
iOS 18 — bracket capture –2 EV frame (JPEG): /Users/danilobudimir/Downloads/bracket_iOS18/JPEG image-4A4C-9E93-46-0.jpeg
iOS 18 — bracket capture, capture report: /Users/danilobudimir/Downloads/bracket_iOS18/UnderExposureDebug_CaptureReport_2026-01-09T16-27-23Z.md
Questions:
1. Is there any new behavior in iOS 26 AVFoundation related to AVCapturePhotoBracketSettings, tone mapping / HDR preprocessing, or internal image processing applied specifically to bracketed frames?
2. Is there a new flag, format requirement, or opt-out mechanism required to preserve linear underexposed frames in exposure brackets?
Replies: 2 · Boosts: 0 · Views: 736 · Activity: 1w
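For readers unfamiliar with the setup being discussed, a minimal sketch of requesting a –2/0/+2 EV auto-exposure bracket. It only shows how such a bracket is requested and says nothing about the iOS 26 tone-mapping difference the poster asks about.

```swift
import AVFoundation

func makeExposureBracketSettings() -> AVCapturePhotoBracketSettings {
    // One bracketed frame per exposure target bias, in EV.
    let biases: [Float] = [-2, 0, 2]
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }
    return AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,                                     // 0 = processed frames only, no RAW
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
        bracketedSettings: bracketedSettings
    )
}

// Usage: photoOutput.capturePhoto(with: makeExposureBracketSettings(), delegate: delegate)
```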
AVCaptureSession preview briefly goes blank after interruption (lock/unlock or camera switch) while isRunning == true
Environment: iPhone 15 Pro, iOS 18.0, AVFoundation, custom camera app using AVCaptureSession + AVCaptureVideoPreviewLayer.
I'm seeing an intermittent but frequent issue where the camera preview layer briefly flashes empty after certain interruptions, even though the capture session reports itself as running and no errors are emitted. This happens most often after locking and unlocking the device, or after switching cameras (back ↔ front). The issue is not 100% reproducible, but occurs often enough to be noticeable in normal usage.
What happens: the preview layer briefly flashes as empty (sometimes just a "micro-frame"), typically for ~0.5–2 seconds before frames resume. session.isRunning == true throughout; no crash, no runtime error, no interruption-end failure. Focus/exposure restore correctly once frames resume. Visually it looks like the preview layer loses frames temporarily, even though the session appears healthy.
Repro: intermittent but frequent after lock → unlock or switching camera (front/back); timing-dependent and non-deterministic; happens multiple times per session, but not every time.
Key observation: AVCaptureSession.isRunning == true does not guarantee that frames are actually flowing. To verify this, I added an AVCaptureVideoDataOutput temporarily: during the blank period, no sample buffers are delivered; frames resume after ~1–2 s without any explicit restart; the session state remains "running" the entire time.
What I've tried (did NOT fix it): adding delays before/after startRunning() (0.1–0.5 s); calling startRunning() on different queues; restarting the session in AVCaptureSessionInterruptionEnded; verifying session.connections (all show isActive == true); rebuilding inputs/outputs during interruption recovery; ensuring startRunning() is never called between beginConfiguration() / commitConfiguration() (hit the expected runtime warning when attempted). None of the above removed the brief blank preview.
Workaround (works visually but expensive): this visually fixes the issue, but energy impact jumps from Low → High in the Xcode Energy Gauge, AVCaptureVideoDataOutput processes 30–60 FPS continuously, the gap only lasts ~1–2 s but toggling the delegate on/off cleanly is difficult, and the overall CPU and energy cost is not acceptable for production.
Additional notes: CPU usage is already relatively high even without the workaround (this app is camera-heavy by nature); with the workaround enabled, energy impact becomes noticeably worse; the issue feels like a timing/state desync between session state and actual frame delivery, not a UI issue.
Questions:
1. Is this a known behavior where AVCaptureSession.isRunning == true but frames are temporarily unavailable after interruptions?
2. Is there a recommended way to detect actual frame-flow resumption (not just session state)?
3. Should the AVCaptureVideoPreviewLayer.connection (isActive / isEnabled) be explicitly checked or reset after interruptions?
4. Is there a lightweight, energy-efficient way to bridge this short "no frames" gap without using AVCaptureVideoDataOutput?
5. Is rebuilding the entire session the only reliable solution here, or is there a better pattern Apple recommends?
Replies: 1 · Boosts: 0 · Views: 736 · Activity: 1w
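This does not close the frame gap the poster describes, but for completeness, the standard way to observe interruptions and runtime errors (as opposed to polling isRunning) looks roughly like this; whether these notifications fire around the blank-preview window is exactly what would need checking.

```swift
import AVFoundation

final class SessionObserver {
    private var tokens: [NSObjectProtocol] = []

    func observe(_ session: AVCaptureSession) {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(forName: AVCaptureSession.wasInterruptedNotification,
                                         object: session, queue: .main) { note in
            print("Session interrupted: \(note.userInfo ?? [:])")
        })
        tokens.append(center.addObserver(forName: AVCaptureSession.interruptionEndedNotification,
                                         object: session, queue: .main) { _ in
            print("Interruption ended; frames should resume shortly")
        })
        tokens.append(center.addObserver(forName: AVCaptureSession.runtimeErrorNotification,
                                         object: session, queue: .main) { note in
            print("Runtime error: \(note.userInfo ?? [:])")
        })
    }
}
```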
How to Integrate the iOS 26 Spatial Photos Feature into a Third-Party iPhone App
Dear Apple Technical Support Team, greetings! I am an iOS app developer, currently upgrading the features of a photo app I developed. Recently, I noticed the new Spatial Photos feature added in iOS 26, which brings an immersive 3D photo experience to users. We hope to integrate similar capabilities into our own app to provide users with a richer photo-viewing experience. Through technical research, we found that on Apple Vision devices a similar spatial-photo display effect can be achieved through the ImagePresentationComponent.Spatial3DImage interface. However, our tests show that this interface only supports visionOS and cannot be called on iOS. At present, iOS 26 already natively supports the Spatial Photos feature, and we would like to know how third-party photo apps can gain this capability. We sincerely request your team's technical guidance on the following questions: 1) Are there any official APIs, SDKs, or development frameworks for iOS 26 that allow third-party apps to implement core functions such as generating and displaying spatial photos? 2) If no public interfaces are available at present, are there any other compliant technical solutions or alternative paths to achieve similar effects? 3) For third-party apps integrating the spatial-photo feature, are there any relevant development documents, technical specifications, or review requirements to follow? We have completed the basic feature iteration of the app and have the technical capability to adapt to new functionality quickly. We hope to receive guidance and support from your team to help us bring a better product experience to iOS users. Attached are the relevant details of our app and a report on interface compatibility from our testing, for your reference. If you need any further information, please let us know. Thank you for reviewing this in your busy schedule; we look forward to your reply!
Replies: 0 · Boosts: 0 · Views: 156 · Activity: 6d
PhotoKit Background Upload Extension not working on iOS 26.2 iPhone 17 Simulator
Hi, I'm trying to implement the new PhotoKit PHBackgroundResourceUploadExtension. I created the extension, enabled full photo-library access in the host app, and registered the extension point using the string com.apple.photos.background-upload. However, when I attempted to enable the extension with try library.setUploadJobExtensionEnabled(true), I received the following error: Error Domain=PHPhotosErrorDomain Code=-1 "(null)". This happens when running the app on Xcode 26.1 and 26.2 beta, using the iPhone 17 Pro Max simulator (iOS 26.1 and 26.2). My question is: is this extension supported on the simulator? I'm asking because at the moment it's difficult for me to test on a physical device. Also, what does the error mean? Thanks.
Replies: 1 · Boosts: 1 · Views: 561 · Activity: 4d
Are White Balance gains applied before or after ADC?
At which point in the image processing pipeline does iOS apply the white balance gains which can be set via AVCaptureDevice.setWhiteBalanceModeLocked(with:completionHandler:)? Are those gains applied in the analog part of the camera pipeline, before the pixel voltage gets converted via the ADC to digital values? Or does the camera first convert the pixel voltages to digital values and then the gains are applied to the digital values? Is this consistent across devices or can the behavior vary from device to device?
Replies: 0 · Boosts: 0 · Views: 176 · Activity: 3d
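For readers of the question above, this is how the gains in question are set; the code cannot reveal where in the pipeline they are applied, which is the actual question. A minimal sketch, assuming the device supports locked white balance.

```swift
import AVFoundation

// Lock white balance to explicit RGB gains, clamped to the device's valid range.
func lockWhiteBalance(on device: AVCaptureDevice, red: Float, green: Float, blue: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    let maxGain = device.maxWhiteBalanceGain
    let gains = AVCaptureDevice.WhiteBalanceGains(
        redGain: min(max(red, 1.0), maxGain),
        greenGain: min(max(green, 1.0), maxGain),
        blueGain: min(max(blue, 1.0), maxGain)
    )
    device.setWhiteBalanceModeLocked(with: gains) { _ in }
}
```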
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?
// Objective-C
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];
// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
Replies: 2 · Boosts: 0 · Views: 737 · Activity: 1d
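A Swift restatement of the configuration described above, plus a print of what the active format actually reports; this reproduces the poster's setup for comparison rather than resolving why 24MP is not delivered.

```swift
import AVFoundation

func configureForLargestPhotoSize(photoOutput: AVCapturePhotoOutput, device: AVCaptureDevice) {
    // Inspect what the current format claims to support (e.g. 4032x3024, 5712x4284).
    for dims in device.activeFormat.supportedMaxPhotoDimensions {
        print("Supported max photo dimensions: \(dims.width)x\(dims.height)")
    }
    photoOutput.maxPhotoQualityPrioritization = .quality
    if let largest = device.activeFormat.supportedMaxPhotoDimensions.last {
        photoOutput.maxPhotoDimensions = largest
    }
}

func makePhotoSettings(photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions
    settings.photoQualityPrioritization = photoOutput.maxPhotoQualityPrioritization
    return settings
}
```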