Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector crash
I cannot find any documentation re: isPrivacySensitiveAlbum. I've granted my app access to all photos. Not sure what else to try. Code that triggers the crash:

```swift
let options = PHFetchOptions()
options.fetchLimit = 1
let assetColl = PHAssetCollection.fetchAssetCollections(withLocalIdentifiers: [localId], options: options)
if assetColl.count > 0 {
    if let asset = PHAsset.fetchKeyAssets(in: assetColl.firstObject!, options: options) {
        // crash occurs inside fetchKeyAssets (stack trace below)
    }
}
```

Stack trace from here on:

```
2023-04-15 06:34:41.628537-0700 DPF[33615:6484880] -[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0
2023-04-15 06:34:41.632378-0700 DPF[33615:6484880] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0'
*** First throw call stack:
(
0   CoreFoundation      0x00007ff80045478b __exceptionPreprocess + 242
1   libobjc.A.dylib     0x00007ff80004db73 objc_exception_throw + 48
2   CoreFoundation      0x00007ff8004638c4 +[NSObject(NSObject) instanceMethodSignatureForSelector:] + 0
3   CoreFoundation      0x00007ff800458c66 ___forwarding___ + 1443
4   CoreFoundation      0x00007ff80045ae08 _CF_forwarding_prep_0 + 120
5   Photos              0x00007ff80b8480e1 +[PHAsset fetchKeyAssetsInAssetCollection:options:] + 86
6   DPF                 0x0000000100791029 $s3DPF16AlbumListFetcherV22loadKeyImageForLocalIdySo7UIImageCSgSSYaFTY0_ + 569
```
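One hedged reading of the trace: the local identifier resolves to a folder (a PHCollectionList) rather than a leaf album, and handing that object to fetchKeyAssets triggers the unrecognized selector. A defensive sketch, assuming that diagnosis; the runtime type check and optional handling are the additions here:

```swift
import Photos

// Sketch: guard against the fetch returning a folder (PHCollectionList) at
// runtime, and avoid the force-unwrap of firstObject. `localId` is the same
// identifier used in the snippet above.
let result = PHAssetCollection.fetchAssetCollections(withLocalIdentifiers: [localId], options: nil)
if let collection = result.firstObject,
   !collection.isKind(of: PHCollectionList.self),   // the trace shows a folder can slip through
   let keyAssets = PHAsset.fetchKeyAssets(in: collection, options: nil) {
    print("key asset:", keyAssets.firstObject as Any)
}
```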
2 replies · 0 boosts · 774 views · Dec ’24
How to access HDRGainMap from AVCapturePhoto
Hey, I'm building a camera app and I want to use the captured HDRGainMap alongside the photo to do some processing with a CIFilter chain. How can this be done? I can't find any documentation on this anywhere, only on how to access the HDRGainMap from an existing HEIC file, which I have done successfully. For that I'm doing something like the following:

```swift
if let gainmap = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) {
    let gainDict = gainmap as NSDictionary
    let gainData = gainDict[kCGImageAuxiliaryDataInfoData] as? Data
    let gainDescription = gainDict[kCGImageAuxiliaryDataInfoDataDescription]
    let gainMeta = gainDict[kCGImageAuxiliaryDataInfoMetadata]
}
```

However, I'm not sure what the approach is with an AVCapturePhoto output from an AVCaptureDevice. Thanks!
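One possible route, offered as an assumption rather than a documented path: AVCapturePhoto exposes its full encoded container via fileDataRepresentation(), so a CGImageSource built from that data may expose the same auxiliary gain-map data as an on-disk HEIC. A sketch:

```swift
import AVFoundation
import ImageIO

// Hedged sketch: reuse the HEIC auxiliary-data approach on the photo's
// in-memory container. Whether the gain map survives this path for every
// capture configuration is an assumption to verify.
func hdrGainMapInfo(from photo: AVCapturePhoto) -> NSDictionary? {
    guard let data = photo.fileDataRepresentation(),
          let source = CGImageSourceCreateWithData(data as CFData, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) else { return nil }
    return info as NSDictionary
}
```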
2 replies · 0 boosts · 632 views · Jan ’25
WorldCaptureKit
Does this library exist in Xcode 16.4? "import WorldCaptureKit" gives the error "No such module 'WorldCaptureKit'", and I cannot find any information about the library in the Apple documentation. But AI keeps suggesting that I use it.
2 replies · 0 boosts · 109 views · Jun ’25
Any app shows a black square instead of the camera picture
Hi! I am making an app for Apple Vision Pro (visionOS 2.5) that scans the surroundings and recognises any text around you. I tried to use AVCaptureSession, but when I run the app from Xcode on the real AVP device, the camera is not accessible. I enabled camera access in my Info.plist (NSCameraUsageDescription: "Used for live text recognition"), and I checked the camera settings on the AVP; there are no restrictions. However, I always get a black square with a crossed-out camera icon instead of the image from the camera. I tried a couple of different apps from GitHub that use AVCaptureSession, and they all display the black square instead of the picture. What can be wrong with the camera?
2 replies · 0 boosts · 106 views · Jun ’25
dyld[434]: Library not loaded: error when running LockedCameraCapture compatible app on iOS 15
Hello, I am getting the following error while attempting to run my LockedCameraCapture compatible app on an iOS 15 device:

```
dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)
```

Of course iOS 15 doesn't have the LockedCameraCapture library, but I have had no issue including Lock Screen widgets (which require iOS 16), so I am not sure why the error is popping up. Thank you!
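A hedged sketch of the usual mitigation for this class of dyld error: weak-link the framework and keep every reference behind availability checks, so older dyld versions never try to load it at launch. Whether auto-linking of LockedCameraCapture is the culprit here is an assumption.

```swift
// Hedged sketch: the weak linking itself is a build setting, not code. Mark
// the framework "Optional" under "Frameworks, Libraries, and Embedded
// Content" (or pass `-weak_framework LockedCameraCapture` in Other Linker
// Flags), then gate all uses behind availability checks.
#if canImport(LockedCameraCapture)
import LockedCameraCapture
#endif

func lockedCaptureSupported() -> Bool {
    if #available(iOS 18.0, *) {
        return true   // safe to touch LockedCameraCapture symbols from here
    }
    return false
}
```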
2 replies · 0 boosts · 466 views · Mar ’25
AVCaptureDevice rotationCoordinator modifying CALayer on switching devices
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and there seems to be an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) while switching cameras. Here is my setup. The function below is defined in an actor class called CameraManager that sets up the rotation coordinator:

```swift
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device,
          let displayLayer = displayLayer else { return }

    cancellables.removeAll()

    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }

    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
```

This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. This happens at the very line where the rotation coordinator is being recreated. It's not clear why initialising a rotation coordinator should modify CALayer properties right in its init method.

```
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace:
(
0   UIKitCore           0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1   UIKitCore           0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2   QuartzCore          0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3   QuartzCore          0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4   QuartzCore          0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5   QuartzCore          0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6   AVFCapture          0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7   MyApp.debug.dylib   0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8   MyApp.debug.dylib   0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9   MyApp.debug.dylib   0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10  MyApp.debug.dylib   0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11  MyApp.debug.dylib   0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12  MyApp.debug.dylib   0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13  libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
```
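A hedged workaround sketch, assuming the coordinator's initializer really does touch the layer: construct it on the main actor when switching devices. `device` and `displayLayer` are the same values used in the function above, and the surrounding context is assumed to be async.

```swift
// Sketch only: hop to the main actor before recreating the coordinator, since
// the backtrace points at layer work inside its initializer.
let coordinator = await MainActor.run {
    AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
}
```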
2 replies · 0 boosts · 537 views · Feb ’25
Why does a DriverKit extension need a CMIO extension?
I developed a DriverKit extension based on overriding-the-default-usb-video-class-extension, but the article didn't give implementation details. I asked DTS, who gave two tips:
1. Do you also have a CMIO extension to load in place of the default one (per overriding-the-default-usb-video-class-extension)?
2. Your DriverKit extension's Info.plist is also missing the CameraAssistantBundleID.
I want to know why a DriverKit extension needs a CMIO extension, and what the data and control flow between them is.
2 replies · 0 boosts · 89 views · Jun ’25
Crash in Photos framework
Hi, this is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it. Those points make this crash very hard to understand and fix. We know that most of the crashes happen on iPhone 13 with iOS 18.x.x. We also see that a lot of cases occur when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]).
2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash
2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash
2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
2 replies · 0 boosts · 380 views · Mar ’25
Only fetch PHAssets from the current user's iCloud Library
I'm working on a photo app and I want to allow the user to display, edit, and delete photos. I can fetch all photos using PHAsset.fetchAssets(with: options), and this works as intended. However, I can't seem to find a way to prevent the user from seeing photos from a Shared Library. PHAssetSourceType only contains typeCloudShared, which covers items from a shared album, not the Shared Library. How can I filter out the iCloud Shared Library?
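One hedged avenue, untested against the Shared Library specifically: PHFetchOptions.includeAssetSourceTypes narrows a fetch by where assets originate. Whether .typeUserLibrary excludes iCloud Shared Library items is an assumption worth verifying, not a documented guarantee.

```swift
import Photos

// Hedged attempt: restrict the fetch to assets from the user's own library.
let options = PHFetchOptions()
options.includeAssetSourceTypes = [.typeUserLibrary]
let ownAssets = PHAsset.fetchAssets(with: options)
```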
2 replies · 1 boost · 341 views · 1w
iPhone Camera opening unexpectedly
My Camera app repeatedly opens even though I am not taking any action to open it when I use my iPhone. Today during a FaceTime call, the Camera app opened while the phone was unlocked, without me touching anything. It didn't end the FaceTime call, but it paused the video for the person I was speaking with. I force-closed the Camera app, and then it happened again a few minutes later. This has happened while using Google Maps and other apps as well, while the phone is unlocked. It is also happening while the phone is locked, just sitting on a table. All of a sudden I look over and the screen is active, showing the camera view. Today this has happened at least 20 times. I need to know how to stop it. I am on iOS 18.1 and enrolled in the iOS 18 Public Beta. There are no pending software updates.
2 replies · 0 boosts · 1.1k views · Nov ’24
PHPickerFilter doesn't always apply in Collections tab when using PHPickerViewController
Hi everyone, I’m running into an issue with PHPickerFilter when using PHPickerViewController. When I configure the picker with a .videos and .livePhotos filter, it seems to work correctly in the Photos tab. However, when I switch to the Collections tab, the filter doesn’t always apply: users can still see and select static image assets in certain collections (e.g. in one of the People & Pets sections). Here’s a simplified snippet of my setup:

```swift
var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.selectionLimit = 1

var filters = [PHPickerFilter]()
filters.append(.videos)
filters.append(.livePhotos)
configuration.filter = PHPickerFilter.any(of: filters)
configuration.preferredAssetRepresentationMode = .current

let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
present(picker, animated: true)
```

Expected behavior: the picker should consistently respect the filter across both the Photos and Collections tabs, only showing assets that match the filter.
Actual behavior: the filter applies correctly in the Photos tab, but in the Collections tab other asset types are still visible and selectable.
Has anyone else encountered this behavior? Is this expected or a known issue, or am I missing something in the configuration? Thanks in advance!
2 replies · 0 boosts · 312 views · 3d
Unable to Fetch Videos from Recently Deleted Album Using Photos Framework in iOS 18.3.1
Hello everyone, I’m working on an iOS app that fetches videos from the "Recently Deleted" album using the Photos framework in Swift. However, I’m unable to fetch any videos, even though the "Recently Deleted" album contains 233 items (including videos), as seen in the Photos app.
Environment:
- iOS version: 18.3.1
- Xcode version: 16.2
- Swift version: Swift 5
- Device: iPhone (simulator and physical device both tested)
- Photo library permission: "All Photos" access granted
- Recently Deleted lock: Face ID/Passcode is disabled for "Recently Deleted"
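A diagnostic sketch, not a fix: enumerate the smart albums actually exposed to the app. If no "Recently Deleted" collection shows up here, the album is simply not surfaced to third-party apps via PHAssetCollection, which would explain the empty fetch.

```swift
import Photos

// List every smart album the app can see, with its subtype raw value.
let smartAlbums = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .any, options: nil)
smartAlbums.enumerateObjects { collection, _, _ in
    print(collection.localizedTitle ?? "untitled", collection.assetCollectionSubtype.rawValue)
}
```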
2 replies · 0 boosts · 101 views · Mar ’25
Why is my YCbCr-to-sRGB camera pipeline brighter than the preview layer? (420YpCbCr8BiPlanarFullRange, AVCaptureVideoPreviewLayer)
Hi all, I'm working on a custom Metal-based video pipeline using AVCaptureVideoDataOutput, and I've run into an unexpected issue related to exposure.
Setup: I'm capturing video frames using kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. In my Metal shader, I:
- convert YCbCr (full-range Rec. 709) to linear Rec. 709 RGB,
- apply Rec. 709 → sRGB gamma encoding,
- output to .bgra8Unorm_srgb via MTKView.
Everything renders correctly in terms of color-space math, but the image appears significantly brighter (~+3 stops EV) compared to AVCaptureVideoPreviewLayer and the native iOS Camera app under the same camera exposure settings.
What I've verified:
- The color transforms are correct: YCbCr709 to RGB, then linear to sRGB.
- I'm not applying any tone mapping or aggressive look LUTs yet.
- Camera exposure is locked using device.setExposureModeCustom(duration: ..., iso: ...).
- The same EV (e.g. ISO 50, 1/125 s, f/5.6) on my iPhone appears visually 3 stops brighter than on my digital cameras (Sony/Canon etc.).
To match the look of the preview layer or Camera app, I have to simulate a ~–3 EV shift in my custom pipeline.
Questions:
- Is AVCaptureVideoPreviewLayer applying extra tone mapping, digital gain, or contrast shaping (like an OOTF)?
- Does the camera ISP expose "hotter" (i.e., with more light) internally for the preview layer than what we get in the video frame buffers?
- Is there a standard way to compensate for this ISP behavior in custom pipelines using AVCaptureVideoDataOutput?
- Can this be accounted for using metadata (e.g. exposure bias, gain, gamma curve)?
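For reference, a scalar Swift transcription of the full-range BT.709 YCbCr-to-R'G'B' step described above, using the standard coefficients (the actual pipeline does this in a Metal shader). If a pipeline matches this, the +3 EV offset likely comes from ISP or preview-layer tone mapping rather than the matrix itself.

```swift
import simd

// Full-range BT.709 YCbCr -> gamma-encoded R'G'B' with standard coefficients.
// Linearization (inverse Rec. 709 OETF) and sRGB encoding follow this step.
func rgbFromFullRangeYCbCr709(y: Float, cb: Float, cr: Float) -> SIMD3<Float> {
    let cbc = cb - 0.5   // full-range chroma is centered at 0.5
    let crc = cr - 0.5
    return SIMD3(
        y + 1.5748 * crc,                   // R'
        y - 0.1873 * cbc - 0.4681 * crc,    // G'
        y + 1.8556 * cbc                    // B'
    )
}
```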
2 replies · 0 boosts · 228 views · 1w
Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images were not loading correctly in our app. After a lot of debugging we identified the following:
- This only happens when the app is built for Mac Catalyst, not on iOS, iPadOS, or "real" macOS (AppKit).
- The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2. Those images are rendered correctly in Photos and Preview on all platforms.
- When displaying the image inside a UIImageView or in a SwiftUI Image, they render correctly. The issue only occurs when loading the image via Core Image.
- When comparing the Core Image render graphs between the AppKit (working) and Catalyst (faulty) builds, they look identical, except for the result. (Screenshots of the Mac (AppKit) and Catalyst render graphs omitted.)
Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar) and Core Image is able to load it correctly. However, one potentially loses fidelity this way. Or might there be a better workaround?
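A sketch of the described workaround, assuming the image arrives as raw file data; the function and option names are Image I/O API, but the overall flow is an illustration, not the app's actual code.

```swift
import Foundation
import ImageIO

// Re-encode the image with kCGImageDestinationOptimizeColorForSharing so
// Image I/O converts it to sRGB (or similar) before Core Image sees it.
func transcodedForSharing(_ imageData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output as CFMutableData, type, 1, nil) else { return nil }
    let options = [kCGImageDestinationOptimizeColorForSharing as String: true] as CFDictionary
    CGImageDestinationAddImageFromSource(destination, source, 0, options)
    return CGImageDestinationFinalize(destination) ? output as Data : nil
}
```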
2 replies · 3 boosts · 91 views · Aug ’25
Issues connecting a DSLR camera using the ImageCaptureCore framework
Hi Apple engineers, my app uses the ImageCaptureCore framework to connect to DSLR cameras. Before iOS 18 this method was effective, but after I upgraded my iPhone and iPad, I found my app can't connect to a DSLR camera. Opening Settings -> Privacy & Security -> Files and Folders, I can't find my app's permission entry, and I swear it worked before iOS 18. I find other developers have the same problem: https://forums.developer.apple.com/forums/thread/756960 and https://developer.apple.com/forums/thread/765768. I also found a way to reproduce this problem on iOS 18: do "Reset All Settings". Can you help me with this problem, or tell me how to use the API properly? Looking forward to your reply. Thank you very much.
2 replies · 0 boosts · 453 views · Oct ’24
Why does a triple-camera device take photos faster than a single-camera device?
I found this phenomenon, and it can be reproduced reliably. Suppose I use a triple-camera device to take a photo while the scene is moving, or while I move the phone, say horizontally. When I aim at an object and press the shutter at time T, the picture in the viewfinder is from T, and the resulting photo is from about T+100 ms. If I use a single-camera device, move the phone at the same speed, and press the shutter when aiming at the same object, the resulting photo is from about T+400 ms.
Let me describe the problem another way. Suppose a row of cards is placed horizontally on a table, labeled from left to right 0, 1, 2, 3, 4, 5, 6... Aim the camera at the number 0, then pan to the right at a uniform speed; the numbers passing through the viewfinder keep increasing. When aiming at the number 5, press the shutter. With a triple camera, the photo will probably show 6, while with a single camera it will show about 9. This means the triple camera captures photos faster, but why is this the case? Any explanation?
2 replies · 0 boosts · 488 views · Dec ’24
AVCaptureMetadataOutput .face detection not working on iOS 26 Beta with high sessionPreset
In iOS 26 (developer beta), the AVCaptureMetadataOutputObjectsDelegate no longer receives callbacks when metadataOutput.metadataObjectTypes = [.face] is set. On earlier iOS versions the issue does not occur. Interestingly, face detection works if I set the sessionPreset to .medium, but not with .high, except on the iPhone 16 Pro Max, where it works regardless.
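A minimal repro sketch of the configuration described above, assuming `session` is an AVCaptureSession with a video input already added and `self` conforms to AVCaptureMetadataOutputObjectsDelegate:

```swift
import AVFoundation

// Face metadata with a .high preset: per the report, callbacks stop arriving
// on the iOS 26 beta with .high but work with .medium.
session.sessionPreset = .high

let metadataOutput = AVCaptureMetadataOutput()
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
    metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
    // metadataObjectTypes must be set after the output joins the session.
    metadataOutput.metadataObjectTypes = [.face]
}
```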
2 replies · 1 boost · 315 views · 2w