Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic, each listed with its stats in the form replies · boosts · views · latest activity.

Unable to use writeHEIFRepresentation - failed to add image to the PhotoCompressionSession.
I'm getting an error when writing a CIImage as a HEIF image:

    // Create CIImage directly from pixel buffer
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer, options: [CIImageOption.properties: combinedMetadata])

    // Write HEIC synchronously
    do {
        try ciContext.writeHEIFRepresentation(of: ciImage, to: url, format: .RGBA8, colorSpace: colorSpace)

The error I'm getting is:

    Error Domain=CINonLocalizedDescriptionKey Code=3 "(null)" UserInfo={CINonLocalizedDescriptionKey=failed to write HEIC data to file., NSUnderlyingError=0x11b1a1ec0 {Error Domain=CINonLocalizedDescriptionKey Code=10 "(null)" UserInfo={CINonLocalizedDescriptionKey=failed to add image to the PhotoCompressionSession.}}}

Both

    try ciContext.writeJPEGRepresentation(of: copiedCIImage, to: url, colorSpace: colorSpace, options: options)

and

    try ciContext.writePNGRepresentation(of: copiedCIImage, to: url, format: .RGBA8, colorSpace: colorSpace)

work, and I verified that the code works on iOS 18. Is there something new we need to do for HEIF images? Thanks in advance.
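
There is no confirmed fix in the excerpt above, but one way to sidestep writeHEIFRepresentation while debugging is to render the CIImage to a CGImage and let Image I/O do the HEIC encoding. A minimal sketch, assuming the goal is simply to get a HEIC file on disk (the function name and error handling are illustrative):

    import CoreImage
    import ImageIO
    import UniformTypeIdentifiers

    func writeHEICViaImageIO(_ image: CIImage, context: CIContext,
                             colorSpace: CGColorSpace, to url: URL) throws {
        // Render to a CGImage first, bypassing Core Image's HEIF writer.
        guard let cgImage = context.createCGImage(image, from: image.extent,
                                                  format: .RGBA8, colorSpace: colorSpace),
              let destination = CGImageDestinationCreateWithURL(
                  url as CFURL, UTType.heic.identifier as CFString, 1, nil) else {
            throw CocoaError(.fileWriteUnknown)
        }
        // Attach the image's properties (metadata) as encoding options.
        CGImageDestinationAddImage(destination, cgImage, image.properties as CFDictionary)
        guard CGImageDestinationFinalize(destination) else {
            throw CocoaError(.fileWriteUnknown)
        }
    }
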
6 replies · 0 boosts · 382 views · 2w

After updating to iPadOS 26 beta and iOS 26 beta, AVCaptureMetadataOutput no longer detects faces on some devices.
I'm creating an app that uses AVCaptureSession to feed camera input into an AVCaptureMetadataOutput configured with [metaout setMetadataObjectTypes:@[AVMetadataObjectTypeFace]] to scan for faces. After updating to iPadOS 26 Beta 2 and iOS 26 Beta 2, the delegate method of AVCaptureMetadataOutputObjectsDelegate is no longer called on some devices. The following devices are affected: iPad (9th gen), iPad Air (4th gen), and iPhone 15. The issue does not occur on any other devices I have. I also tried running the AVFoundation sample code from the Apple Developer site (https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces) on the devices above, and the same problem occurs. Are any additional settings required on iPadOS 26 beta and iOS 26 beta, or is this a problem on the OS side?
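
For anyone trying to reproduce this, here is a minimal Swift sketch of the kind of pipeline described (names and configuration are illustrative, not taken from the poster's project):

    import AVFoundation

    final class FaceScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
        let session = AVCaptureSession()
        private let metadataOutput = AVCaptureMetadataOutput()

        func configure() throws {
            guard let device = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: device)
            guard session.canAddInput(input), session.canAddOutput(metadataOutput) else { return }
            session.addInput(input)
            session.addOutput(metadataOutput)
            // metadataObjectTypes must be set *after* the output joins the
            // session, otherwise availableMetadataObjectTypes is empty.
            metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
            metadataOutput.metadataObjectTypes = [.face]
            session.startRunning()   // call from a background queue in real code
        }

        func metadataOutput(_ output: AVCaptureMetadataOutput,
                            didOutput metadataObjects: [AVMetadataObject],
                            from connection: AVCaptureConnection) {
            print("Detected \(metadataObjects.count) face(s)")   // not firing on affected devices
        }
    }
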
1 reply · 6 boosts · 431 views · 2w

AVCaptureMetadataOutput .face detection not working on iOS 26 Beta with high sessionPreset
In iOS 26 (Developer Beta), the AVCaptureMetadataOutputObjectsDelegate no longer receives callbacks when metadataOutput.metadataObjectTypes = [.face] is set. On earlier iOS versions the issue does not occur. Interestingly, face detection works if I set the sessionPreset to .medium, but not with .high — except on the iPhone 16 Pro Max, where it works regardless.
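
A stopgap consistent with that observation, assuming the app can tolerate a lower-resolution feed while scanning (this is a sketch of a workaround, not a confirmed fix):

    import AVFoundation

    /// Temporarily drops the preset to .medium, where face callbacks
    /// reportedly still fire, and returns the previous preset so the
    /// caller can restore it once scanning is done.
    func beginFaceScanWorkaround(on session: AVCaptureSession) -> AVCaptureSession.Preset {
        let previous = session.sessionPreset
        session.beginConfiguration()
        if session.canSetSessionPreset(.medium) {
            session.sessionPreset = .medium
        }
        session.commitConfiguration()
        return previous
    }
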
2 replies · 1 boost · 315 views · 2w

Blurry Depth Data since iPhone 13
I tested the accuracy of the depth map on iPhone 12, 13, 14, 15, and 16, and found that the variance of the depth map on iPhone 13 and later is significantly greater than on iPhone 12. Enabling depth filtering causes the depth data to be affected by the previous frame, adding more unnecessary noise, especially when the phone is moving. This is not friendly for high-precision reconstruction. I tried adding depth-map smoothing in post-processing to reduce the large deviations, but the results are still poor. Has Apple announced any depth-map smoothing solutions?
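
No Apple-provided smoothing solution is mentioned here; the closest built-in control is the temporal filtering flag on the depth output. A sketch, assuming an AVFoundation capture pipeline (the helper name is ours):

    import AVFoundation

    /// Adds a depth output with temporal filtering disabled, so each
    /// frame's depth map is independent of the previous frame (at the
    /// cost of more per-frame noise for your own spatial filter to handle).
    func addUnfilteredDepthOutput(to session: AVCaptureSession,
                                  delegate: AVCaptureDepthDataOutputDelegate,
                                  queue: DispatchQueue) -> AVCaptureDepthDataOutput? {
        let depthOutput = AVCaptureDepthDataOutput()
        guard session.canAddOutput(depthOutput) else { return nil }
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = false   // no smoothing across frames
        depthOutput.setDelegate(delegate, callbackQueue: queue)
        return depthOutput
    }
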
1 reply · 0 boosts · 36 views · 2w

How can I create my own Genlock hardware for the iPhone 17 Pro?
What options do I have if I don't want to use Blackmagic's Camera ProDock as the external sync hardware, but instead want to create my own USB-C accessory that shows up as an AVExternalSyncDevice on the iPhone 17 Pro? Which protocol does my USB-C device have to implement to appear as an eligible clock device in AVExternalSyncDevice.DiscoverySession?
1 reply · 0 boosts · 787 views · 2w

Raycasting VNFaceLandmarkRegion2D
Hello, does anyone have a recipe for raycasting VNFaceLandmarkRegion2D points obtained from a frame's capturedImage? More specifically, how do I construct the "from" parameter of the frame's raycastQuery from a VNFaceLandmarkRegion2D point? Do the points need to be flipped vertically? Is there any other transformation that needs to be applied to the points before passing them to raycastQuery?
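
One plausible recipe, assuming the landmarks were computed on frame.capturedImage and ignoring interface-orientation handling (which needs extra care): Vision points use a bottom-left origin, while ARFrame.raycastQuery(from:) expects normalized image coordinates with a top-left origin, so yes, the Y axis is flipped. A sketch:

    import ARKit
    import Vision

    func raycastQuery(for region: VNFaceLandmarkRegion2D,
                      pointIndex: Int,
                      in frame: ARFrame) -> ARRaycastQuery {
        let width = CGFloat(CVPixelBufferGetWidth(frame.capturedImage))
        let height = CGFloat(CVPixelBufferGetHeight(frame.capturedImage))
        // pointsInImage(imageSize:) resolves the region's normalized points
        // into pixel coordinates with a bottom-left origin.
        let p = region.pointsInImage(imageSize: CGSize(width: width, height: height))[pointIndex]
        // Normalize and flip Y for ARKit's top-left-origin image space.
        let normalized = CGPoint(x: p.x / width, y: 1.0 - p.y / height)
        return frame.raycastQuery(from: normalized,
                                  allowing: .estimatedPlane,
                                  alignment: .any)
    }
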
4 replies · 0 boosts · 270 views · 3w

How to get a callback once a requested frameDuration change has been applied?
When changing a camera's exposure, AVFoundation provides a callback that reports the timestamp of the first frame captured with the new exposure duration: AVCaptureDevice.setExposureModeCustom(duration:iso:completionHandler:). I want a similar callback when changing the frame duration. After setting AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration to a new value, how can I compute the index or the timestamp of the first camera frame captured with the newly set frame duration?
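
AVFoundation does not appear to offer a completion handler for this, so one workaround is to infer the switch from the presentation-timestamp spacing in a video data output. A sketch, with an arbitrarily chosen tolerance:

    import AVFoundation

    final class FrameDurationWatcher: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        var targetDuration = CMTime(value: 1, timescale: 30)   // the duration you just set
        var onApplied: ((CMTime) -> Void)?                     // fired with the first matching PTS
        private var lastPTS: CMTime?

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            defer { lastPTS = pts }
            guard let last = lastPTS else { return }
            // Compare the inter-frame spacing against the requested duration.
            let delta = CMTimeSubtract(pts, last)
            let tolerance = CMTime(value: 1, timescale: 1000)  // 1 ms, illustrative
            if CMTimeAbsoluteValue(CMTimeSubtract(delta, targetDuration)) <= tolerance {
                onApplied?(pts)
                onApplied = nil   // report only the first matching frame
            }
        }
    }
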
0 replies · 0 boosts · 479 views · Aug ’25

Implementation of Audio-Video Synchronization in Swift
I have a feature requirement: switch the writer used for file writing every 5 minutes, then quickly merge the last two files. How can I ensure that the merged file is seamless and that the audio and video stay synchronized? Currently the merged video has glitches and the audio is out of sync. If anyone has expertise in this area, I would be extremely grateful for a solution.
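
A common starting point, sketched below under the assumption that both segments are complete QuickTime files: append them into an AVMutableComposition, which copies every track and preserves each segment's internal audio/video timing. Remaining glitches at the cut usually point to how the first file was finished rather than to the merge itself.

    import AVFoundation

    /// Appends the given movie files back to back into one composition.
    func mergeSegments(_ urls: [URL]) async throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        var cursor = CMTime.zero
        for url in urls {
            let asset = AVURLAsset(url: url)
            let duration = try await asset.load(.duration)
            // insertTimeRange(of:) keeps audio and video aligned within
            // the segment; only the segment boundaries need inspection.
            try composition.insertTimeRange(CMTimeRange(start: .zero, duration: duration),
                                            of: asset, at: cursor)
            cursor = CMTimeAdd(cursor, duration)
        }
        return composition
    }
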
1 reply · 0 boosts · 197 views · Aug ’25

Live Photos created with PHLivePhoto API show "Motion not available" when setting as wallpaper
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows a "Motion not available" message. Here's my approach for creating Live Photos:

    // 1. Create video with required metadata
    let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
    let contentIdentifier = AVMutableMetadataItem()
    contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
    contentIdentifier.value = assetIdentifier as NSString
    writer.metadata = [contentIdentifier]
    // Video settings: 882x1920, H.264, 30fps, 2 seconds
    // Added still-image-time metadata at the middle frame

    // 2. Create HEIC image with asset identifier
    var metadata: [String: Any] = [:]
    var makerAppleDict: [String: Any] = [:]
    makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
    metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict

    // 3. Generate Live Photo
    PHLivePhoto.request(
        withResourceFileURLs: [photoURL, videoURL],
        placeholderImage: nil,
        targetSize: .zero,
        contentMode: .aspectFit
    ) { livePhoto, info in
        // Success - Live Photo created
    }

    // 4. Save to Photos library (one creation request for both resources)
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, fileURL: photoURL, options: nil)
    creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)

What I've tried:
- Matching the exact video specifications of the Camera app (882x1920, H.264, 30fps)
- Adding all documented metadata (content identifier, still-image-time)
- Testing various video durations (1.5s, 2s, 3s)
- Different image formats (HEIC, JPEG)
- Comparing against working Live Photos with exiftool

Expected behavior: Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.

Actual behavior: The system shows "Motion not available" and only allows setting the image as a static wallpaper.

Any insights or workarounds would be greatly appreciated; this is affecting users who want to use their created content as wallpapers.

Questions:
- Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
- Is this a deliberate restriction for third-party apps, or a bug?
- Has anyone successfully created Live Photos that work as motion wallpapers?

Environment: iOS 17.0 - 18.1, Xcode 16.0, tested on iPhone 16 Pro
1 reply · 1 boost · 229 views · Aug ’25

Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images are not loading correctly in our app. After a lot of debugging we identified the following:
- It only happens when the app is built for Mac Catalyst, not on iOS, iPadOS, or "real" macOS (AppKit).
- The images in question have unusual color spaces; we observed the issue for uRGB and eciRGB v2. Those images are rendered correctly in Photos and Preview on all platforms.
- When displayed inside a UIImageView or a SwiftUI Image, the images render correctly. The issue only occurs when loading them via Core Image.
- The Core Image render graphs of the AppKit (working) and Catalyst (faulty) builds look identical, except for the result. (Screenshots of the two graphs omitted.)

Something seems to be off when Core Image loads an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O converts the image to sRGB (or similar) and Core Image is able to load it correctly. However, one potentially loses fidelity this way. Or might there be a better workaround?
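
For reference, the transcoding workaround described above might look roughly like this (the function name is ours; the option asks Image I/O to convert the image to a color space suitable for sharing, typically sRGB):

    import Foundation
    import ImageIO

    func transcodeForSharing(from sourceURL: URL, to destinationURL: URL) -> Bool {
        guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
              let type = CGImageSourceGetType(source),
              let destination = CGImageDestinationCreateWithURL(destinationURL as CFURL,
                                                                type, 1, nil) else {
            return false
        }
        // Image I/O re-encodes the image, converting the color space.
        let options = [kCGImageDestinationOptimizeColorForSharing: true] as CFDictionary
        CGImageDestinationAddImageFromSource(destination, source, 0, options)
        return CGImageDestinationFinalize(destination)
    }
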
2 replies · 3 boosts · 91 views · Aug ’25

Optimizing UICollectionView Scrolling Performance and High-Quality Image Loading with PHCachingImageManager
Hello, I'm developing an app that displays a photo library using UICollectionView and PHCachingImageManager. I'd like to achieve a user experience similar to the native iOS Photos app, where low-quality images are shown quickly while scrolling, and higher-quality images are loaded for visible cells once scrolling stops. I'm currently using the following approach:
- While scrolling: I use the UICollectionViewDataSourcePrefetching protocol, and in prefetchItemsAt I call startCachingImages with low-quality options to cache images in advance.
- After scrolling stops: in scrollViewDidEndDecelerating, I load high-quality images for the currently visible cells.

I have a few questions regarding this approach (a sketch of the pattern follows below):
- What is the best practice for managing both low-quality and high-quality images efficiently with PHCachingImageManager?
- Is it correct to call startCachingImages with fastFormat options and then call it again with highQualityFormat in scrollViewDidEndDecelerating?
- How can I minimize the delay when a low-quality image is replaced by a high-quality one?
- Are there any additional strategies to help preload high-quality images more effectively?
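
The sketch below shows one reasonable arrangement of the two tiers, with illustrative sizes and options; it is not an authoritative best practice:

    import Photos
    import UIKit

    let cachingManager = PHCachingImageManager()

    // Tier 1: cheap thumbnails cached ahead of time while scrolling.
    func prefetch(_ assets: [PHAsset], targetSize: CGSize) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat   // fast, possibly degraded
        options.resizeMode = .fast
        cachingManager.startCachingImages(for: assets, targetSize: targetSize,
                                          contentMode: .aspectFill, options: options)
    }

    // Tier 2: full-quality request for a visible cell after scrolling stops.
    func loadHighQuality(for asset: PHAsset, targetSize: CGSize,
                         into imageView: UIImageView) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat
        options.isNetworkAccessAllowed = true   // permit iCloud downloads
        cachingManager.requestImage(for: asset, targetSize: targetSize,
                                    contentMode: .aspectFill, options: options) { image, _ in
            if let image { imageView.image = image }  // replaces the fast thumbnail in place
        }
    }
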
0 replies · 1 boost · 234 views · Aug ’25

How to Keep Camera Running in iOS PiP Mode (Like WhatsApp/Google Meet)?
I'm using Picture-in-Picture (PiP) mode in my native iOS application, which is similar to Google Meet, using the VideoSDK.live native iOS SDK. The SDK has built-in support for PiP and it's working fine for the most part. However, I'm running into a problem: when the app enters PiP mode, the local camera (self-video) of the participant freezes or stops. I want to fix this and achieve the same smooth behavior seen in apps like Google Meet and WhatsApp, where the local camera keeps working even in PiP mode. I've been trying to find documentation or examples on how to achieve this but haven't had much luck. I came across a few mentions that using the camera in the background may require special entitlements from Apple (in the entitlements file), and most of the official documentation says background camera access is restricted due to Apple's privacy policies. So my questions are:
- Has anyone here successfully implemented background camera access in PiP mode on iOS?
- If yes, what permissions or entitlements are required?
- How do apps like WhatsApp and Google Meet achieve this functionality on iPhones?

Any help, advice, or pointers would be really appreciated!
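
The AVFoundation switch that governs this is sketched below; eligibility, not code, is the hard part. On iOS 16 and later, isMultitaskingCameraAccessSupported reports whether the system will keep the camera running while the app is not frontmost; apps that don't qualify typically need the com.apple.developer.avfoundation.multitasking-camera-access entitlement, which Apple grants on request:

    import AVFoundation

    func enableMultitaskingCameraIfPossible(on session: AVCaptureSession) {
        // True when the system allows this app's camera to keep running
        // in PiP/background (e.g. eligible video-call apps).
        if session.isMultitaskingCameraAccessSupported {
            session.beginConfiguration()
            session.isMultitaskingCameraAccessEnabled = true
            session.commitConfiguration()
        }
    }
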
0 replies · 0 boosts · 194 views · Aug ’25

How to reliably detect user-modified photos?
I'm developing a photo backup app. To detect photos that were added or edited since the app last launched, I keep a local dictionary in the format [localIdentifier: modification_date]. However, PHAsset.modificationDate is not reliable: it often changes unexpectedly, possibly due to system operations like iCloud metadata updates. Is there a more reliable way to detect whether a photo has been modified by the user since the last app launch? I'm considering using a content hash instead, but I'm not sure how heavy that operation is in terms of performance.
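
One alternative worth evaluating (iOS 16+) is the Photos persistent change history, which replaces modification-date diffing with a library change token. A sketch, assuming the token is persisted across launches (whether a given update was user-initiated still needs separate checking):

    import Photos

    /// Returns the local identifiers of assets inserted or updated since
    /// the given token. Persist PHPhotoLibrary.shared().currentChangeToken
    /// at each launch to obtain the next baseline.
    func changedAssetIdentifiers(since token: PHPersistentChangeToken) throws -> Set<String> {
        var changed = Set<String>()
        let changes = try PHPhotoLibrary.shared().fetchPersistentChanges(since: token)
        for change in changes {
            let details = try change.changeDetails(for: .asset)
            changed.formUnion(details.insertedLocalIdentifiers)
            changed.formUnion(details.updatedLocalIdentifiers)
        }
        return changed
    }
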
2 replies · 0 boosts · 53 views · Aug ’25

Is a Locked Capture Extension allowed to just "open the app" when the device is unlocked?
Hey, quick question. I noticed that Adobe's new app, Project Indigo, allows you to open the app using the Camera Control button. However, when your device is locked it just shows this screen (screenshot omitted). Would this normally be approved by the App Store review process? I ask because I would like to do something similar with my camera app. I know this is not the best user experience, but my app's UI is not built in Swift and I don't have the resources to rebuild it. At least this way the user experience would be improved from what it is now, where users cannot launch the app at all. I get many requests per week about this feature and would love to improve the UX for my users, even if it's not the best possible. Thanks, Alex
1 reply · 0 boosts · 271 views · Jul ’25

Does CMIO support hiding the built-in camera?
Hi guys, can I use CMIO to achieve the following on macOS when a USB device (camera/mic/speaker) is connected:
- When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (camera/mic/speaker).
- When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (camera/mic/speaker).
2 replies · 0 boosts · 133 views · Jul ’25

AVCaptureSession startRunning is slow
AVCaptureSession's startRunning method blocks the calling thread and seems to be slow. What is this method doing behind the scenes? For context: I'm working on simulator camera support, and I have a 'fake' AVCaptureDevice that might be causing this. My hypothesis is that AVCaptureSession tries to connect to the device and waits for a notification to be posted back. I'd love to find a way to let my fake device tell AVCaptureSession that it's connected.
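
This doesn't explain what startRunning does internally, but the standard mitigation is worth restating: the call is documented as blocking, so dispatch it to a dedicated serial queue to keep the main thread responsive. A sketch:

    import AVFoundation

    let sessionQueue = DispatchQueue(label: "capture.session.queue")

    func start(_ session: AVCaptureSession) {
        sessionQueue.async {
            if !session.isRunning {
                session.startRunning()   // blocks this queue, not the main thread
            }
        }
    }
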
3 replies · 0 boosts · 215 views · Jul ’25

I cannot acquire the entitlement named com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning.
AVCaptureVideoDataOutput.preparesCellularRadioForNetworkConnection requires the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement, but I cannot acquire it: I can't find it under 'Certificates, Identifiers & Profiles'. Any solutions? The error is: Provisioning profile "iOS Team Provisioning Profile: ......" doesn't include the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement.
2 replies · 0 boosts · 497 views · Jul ’25