Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.


How to get the actual distance of the depth map subject from the TrueDepth camera
I was able to obtain the depth map image using AVCapturePhotoOutput from the delegate method

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: (any Error)?)

I convert the depth map to the kCVPixelFormatType_DepthFloat32 format and read the pixel values of the depth map using the code below:

func convertDepthData(depthMap: CVPixelBuffer) -> [[Float32]] {
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    var convertedDepthMap: [[Float32]] = Array(
        repeating: Array(repeating: 0, count: width),
        count: height
    )
    CVPixelBufferLockBaseAddress(depthMap, CVPixelBufferLockFlags(rawValue: 2))
    // Note: this indexing assumes bytes-per-row == width * 4; the row stride
    // from CVPixelBufferGetBytesPerRow can be larger due to padding.
    let floatBuffer = unsafeBitCast(
        CVPixelBufferGetBaseAddress(depthMap),
        to: UnsafeMutablePointer<Float32>.self
    )
    for row in 0 ..< height {
        for col in 0 ..< width {
            if floatBuffer[width * row + col].isFinite {
                convertedDepthMap[row][col] = floatBuffer[width * row + col]
            }
        }
    }
    CVPixelBufferUnlockBaseAddress(depthMap, CVPixelBufferLockFlags(rawValue: 2))
    return convertedDepthMap
}

Is this the right way of accessing the float values from a depth map, and what is their unit? Sometimes the depth values are around 0.7 even when I keep the device close to the subject, around 15 to 30 cm away.
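For reference, a minimal sketch of a stride-safe lookup, assuming a kCVPixelFormatType_DepthFloat32 buffer. On the unit question: DepthFloat32 depth values are in meters, while the disparity formats (hdis) hold 1/meters, so it is worth checking depthData.depthDataType before interpreting the numbers.

import AVFoundation

// Sketch: sample one depth value, respecting the buffer's row stride.
// Assumes a kCVPixelFormatType_DepthFloat32 buffer; values are then meters
// (a disparity buffer would hold 1/meters instead).
func depthValue(in depthMap: CVPixelBuffer, row: Int, col: Int) -> Float32? {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    let value = base.advanced(by: row * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)[col]
    return value.isFinite ? value : nil
}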
Replies: 1 · Boosts: 0 · Views: 53 · Activity: 22h
Depth map is always in hdis format instead of hdep. Unable to capture a depth map in kCVPixelFormatType_DepthFloat format even after setting the activeDepthDataFormat of the AVCaptureDevice
I'm trying to capture a depth map image using the TrueDepth camera on an iPhone 15 Plus. I was able to set up the AVCaptureSession with an AVCaptureDeviceInput for builtInTrueDepthCamera and an AVCapturePhotoOutput with isDepthDataDeliveryEnabled set to true. I also manually set the activeDepthDataFormat of the AVCaptureDevice to kCVPixelFormatType_DepthFloat16 or kCVPixelFormatType_DepthFloat32. Finally, I enabled isDepthDataDeliveryEnabled, embedsDepthDataInPhoto, embedsPortraitEffectsMatteInPhoto and embedsSemanticSegmentationMattesInPhoto in AVCapturePhotoSettings before capturing the photo with capturePhoto(with: photoSettings, delegate: self).

I checked by manually printing the activeDepthDataFormat of the AVCaptureDevice. Before setting it, the default is

Optional('dpth'/'hdis' 640x 480, { 2- 30 fps}, photo dims:{}, fov:73.699, system exposure bias range:-2.0-2.0)

After forcing it to kCVPixelFormatType_DepthFloat16 or kCVPixelFormatType_DepthFloat32, the format is

Optional('dpth'/'hdep' 160x 120, { 2- 30 fps}, photo dims:{}, fov:73.699, system exposure bias range:-2.0-2.0)

But when I receive the captured photo in

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: (any Error)?)

the depth map is

Optional(hdis 640x480 (high/abs) calibration:{intrinsicMatrix: [2723.07 0.00 2016.00 | 0.00 2723.07 1512.00 | 0.00 0.00 1.00], extrinsicMatrix: [1.00 0.00 0.00 0.00 | 0.00 1.00 0.00 0.00 | 0.00 0.00 1.00 0.00] pixelSize:0.001 mm, distortionCenter:{2016.00,1512.00}, ref:{4032x3024}})

Here it shows hdis instead of hdep. Why is it capturing a disparity map instead of a true depth map? The depth quality is high and the depth data accuracy is absolute. Here is my code:

import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    @IBOutlet weak var previewView: UIView!
    @IBOutlet weak var resultLbl: UILabel!

    private var session = AVCaptureSession()
    private var captureDevice: AVCaptureDevice?
    private var inputDevice: AVCaptureDeviceInput?
    private var photoOutput: AVCapturePhotoOutput?
    private var photoSettings: AVCapturePhotoSettings?
    private var cameraPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.setupCaptureSession()
    }

    func setupCaptureSession() {
        captureDevice = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .unspecified)
        guard let captureDevice else {
            print("ERROR::UNABLE TO SET TRUE DEPTH CAMERA ")
            return
        }
        session.beginConfiguration()
        do {
            inputDevice = try AVCaptureDeviceInput(device: captureDevice)
            guard let inputDevice else {
                print("ERROR: UNABLE TO SET UP INPUT DEVICE")
                return
            }
            if session.canAddInput(inputDevice) {
                session.addInput(inputDevice)
            }
        } catch {
            print(error)
        }
        photoOutput = AVCapturePhotoOutput()
        guard let photoOutput else {
            print("ERROR: UNABLE TO SET UP PHOTO OUTPUT")
            return
        }
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
        }
        session.sessionPreset = .photo
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        print("IS DEPTH ENABLED:: \(photoOutput.isDepthDataDeliveryEnabled)")
        session.commitConfiguration()

        let availableFormats = captureDevice.activeFormat.supportedDepthDataFormats
        let depthFormat = availableFormats.filter { format in
            let pixelFormatType = CMFormatDescriptionGetMediaSubType(format.formatDescription)
            return (pixelFormatType == kCVPixelFormatType_DepthFloat16 ||
                    pixelFormatType == kCVPixelFormatType_DepthFloat32)
        }.first
        session.beginConfiguration()
        try! captureDevice.lockForConfiguration()
        captureDevice.activeDepthDataFormat = depthFormat
        captureDevice.unlockForConfiguration()
        session.commitConfiguration()
        self.setupPreviewLayer()
    }

    func setupPreviewLayer() {
        cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
        cameraPreviewLayer?.videoGravity = .resizeAspectFill
        if let cameraPreviewLayer {
            self.previewView.layer.addSublayer(cameraPreviewLayer)
            cameraPreviewLayer.frame = self.previewView.bounds
        }
        DispatchQueue.global(qos: .userInteractive).async {
            self.session.startRunning()
        }
    }

    @IBAction func captureBtnPressed(_ sender: Any) {
        photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        guard let photoSettings else {
            print("ERROR: UNABLE TO SETUP PHOTO SETTINGS")
            return
        }
        guard let photoOutput else {
            print("ERROR: UNABLE TO SET UP PHOTO OUTPUT")
            return
        }
        photoSettings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        photoSettings.embedsDepthDataInPhoto = true
        photoSettings.embedsPortraitEffectsMatteInPhoto = true
        photoSettings.embedsSemanticSegmentationMattesInPhoto = true
        photoOutput.capturePhoto(with: photoSettings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: (any Error)?) {
        print(photo.depthData)
        switch photo.depthData?.depthDataQuality {
        case .low: print("Depth quality is low")
        case .high: print("Depth quality is high")
        case nil: print("Depth quality is nil")
        }
        switch photo.depthData?.depthDataAccuracy {
        case .relative: print("Depth accuracy is relative")
        case .absolute: print("Depth accuracy is absolute")
        case nil: print("Depth accuracy is nil")
        }
        if let imageData = photo.fileDataRepresentation() {
            if let image = UIImage(data: imageData) {
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        }
    }
}
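For what it's worth, a sketch of the workaround I would try in the delegate, assuming the photo's depth data really does arrive as disparity regardless of activeDepthDataFormat: AVDepthData can be converted to a depth type on receipt, so the values become meters rather than 1/meters.

// Sketch (delegate fragment): normalize whatever arrives (hdis or hdep)
// to metric depth before using it.
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: (any Error)?) {
    guard let depthData = photo.depthData else { return }
    let depth = depthData.depthDataType == kCVPixelFormatType_DepthFloat32
        ? depthData
        : depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    // depth.depthDataMap now holds depth in meters.
}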
Replies: 0 · Boosts: 0 · Views: 44 · Activity: 1d
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}

I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
Replies: 0 · Boosts: 0 · Views: 85 · Activity: 1d
Need information on the minimum focus distance of the different cameras in each iPhone model
Our app uses the camera to scan barcodes or QR codes, with a working distance of about 5 cm. However, we've noticed variations in the focus distance of camera lenses across different iPhone models. Currently, we mainly use two types of lenses: wide-angle and ultra-wide-angle.
• For iPhone 13 and earlier models, we use the wide-angle lens.
• For iPhone 13 Pro and later models, we use the ultra-wide-angle lens.
We are not certain this setup is correct, since we don't have all iPhone models to test.
Some users have reported focus issues on the iPhone 15. We would like to ask if there's a resource where we can find the minimum focus distance of the different cameras in each iPhone model, to verify whether our current configuration is accurate. Alternatively, if such data is not readily available, could the Apple team advise which camera should be used on the various iPhone models for scenarios with a working distance of approximately 5 cm? Thank you!
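As a side note, a hedged sketch of a runtime alternative to hardcoding per model: since iOS 15, AVCaptureDevice exposes minimumFocusDistance (in millimeters), so the lens could be chosen by capability instead of by device list. The 50 mm threshold below is only an illustration for a ~5 cm working distance.

import AVFoundation

// Sketch: pick the back camera that can focus at roughly 5 cm.
func closeFocusCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera, .builtInUltraWideCamera],
        mediaType: .video,
        position: .back
    )
    let candidates = discovery.devices.filter { $0.minimumFocusDistance != -1 } // -1 means unknown
    // Prefer the lens with the shortest close-focus capability;
    // 50 mm is an illustrative cutoff, not a recommended value.
    return candidates
        .filter { $0.minimumFocusDistance <= 50 }
        .min { $0.minimumFocusDistance < $1.minimumFocusDistance }
}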
Replies: 1 · Boosts: 0 · Views: 78 · Activity: 2d
How can I use the iPhone TrueDepth front camera to detect whether a captured face depth map is from a real 3D face or a spoofed 2D image
I'm trying to implement anti-spoofing in an iOS app using the iPhone TrueDepth front camera. I have checked related questions but still can't find a proper working solution.

I trained a Core ML model using 22000 depth images of human faces and 22000 depth images of non-faces (objects, food etc.). The accuracy of the model is very low. When testing with flat 2D images shown on a smartphone screen, I found that I get a depth map even for flat 2D images. Even though the image is flat, how does it produce a depth map for the person shown in the flat 2D picture, so that the model thinks it is a real face instead of a spoofed one? I implemented depth capture by following this documentation, and I made sure that I get a depth map instead of a disparity map: https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_photos_with_depth

My next approach was to use the NCNN framework to implement anti-spoofing with the model used in the Mini-vision Android anti-spoofing sample. I rewrote their library for iOS using the Objective-C++ wrapper for C++, as the sample was only available as an Android app. I tested it by feeding an 80x80 UIImage in an OpenCV matrix format, and its accuracy is lower than the Android one. How can I solve this problem?
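For context, one illustrative pre-filter I considered (an assumption on my side, not a vetted liveness method): a face replayed on a flat screen should produce a near-planar depth map, so the spread of metric depth values over the face region can act as a cheap screen-replay check before the classifier runs. The threshold below is a made-up value that would need tuning.

// Sketch: crude flatness check over depth samples (meters) from the face region.
// `threshold` is illustrative only.
func looksPlanar(_ depths: [Float32], threshold: Float32 = 0.01) -> Bool {
    guard depths.count > 1 else { return true }
    let mean = depths.reduce(0, +) / Float32(depths.count)
    let variance = depths.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Float32(depths.count)
    return variance.squareRoot() < threshold   // tiny depth spread, likely a flat image
}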
Replies: 0 · Boosts: 0 · Views: 105 · Activity: 2d
Captured photos in wrong orientation
I'm building a custom camera screen that displays the camera image on a preview layer and then captures an image, using AVCaptureSession. When the picture is captured, I immediately load it into a UIImageView in order to display it to the user for approval. I've actually done this many times before, but this is the first time I've tried to do it in an app that supports interface rotation. If I hold the phone in Portrait mode and capture a picture, everything works as expected. When the user rotates the phone into Landscape orientation, I detect this and I replace the preview layer (AVCaptureVideoPreviewLayer) with a new one, specifying connection.videoRotationAngle in order to make the image appear in the right orientation. I'm a little surprised that this is necessary, and it's not a smooth transition, but that doesn't matter. What does matter is that when I capture the image, it is in the wrong orientation. I tried rotating it myself, but this doesn't seem to make any difference. What am I doing wrong?
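For reference, a sketch of the iOS 17+ approach I would compare against, assuming a session with device, previewLayer, and photoOutput already set up: AVCaptureDevice.RotationCoordinator tracks the device orientation, and its capture angle can be applied to the photo connection just before capturing, instead of replacing the preview layer.

import AVFoundation

// Sketch (iOS 17+): keep the captured photo upright without swapping layers.
final class CaptureRotation {
    private let coordinator: AVCaptureDevice.RotationCoordinator

    init(device: AVCaptureDevice, previewLayer: AVCaptureVideoPreviewLayer) {
        // The coordinator also exposes videoRotationAngleForHorizonLevelPreview
        // (KVO-observable) for rotating the preview itself.
        coordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: previewLayer)
    }

    func capture(with photoOutput: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
        let angle = coordinator.videoRotationAngleForHorizonLevelCapture
        if let connection = photoOutput.connection(with: .video),
           connection.isVideoRotationAngleSupported(angle) {
            connection.videoRotationAngle = angle
        }
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: delegate)
    }
}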
Replies: 2 · Boosts: 0 · Views: 86 · Activity: 2d
LockedCameraCaptureManager sessionContentUpdates sometimes is not called
Within my app, I have:

for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {

It seems that the first time my app opens from LockedCameraCapture (after enabling camera permissions etc.) this update is never delivered, and the user will not see their capture (.added or .initial). If I then take another picture/video through my LockedCameraCapture control, it takes the video and opens the app as before, but this time sessionContentUpdates is called twice, once for the first video and once for the second video! After that it doesn't seem to occur again and all works perfectly!

My device is: iPhone 16 Pro Max, iOS 18.2 developer beta. Has anyone experienced this?
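For comparison, a minimal sketch of where I'd expect the iteration to live (the .initial/.added cases are the ones mentioned above; the app structure here is hypothetical): start consuming sessionContentUpdates as early as possible in the app's lifetime, so updates delivered right after a LockedCameraCapture launch aren't missed.

import SwiftUI
import LockedCameraCapture

@main
struct CaptureApp: App {   // hypothetical entry point; ContentView is a placeholder
    var body: some Scene {
        WindowGroup {
            ContentView()
                .task {
                    do {
                        // Begin listening immediately, before any camera UI work.
                        for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {
                            print("session content update:", update)
                        }
                    } catch {
                        print("sessionContentUpdates failed:", error)
                    }
                }
        }
    }
}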
Replies: 0 · Boosts: 0 · Views: 75 · Activity: 2d
How to change PHLivePhoto EXIF metadata
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.

To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo: I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that, when the frame type is photo, returns the frame's image with its properties set to the updated properties, then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full-size image again, the properties are the original properties. If you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!

let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!

var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
Replies: 0 · Boosts: 0 · Views: 79 · Activity: 3d
IOSurface with System Extensions
Hi All, I'm working on a camera system extension where the main app is supposed to transfer a video stream to the camera extension using IOSurface memory sharing. I have built a sample app that contains all the logic, but without a camera extension. So I'm essentially using IOSurface to render a video in one SwiftUI view and show the result in another SwiftUI view, just for testing purposes, and everything works fine so far.

Now, when moving the receiver code to the camera extension, I'm having problems accessing the IOSurface via its ID. I am sharing the IOSurface ID via UserDefaults, and I know from the logs that the ID is transferred correctly. Here is the code that uses IOSurfaceLookup to get the IOSurface. It fails with the given message; the error message prints the surface ID, which is the correct one (I know this from the main app, where I get the ID and print it as well).

private var surfaceId: Int = -1 {
    didSet {
        logger.info("surfaceId has changed")
        if surfaceId == -1 {
            stopReceivingFrames()
            ioSurface = nil
        } else {
            guard let surface = IOSurfaceLookup(IOSurfaceID(surfaceId)) else {
                logger.error("failed to lookup IOSurface with ID: \(self.surfaceId)")
                return
            }
            self.ioSurface = surface
            logger.info("surface set, now starting receiving frames")
            startReceivingFrames()
        }
    }
}

My gut feeling says this issue might be related to a missing entitlement or sandboxing. In general, I have a working camera extension; I'm just not able to render a video in the main app and send it over to the camera extension to overlay another webcam. Both the main app and the camera extension are in the same Xcode workspace and share the same App Group.

In short, my actual questions are:
Is there any entitlement required for using IOSurface between an app and a camera system extension?
Is using IOSurface actually possible in system extensions?
Is there any specific setting/requirement that I need to handle to make this work?
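For reference, a sketch of the alternative hand-off I would try (an assumption on my side, not a confirmed fix): instead of publishing the raw IOSurfaceID, wrap the surface in an XPC object with IOSurfaceCreateXPCObject and unwrap it in the extension with IOSurfaceLookupFromXPCObject, since a global ID lookup can fail across sandbox boundaries while an XPC hand-off carries the right to use the surface.

import IOSurface
import XPC

// Sender (main app): wrap the surface for transport over an XPC connection.
func xpcPayload(for surface: IOSurfaceRef) -> xpc_object_t {
    IOSurfaceCreateXPCObject(surface)
}

// Receiver (camera extension): recover the surface from the XPC object.
func surface(from payload: xpc_object_t) -> IOSurfaceRef? {
    IOSurfaceLookupFromXPCObject(payload)
}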
Replies: 0 · Boosts: 0 · Views: 83 · Activity: 3d
Crash when presenting Camera via Web View in iOS 18.2 Beta - WebCore::AVVideoCaptureSource::create
We are experiencing thousands of crashes in our application when attempting to present the camera through a web view. The app crashes during this process, and the crash logs point to WebCore::AVVideoCaptureSource::create and WebCore::RealtimeMediaSourceCenter::getUserMediaDevices. This issue has only been observed in iOS 18.2 beta versions (beta 1 - 22C5109p, beta 2 - 22C5125e, beta 3 - 22C5131e). In iOS versions below 18.2 the functionality works, and we haven't identified any correlation with specific device models. The problem seems to stem from a change in the WebCore framework introduced in the 18.2 betas. We kindly request a review and fix for this issue in upcoming beta releases to restore functionality. Let us know if there are any workarounds or adjustments we can implement in the interim. Thank you for your attention to this matter.
Replies: 2 · Boosts: 0 · Views: 159 · Activity: 4d
App randomly terminated due to Capture Application Requirements Unmet
Hi. I encounter some random crashes of my camera app. After some investigation, I found that it's terminated by the system; a crash log is generated, but the information in it is not very useful. Here is the log found via the Console app.

Termination & Crash log

"Camera not actively used; AVCaptureEventInteraction not installed":

Received termination request from [osservice<com.apple.SpringBoard>:10931] on <RBSProcessPredicate <RBSProcessInstancePredicate| [app<com.juniperphoton.PhotonCam]>> with context <RBSTerminateContext| explanation:Capture Application Requirements Unmet: "Camera not actively used; AVCaptureEventInteraction not installed" reportType:CrashLog maxTerminationResistance:Interactive>

The crash log exported from the device has some common characteristics:

It's an EXC_CRASH (SIGKILL) type with no termination reason.

Exception Type: EXC_CRASH (SIGKILL)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: RUNNINGBOARD 0

It's triggered by the main thread, but the thread seems to be waiting for an event to process.

Triggered by Thread: 0

Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0   libsystem_kernel.dylib   0x1ee165788 mach_msg2_trap + 8
1   libsystem_kernel.dylib   0x1ee168e98 mach_msg2_internal + 80
2   libsystem_kernel.dylib   0x1ee168db0 mach_msg_overwrite + 424
3   libsystem_kernel.dylib   0x1ee168bfc mach_msg + 24
4   CoreFoundation           0x19cbe47f4 __CFRunLoopServiceMachPort + 160
5   CoreFoundation           0x19cbe3ea0 __CFRunLoopRun + 1212
6   CoreFoundation           0x19cc36274 CFRunLoopRunSpecific + 588
7   GraphicsServices         0x1e9d6d4c0 GSEventRunModal + 164
8   UIKitCore                0x19f783480 -[UIApplication _run] + 816
9   UIKitCore                0x19f3a9410 UIApplicationMain + 340
10  UIKitCore                0x19fae4bb0 0x19f394000 + 7670704
11  PhotonCam                0x1002e7e3c 0x1002cc000 + 114236
12  dyld                     0x1c2d5ade8 start + 2724

There is an address size fault on the main thread:

Thread 0 crashed with ARM Thread State (64-bit):
...
far: 0x0000000000000000 esr: 0x56000080 Address size fault

I once tried to reproduce this issue with the app attached to the debugger, and it says: Terminated due to signal 9.

When the crash or termination happened, the app was in the following state:
• No AVCaptureSession is running.
• The app is in the foreground and users are interacting with functions like viewing or editing photos in the app. (When users exit the camera view, for example entering the gallery or settings, the camera session is stopped.)
• Both TestFlight and Debug builds have the same issue.
• No 3rd-party crash reporter is installed (I deliberately disable it in Debug and TestFlight builds).
• It has adopted LockedCameraCapture, but currently it's running as the main app target (if not, my app would show an unlock button, so I can confirm this).
Also, regarding memory consumption, there is no JetsamEvent around the crash time.

Device and app information

Additionally, some information about the tech stack and the current state of my device and my app:
• iPhone 16 Pro with iOS 18.2 Beta 3.
• The app is a camera-based app (it's PhotonCam and you can find it on the App Store); its main functionality is the camera feature, built with AVFoundation + Core Image + Metal.
• It has adopted the Camera Control, AVCaptureEventInteraction and LockedCameraCapture features.
• If I remember right, it also occurs in the iOS 18.1 release build, but currently I have no such device to confirm. In iOS 17.x the issue never happened.

Regarding this termination, the first thing that comes to mind is the "watchdog" mechanism that terminates processes running under the LockedCameraCapture feature. However, I can confirm that the app is currently running as the main target from the home screen. Has anybody encountered this kind of issue and found a solution? Thanks in advance.
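For completeness, a sketch of what the termination message seems to ask for (my reading of it, not a confirmed fix): installing an AVCaptureEventInteraction on the camera view, so the system sees one present while the app is being treated as a capture app.

import UIKit
import AVKit

// Sketch (iOS 17.2+): install the hardware-capture-button interaction.
func installCaptureEventInteraction(on view: UIView) {
    let interaction = AVCaptureEventInteraction { event in
        if event.phase == .ended {
            // Trigger photo/video capture here.
        }
    }
    interaction.isEnabled = true
    view.addInteraction(interaction)
}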
Replies: 1 · Boosts: 0 · Views: 163 · Activity: 4d
Live Photo not applying as a live wallpaper on iOS 17+
Hi fellow iOS developers! 👋 I've written Swift code that converts a video (from a URL) into a Live Photo after downloading it. The conversion process seems fine, but when I try to set the generated Live Photo as a wallpaper on iOS 17+, it shows the message "Motion not Available." Has anyone else experienced this issue, or do you know why this might be happening? Could it be related to changes in Live Photo handling in iOS 17, or to the generated file structure? Any help or suggestions would be greatly appreciated! 🙏
Replies: 0 · Boosts: 0 · Views: 61 · Activity: 5d
EXIF creation date of ICCameraFile always nil?
I am using ImageCaptureCore to access and (sometimes) download media files from a digital camera connected via USB (either to a Mac or to an iOS device with the Apple Lightning to USB 3 camera adapter). This works very well in general, but what puzzles me is that the ICCameraFile's EXIF creation/modification date is always nil. I can access the ICCameraItem's creation/modification date instead, which, as it says in the documentation, "usually [is] the same as its EXIF creation date", but, well, not always. Generally the EXIF tags are more reliable than the file dates; the modification date especially is easily messed up when copying files. And since my cameras show the stable EXIF date on their displays, for consistency I would prefer to use the same date in my app. Is there a way to get it without downloading the image from the camera and reading it from the file? Does it possibly depend on the brand of camera (I mostly have Canon) whether ICCameraFile.exifCreationDate is ever populated or always nil? For a thumb drive with a DCIM folder, which is treated just like a camera, it is also nil.
Replies: 2 · Boosts: 0 · Views: 180 · Activity: 1w
Replace MWPhotoBrowser with modern alternative
I have an iPad app, written in Objective-C and distributed through an Enterprise developer account, as it is not for public use but specific to some large companies. The app has a local database and works offline. For some functions of the app I need to display images (not edit or cut them, just display them).

Right now it integrates the MWPhotoBrowser viewer, which has not been maintained for almost 10 years, so in addition to compilation warnings I have to fight some historical bugs, especially on high-resolution images. https://github.com/mwaterfall/MWPhotoBrowser

Do you know of a modern and maintained OFFLINE photo viewer? I'm evaluating both free and paid options (maybe an SDK). My needs are very basic. I have found https://github.com/TimOliver/TOCropViewController, but I would need to disable the photo-editing features, and above all I would lose the useful feature of displaying multiple images (MWPhotoBrowser showed a gallery for multiple images).
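One built-in candidate that might cover these basics, sketched here with the caveat that I haven't verified it against your exact requirements: QuickLook's QLPreviewController works fully offline, supports swiping through multiple local images like a gallery, is view-only, and is callable from Objective-C as well (shown in Swift for brevity).

import UIKit
import QuickLook

// Sketch: a swipeable, view-only gallery of local image files.
final class GalleryDataSource: NSObject, QLPreviewControllerDataSource {
    let urls: [URL]   // local file URLs of the images to display
    init(urls: [URL]) { self.urls = urls }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        urls.count
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        urls[index] as NSURL   // NSURL conforms to QLPreviewItem
    }
}

// Usage from a view controller:
// let dataSource = GalleryDataSource(urls: imageURLs)
// let preview = QLPreviewController()
// preview.dataSource = dataSource
// present(preview, animated: true)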
Replies: 0 · Boosts: 0 · Views: 117 · Activity: 1w
Darkish line on Photos app. Not a hardware issue.
Hey, there's this darkish line on my iPhone and iPad when I open the Photos app. It scared the ding dong out of me the first time I saw it, but then I realized it was a software issue when it disappeared as I swiped up to close the app. It's really weird because it's extremely faint, but I can't seem to catch it in screenshots. I know for a fact this is a software issue because it doesn't show up in any other apps. It also changes from horizontal to vertical depending on how I turn my iPhone. Can everyone please just check your own iPhone or iPad to make sure I'm not the only one? I'm on the 18.2 developer beta, by the way. Thanks!
Replies: 1 · Boosts: 0 · Views: 82 · Activity: 1w
Cannot assign AVCaptureDevice to SCNMaterialProperty.contents
I want to apply an SCNTechnique pipeline to the camera feed. To achieve this, I want to bring the camera input into the SceneKit world. The perfect API seems to be:

let captureDevice = …
scnScene.background.contents = captureDevice

This is demonstrated in "SceneKit: What's New" (WWDC17) (at 44m19s) and is mentioned in the documentation of SCNMaterialProperty's contents. Instead of showing the camera feed, it crashes with these messages:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureVideoDataOutput setVideoSettings:] Unsupported pixel format type - use -availableVideoCVPixelFormatTypes'
*** First throw call stack:
(0x18993c7cc <REDACTED> 0x211e18488)
libc++abi: terminating due to uncaught exception of type NSException

Please advise.

STEPS TO REPRODUCE
1. Create a new Xcode project, starting from the SceneKit game template.
2. Add an Info.plist entry for NSCameraUsageDescription.
3. Add a capture device property to GameViewController:

class GameViewController: UIViewController {
    let captureDevice = AVCaptureDevice.default(for: .video)

4. Set the background contents:

scene.background.contents = captureDevice

5. Run the app on device.

PLATFORM AND VERSION
iOS
Development environment: Xcode 16.1, macOS 15.0.1.
Run-time configuration: iOS 18.1
Replies: 1 · Boosts: 0 · Views: 177 · Activity: 2w
CIImage property of UIImage is always nil
I'm trying to apply a Core Image filter to a UIImage, and for that I want to get the CIImage representation of the UIImage. I'm trying to obtain it as shown below:

if let inputImage = self.orginalImageView.image {
    if let ciImage = CIImage(image: inputImage) {
        print(ciImage)
        print(self.orginalImageView.image?.ciImage)
    }
}

This method works. But one thing I noticed is that UIImage already has a ciImage property, and it is always nil. According to the documentation:

ciImage
The underlying Core Image data.
var ciImage: CIImage? { get }
Discussion
If the UIImage object was initialized using a CGImage, the value of the property is nil.

Is the image property of my UIImageView backed by a CGImage, so that the ciImage property is nil?
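A small sketch illustrating the documented behavior: ciImage is only non-nil when the UIImage was created from a CIImage; images loaded from files, asset catalogs, or an image view are typically CGImage-backed. (The asset name below is hypothetical.)

import UIKit
import CoreImage

// CGImage-backed image: ciImage is nil, cgImage is non-nil.
let fromAsset = UIImage(named: "photo")      // hypothetical asset name
print(fromAsset?.ciImage as Any)             // nil
print(fromAsset?.cgImage as Any)             // non-nil

// CIImage-backed image: ciImage is non-nil.
let ci = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 10, height: 10))
let fromCI = UIImage(ciImage: ci)
print(fromCI.ciImage as Any)                 // non-nil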
Replies: 1 · Boosts: 0 · Views: 134 · Activity: 2w
Different metadata values depending on how the image metadata is obtained (PHAsset vs. PHPickerResult)
While customizing an image picker, we found that metadata is not reflected correctly, so we are reporting it. The situation is as follows:

1. The time or time zone of an image is changed in the Photos app, e.g. changing the time zone of an image with an actual capture date of 2024:11:08 08:27:44 → 2024:11:07 17:27:44.
2. Image data is extracted from the PHAsset using PHImageManager.
3. The metadata is obtained from this image data. The time zone information exposed in the Exif tag information does not reflect the time or time zone changed in the Photos app.

let asset: PHAsset = ...
....
let options = PHImageRequestOptions()
options.isSynchronous = true
options.version = .current
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.normalizedCropRect = .zero
options.isNetworkAccessAllowed = true
options.progressHandler = { progress, error, _, _ in }

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { imageData, uti, orientation, info in
    let cgImageSource = CGImageSourceCreateWithData(imageData! as CFData, nil)
    let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource!, 0, nil) as? Dictionary<String, Any>
    let exif = properties!["{Exif}"]
    let dictionary = exif as? Dictionary<String, Any>
}

Metadata Check

In this case, the change is reflected in the creationDate of the PHAsset, so it can be somewhat compensated for by forcibly replacing the metadata. However, because PHAsset does not include time zone information, when the time zone is changed as well, it's impossible to calculate the correct time according to that time zone.

PHPicker

This issue is resolved when using the PHPickerResult provided by PHPicker:

extension PhotosPickerViewController: PHPickerViewControllerDelegate {
    public func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        .....
        for result in results {
            let identifier = UTType.image.identifier
            if result.itemProvider.hasItemConformingToTypeIdentifier(identifier) {
                result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
                    guard let data = data,
                          let cgImageSource = CGImageSourceCreateWithData(data as CFData, nil),
                          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? Dictionary<String, Any>,
                          let exif = properties["{Exif}"],
                          let dictionary = exif as? Dictionary<String, Any>
                    else { return }
                }
            }
        }
    }
}

Metadata Check

Question

I wonder why this happens, and whether this is normal behavior. If I use a custom picker instead of the system picker that Apple provides, is there any way to compensate for this situation?
Replies: 0 · Boosts: 0 · Views: 222 · Activity: 2w
Is it possible to retrieve EXIF metadata from PHAsset without downloading photos (even if offloaded to iCloud Photo Library)?
The iOS (official) Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, F-number) even when photos are offloaded to iCloud and the device is not connected to the internet (e.g. airplane mode). However, with the Photos framework, we need to download photos to retrieve that metadata (which means it will not work in airplane mode). I tried the following methods, but none of them worked when photos were offloaded to iCloud and the device was in airplane mode:

1. Requesting data with PHImageManager.default().requestImageDataAndOrientation.
Result: it does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat.

2. Converting PHAsset#localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...). (I am aware that AssetsLibrary.framework is deprecated, but this was just a test.)
Result: if PHImageManager does not return Data, ALAsset#defaultRepresentation().metadata() returns an empty NSDictionary.
Replies: 0 · Boosts: 1 · Views: 275 · Activity: 2w
Issues with capturing bracketed photos using iPhone 16 Pro
I am experiencing a bug when using an AVCapturePhotoBracketSettings object to capture a bracketed photo sequence on iPhone 16 Pro. Specifically, when I pass in an array of exposure values [-x, 0, +x] where x >= 3, the high-exposure photo capture returns a black image.

STEPS TO REPRODUCE
1. Run the sample app I have provided on an iPhone 16 Pro.
2. Notice that bracketed images captured with the eV set to [-3,0,+3], [-4,0,+4], or [-5,0,+5] return a black image for the high-exposure photo.
3. Notice that on other iOS devices (like iPhone 13 Pro), the high-exposure photo is returned with high brightness, as expected.

I have also added two folders in the sample project that show screenshots of the bug: iPhone13Pro & iPhone16Pro.

Sample Project: https://www.icloud.com/iclouddrive/090O_68Z0Nh2UOxmPRwu56Tmw#Focused16ProBracketedCaptureBug
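For context, a minimal sketch of how such a bracket is typically configured (my reconstruction, not the sample project's exact code). One thing worth checking on affected hardware is the device's minExposureTargetBias/maxExposureTargetBias, since requested biases outside that range are clamped.

import AVFoundation

// Sketch: request a [-3, 0, +3] eV bracket; photoOutput and delegate are assumed.
func captureBracket(photoOutput: AVCapturePhotoOutput,
                    delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-3, 0, 3]
    let bracketed = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,   // 0 = no RAW, processed frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketed
    )
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}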
Replies: 1 · Boosts: 0 · Views: 201 · Activity: 2w