Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic


Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images are not loading correctly in our app. After a lot of debugging we identified the following:

- This only happens when the app is built for Mac Catalyst, not on iOS, iPadOS, or "real" macOS (AppKit).
- The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2.
- Those images are rendered correctly in Photos and Preview on all platforms.
- When displaying the image inside a UIImageView or a SwiftUI Image, they render correctly. The issue only occurs when loading the image via Core Image.
- When comparing the Core Image render graphs between the AppKit (working) and Catalyst (faulty) builds, they look identical, except for the result. (Screenshots of the Mac (AppKit) and Catalyst render graphs omitted.)

Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar) and Core Image is able to load the image correctly. However, one potentially loses fidelity this way. Or might there be a better workaround?
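For reference, a minimal sketch of that workaround, assuming the source is a file URL and that JPEG is an acceptable output container (both assumptions on my part):

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Sketch of the workaround: transcode via Image I/O so the color space is
// converted to something sRGB-like that Core Image on Catalyst can handle.
func transcodeForSharing(_ sourceURL: URL) -> Data? {
    guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil) else { return nil }
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(
        output as CFMutableData, UTType.jpeg.identifier as CFString, 1, nil) else { return nil }
    // Ask Image I/O to convert the image to a color space suitable for sharing.
    let options: [CFString: Any] = [kCGImageDestinationOptimizeColorForSharing: true]
    CGImageDestinationAddImageFromSource(destination, source, 0, options as CFDictionary)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}
```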
Replies: 1 · Boosts: 2 · Views: 11 · Activity: 2h
PHPickerResult returns different data for the same media
On some devices, when I select the same media multiple times, the data returned by `loadFileRepresentation(forTypeIdentifier:completionHandler:)` is different (`data.count` is not equal).

Environment:
- Model: iPhone 12
- Model Number: MGGM3CH/A
- iOS Version: 18.3.2

```swift
import PhotosUI

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            if let data = try? Data(contentsOf: url) {
                print("data count is: \(data.count)")
            }
        }
    }
}
```

P.S. I also tried some other methods, e.g. `provider.loadItem(forTypeIdentifier:)`, but they did not work either.
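One plausible cause (an assumption, not a confirmed diagnosis) is that the picker transcodes the asset on each request, so each delivery can differ by a few bytes. Requesting the original stored representation should avoid the transcode; a minimal sketch of the configuration:

```swift
import PhotosUI

// Sketch: ask for the asset's current (original) representation so the picker
// delivers the stored file instead of a freshly transcoded copy.
var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.filter = .videos
configuration.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: configuration)
```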
Replies: 1 · Boosts: 0 · Views: 7 · Activity: 2d
PHPickerResult takes a long time to load a large video
On some devices, `loadFileRepresentation(forTypeIdentifier:completionHandler:)` takes a long time (about two minutes) to call back with a result for some large videos (about 200 MB, taken with the device camera).

Environment:
- Model: iPhone 12
- Model Number: MGGM3CH/A
- iOS Version: 18.3.2

API: `PHPickerResult.itemProvider.loadFileRepresentation()`

```swift
import PhotosUI

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            // Do some stuff...
        }
    }
}
```

P.S. I also tried some other methods, e.g. `provider.loadItem(forTypeIdentifier:)`, but they did not work either.
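If the time is going into a transcode before delivery, two things may help (both hedged suggestions, not confirmed fixes): setting `preferredAssetRepresentationMode = .current` on the PHPickerConfiguration, as sketched under the previous post, and observing the Progress object that `loadFileRepresentation` returns to see where the time is spent:

```swift
import PhotosUI

// Sketch: `provider` is the NSItemProvider from didFinishPicking, as above.
// loadFileRepresentation returns a Progress; observing it shows whether the
// two minutes are an export/transcode running before the callback fires.
let progress = provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
    guard let url = url else { return }
    // Do some stuff...
}
let observation = progress.observe(\.fractionCompleted, options: [.new]) { progress, _ in
    print("load progress: \(progress.fractionCompleted)")
}
// Keep `observation` alive for as long as the load is running.
```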
Replies: 1 · Boosts: 0 · Views: 18 · Activity: 2d
DockKit custom tracking does not work on iOS 18.3
Hi all, I'm using the Apple sample code below to create an application using DockKit: "Controlling a DockKit accessory using your camera app" (https://developer.apple.com/documentation/dockkit/controlling-a-dockkit-accessory-using-your-camera-app?changes=_8). I used Vision hand recognition and passed the observation data to dockAccessory.track, but Belkin and Insta360 devices never move on an iPhone 16 Pro Max with iOS 18.3. If I use other functions like face search (system tracking) in the app, those work OK. I used a Belkin device and an Insta360 Flow 2 Pro to reproduce the problem. A friend also says that the custom tracking feature was working fine on the iOS 18 beta, but on the recent iOS 18.3 that feature does not work. If I could get the iOS 18.0 beta we could test that feature, but I cannot revert my iOS from 18.3 to the iOS 18.0 beta. Regards, TO
Replies: 0 · Boosts: 0 · Views: 21 · Activity: 6d
When getting a PHPickerResult from the user selecting media in the Photos app, how do I check the file extension?
When I get results from `picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult])` and I load the image using `itemProvider.loadFileRepresentation` (the `itemProvider` is the `NSItemProvider` provided by the `PHPickerResult`), is the URL returned by this method guaranteed to have a file extension, i.e. "file://image.jpeg" rather than "file://image"? I want to know if I can just check the extension to determine its file type. (FYI, in case this makes a difference, I'm only interested in user screenshots and screen recordings.)
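Rather than depending on the extension of the returned URL, one can inspect the provider's registered type identifiers and test UTType conformance; a sketch (the function name is mine):

```swift
import PhotosUI
import UniformTypeIdentifiers

// Sketch: determine the media type from the provider's registered identifiers
// instead of parsing a file extension off the delivered URL.
func describeTypes(of result: PHPickerResult) {
    for identifier in result.itemProvider.registeredTypeIdentifiers {
        guard let type = UTType(identifier) else { continue }
        if type.conforms(to: .image) {
            print("\(identifier) is an image; preferred extension: \(type.preferredFilenameExtension ?? "none")")
        } else if type.conforms(to: .movie) {
            print("\(identifier) is a movie")
        }
    }
}
```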
Replies: 6 · Boosts: 0 · Views: 33 · Activity: 6d
Unable to Fetch Videos from Recently Deleted Album Using Photos Framework in iOS 18.3.1
Hello everyone, I'm working on an iOS app that fetches videos from the "Recently Deleted" album using the Photos framework in Swift. However, I'm unable to fetch any videos, even though the "Recently Deleted" album contains 233 items (including videos), as seen in the Photos app.

Environment:
- iOS Version: 18.3.1
- Xcode Version: 16.2
- Swift Version: Swift 5
- Device: iPhone (simulator and physical device both tested)
- Photo Library Permission: "All Photos" access granted
- Recently Deleted Lock: Face ID/Passcode is disabled for "Recently Deleted"
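As far as I can tell, PhotoKit does not expose the "Recently Deleted" smart album to third-party apps at all, which would explain the empty fetch. A quick way to verify which smart albums the app can actually see (a sketch):

```swift
import Photos

// Sketch: enumerate the smart albums PhotoKit exposes to this app. If
// "Recently Deleted" is absent here, it is not reachable via PHAssetCollection.
let smartAlbums = PHAssetCollection.fetchAssetCollections(
    with: .smartAlbum, subtype: .any, options: nil)
smartAlbums.enumerateObjects { collection, _, _ in
    print(collection.localizedTitle ?? "untitled",
          "subtype:", collection.assetCollectionSubtype.rawValue)
}
```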
Replies: 2 · Boosts: 0 · Views: 36 · Activity: 2w
Moving photos to a shared library programmatically
Hello everyone, I am looking for a way to import photos into the Photos library on macOS programmatically, e.g. using AppleScript, and also push them to the shared library, like it can be done using the standard GUI of the Photos application. Maybe it is not possible using AppleScript, but perhaps with a short Swift script and PhotoKit; I do not know. Any help is appreciated! Thomas
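For the import half, a minimal PhotoKit sketch (the push to a Shared Library does not appear to have a public PhotoKit API that I know of, so only the import is covered; the function name is mine):

```swift
import Photos

// Sketch: import an image file into the Photos library with PhotoKit.
func importPhoto(at url: URL) {
    PHPhotoLibrary.shared().performChanges({
        _ = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: url)
    }) { success, error in
        print("imported:", success, error?.localizedDescription ?? "none")
    }
}
```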
Replies: 5 · Boosts: 0 · Views: 72 · Activity: 2w
DockKit gimbal reported yaw drifts by upwards of 45 degrees after running for a while
This is an issue with the Insta360 Flow Pro 2. My iOS app uses DockKit to control the gimbal; in particular, my app disables tracking and sends angular velocity commands to control the gimbal's orientation. I only try to modify the yaw (rotation around the vertical axis), never the pitch or roll. Note that I don't send the gimbal to a particular orientation directly; I modify the velocity.

Everything works great for a long period of time: typically for a continuous run of 4-6 hours; in the most recent case, I managed about 36 hours of continuous operation before the following problem occurred. I came back to check on the system, and because no visual activity had occurred in the camera's field of view for a while, the phone had commanded the gimbal to rotate back to a yaw angle of 0 degrees. So the phone in the gimbal should have been looking straight ahead (i.e. the 0-degree yaw position), but it was definitely looking off at an angle.

I've seen this twice now. The first time, when it should have been looking straight ahead, it was in fact looking 60 degrees off center. This time (caught on video, see below), it was off by 22 degrees from center. Here's the weird part: the gimbal reports this way-off-center positioning as zero degrees (or close enough to zero, like 0.2, which is fine). But, mechanically, the gimbal still knows where zero degrees is: if we double-click the trigger of the Flow Pro 2, which is supposed to reset the gimbal to 0 degrees yaw and pitch, the gimbal responds correctly and reorients to the 0-degree position. However, the yaw values it reports are not zero but, as shown in my video, about 22 degrees off axis.

Power cycling the gimbal and restarting immediately fixes the problem. Also, I switched from my app to the Insta360 app, which caused the phone to flip from landscape to portrait; when I returned to my app and switched back to landscape, the gimbal started reporting correct yaw angles again.

Is there a possibility this is a bug in the DockKit framework? Has anyone seen this? I have a case open with Insta360, but although it's clearly a software issue, it's not clear whether it's in Insta360's code or the DockKit layer. Any ideas for how I can get out of this mode? My concern is that the phone is on a tripod about 10' off the floor and not very accessible. Also, if all goes well, we may have about 50 of these systems running, and having to fix them one by one after a few hours is not good.

For a demonstration of this bug, see the following video: https://octoparry.com/offset.MOV

Any help greatly appreciated.
Replies: 2 · Boosts: 0 · Views: 283 · Activity: 2w
Creating a Live Photo throws Optional(Error Domain=PHPhotosErrorDomain Code=-1 "(null)")
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds). I am sure they are correct.

Two permissions have been added in Xcode:
- Privacy - Photo Library Usage Description
- Privacy - Photo Library Additions Usage Description

Simulator: iPhone 16, iOS 18.3

The code in ContentView.swift:

```swift
private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        // xxxxx
    } else {
        // xxxxx
    }
}
```

Really need help, thanks.
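One frequent cause of this error (an assumption about this particular case, but a well-known pairing requirement) is that the JPEG and the MOV do not share a Live Photo content identifier: the video needs a com.apple.quicktime.content.identifier metadata entry matching an identifier embedded in the photo's Apple maker note. A hedged sketch for stamping the video; the helper name and output handling are mine:

```swift
import AVFoundation

// Sketch: write a content identifier into the paired video's QuickTime
// metadata. The still image must carry the same identifier for the pair
// to be accepted as a Live Photo.
func stampContentIdentifier(_ identifier: String, videoURL: URL, outputURL: URL) {
    let asset = AVURLAsset(url: videoURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetPassthrough) else { return }
    let item = AVMutableMetadataItem()
    item.key = "com.apple.quicktime.content.identifier" as NSString
    item.keySpace = .quickTimeMetadata
    item.value = identifier as NSString
    export.metadata = [item]
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        print("export status: \(export.status.rawValue)")
    }
}
```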
Replies: 1 · Boosts: 0 · Views: 132 · Activity: 2w
Capturing & Processing ProRAW 48MP images is slow
Hey, I have a camera app that captures a ProRAW photo and then runs a few Core Image filters before saving it to the device as a HEIC. However I'm finding that capturing at 48MP is rather slow. Testing a minimal pipeline on an iPhone 16 Pro:

- Shutter press => file received in output: 1.2 ~ 1.6s
- CIRawFilter created using the photo file representation, then rendered to context without any filters: 0.8s ~ 1s
- Saving to device: ~0.15s

Is this the expected time for capture and processing? The native camera app seems to save the images within half a second. I'm using QualityPrioritization.balanced and the highest resolution available, which is 48MP.

Would using the CIRawFilter with the pixelBuffer from the photo output be faster? I tried it but couldn't get it to output an image.

Are there any other things I could try to speed this up? Is it possible to capture at 24MP instead?

Thanks, Alex
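On the 24MP question, the available sizes can be read off the active format; a sketch that lists them and picks a smaller target (whether a 24MP entry exists depends on the device and format, so treat this as exploratory rather than a guaranteed option):

```swift
import AVFoundation

// Sketch: list the photo dimensions the active format supports and, if one
// is offered, select a smaller option than the 48MP maximum.
func chooseSmallerPhotoDimensions(device: AVCaptureDevice, output: AVCapturePhotoOutput) {
    let supported = device.activeFormat.supportedMaxPhotoDimensions
    for dimensions in supported {
        print("supported: \(dimensions.width) x \(dimensions.height)")
    }
    // Pick the smallest option of at least ~24MP, if present (an assumption).
    let candidates = supported.filter { Int($0.width) * Int($0.height) >= 24_000_000 }
    if let smallest = candidates.min(by: {
        Int($0.width) * Int($0.height) < Int($1.width) * Int($1.height)
    }) {
        output.maxPhotoDimensions = smallest
    }
}
```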
Replies: 1 · Boosts: 0 · Views: 272 · Activity: 2w
Prevent iOS from Switching Between Back Camera Lenses in getUserMedia (Safari/WebView, iOS 18)
I'm developing a hybrid app (WebView / Turbo Native) that uses getUserMedia to access the back camera for a PPG/heart-rate measurement feature (the user places their finger on the camera).

Problem: Even when I specify constraints like:

```javascript
{
  video: {
    deviceId: '...',
    facingMode: { exact: 'environment' },
    advanced: [{ zoom: 1.0 }]
  },
  audio: false
}
```

on iPhone 15 (iOS 18), iOS unexpectedly switches between the wide, ultra-wide, and telephoto lenses during the measurement. This breaks the heart-rate detection, and it forces the user to move their finger in the middle of the measurement.

Question: Is there any way, via getUserMedia/WebRTC, to force iOS to use only the wide-angle lens and prevent automatic lens switching? I know that with AVFoundation (Swift) you can pick .builtInWideAngleCamera, but I'm hoping to avoid building a custom native layer and would prefer to stick with WebView/JavaScript if possible to save time and complexity.

Any suggestions, workarounds, or updates from Apple would be greatly appreciated! Thanks a lot!
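For completeness, the native fallback mentioned above is small; a sketch of pinning capture to the wide-angle module with AVFoundation, in case the WebRTC route stays blocked:

```swift
import AVFoundation

// Sketch: select only the wide-angle camera so the system cannot switch
// lenses, unlike the virtual multi-camera devices.
func makeWideAngleSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back) else {
        throw NSError(domain: "Camera", code: 1)
    }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    return session
}
```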
Replies: 1 · Boosts: 0 · Views: 276 · Activity: 2w
iPhone 16 Camera Control and AVCaptureSlider – Is there a way to detect which slider is active?
I am following the Apple sample code and trying to add a manual focus lens position slider:

```swift
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
```

There are these AVCaptureSessionControlsDelegate methods:

```swift
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
```

When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user through the app UI. Is there a way to see if self.cameraControlFocusSlider is active or being used? Please note that I will have more than one AVCaptureSlider in the final code.
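There does not appear to be a documented per-control "active" flag; the delegate callbacks above are session-level. One workable approach (a sketch under that assumption) is to mirror the device's KVO-observable lensPosition into the slider, so the displayed value stays correct whether autofocus or the user moved it:

```swift
import AVFoundation

// Sketch: keep the slider in sync with the device's actual lens position.
var lensPositionObservation: NSKeyValueObservation?

func mirrorLensPosition(from device: AVCaptureDevice, into slider: AVCaptureSlider) {
    lensPositionObservation = device.observe(\.lensPosition, options: [.new]) { _, change in
        guard let position = change.newValue else { return }
        slider.value = position  // assumes setting `value` here is safe; verify threading
    }
}
```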
Replies: 0 · Boosts: 0 · Views: 310 · Activity: 3w
Crash in Photos framework
Hi, this is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it. Those points make this crash very hard to understand and fix. We know that most of the crashes are happening on iPhone 13 with iOS 18.x.x. We also see that a lot of cases happen when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]).

Attached crash reports:
- 2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash
- 2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash
- 2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
Replies: 2 · Boosts: 0 · Views: 330 · Activity: 3w
Picture in Picture (AVPictureInPictureController) window goes black when the project is built with Xcode 16 or later
I found that when my app is built with Xcode 16 or later and I open the Picture-in-Picture floating window and then open the system camera, the content in the floating window is not displayed; the window shows a black screen. This does not happen when building with Xcode 15.4. It is the same code, and I do not know why.
Replies: 2 · Boosts: 0 · Views: 472 · Activity: Feb ’25
Accessing AV External Storage
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via USB-C or Lightning connector) on an iPad/iPhone?

I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad Gen 10 (18.3.1), and both return false:

```swift
// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)
```

The following code returns nil when I try to access the external storage discovery session:

```swift
// returns nil on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
```

The following returns false, without displaying a permission dialog:

```swift
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
```

What types of iOS devices are supported by AVExternalStorageDeviceDiscoverySession? What situations has it been used for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)? Is there sample code for using the AVExternalStorage API?
Replies: 0 · Boosts: 0 · Views: 333 · Activity: Feb ’25
Distortion-corrected images
Hi, I am currently developing a 3D reconstruction project, which requires images to be distortion-free (rectilinear) and with known intrinsics. The session I am developing on uses a builtInDualWideCamera, with isGeometricDistortionCorrectionEnabled set to false (to be able to get the intrinsic matrix of the images), isVirtualDeviceConstituentPhotoDeliveryEnabled set to true and isAutoVirtualDeviceFusionEnabled set to false (to get both images), and isCameraCalibrationDataDeliveryEnabled set to true (to actually get the calibration data). The distortion correction parameters such as lensDistortionLookupTable are used: the 42-coefficient mapping array is applied as described in the AVCameraCalibrationData header file, with simple piecewise linear interpolation.

There are two questions I would like to get support on:

1. A way to store the calibration parameters in each image. I have an approach that sets the parameters in the kCGImagePropertyExifDictionary -> "UserComment" field. Is there a better approach to write calibration parameter data into the images? This feels a bit dirty, and there might be a neater approach.

2. For the ultra-wide camera's images, the lensDistortionLookupTable contains several zeros at the end of the array. For example (last 10 elements are zero):

"LensDistortionLookupTable": "0.000000000000000, 0.000349554029526, 0.001385628827848, 0.003071037586778, ..., 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000, 0.000000000000000"

The problem comes when the complete array (including the zeros) is used to correct the image: the end result is warped like a circle close to the edges, which is completely wrong. In contrast, if the lensDistortionLookupTable is used without the trailing zeros and the new size is accommodated, the image looks better (although not as rectilinear as an image from the iPhone's camera app), but definitely less distorted. (Screenshots comparing the full array and the array with trailing zeros excluded omitted.)

Am I missing an important point in the usage of the lensDistortionLookupTable where this case (zeros at the end) is addressed? What is the criterion to shrink/exclude elements of the array? Any advice is very much welcome.
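For reference, a minimal sketch of the piecewise linear interpolation mentioned above (following the approach described in the AVCameraCalibrationData header; the names are mine): given a point's distance from the distortion center normalized to [0, 1], interpolate its magnification from the table:

```swift
// Sketch: interpolate a radial magnification factor from the lookup table.
// `radius` is the point's distance from the distortion center divided by
// the maximum radius, so it lies in [0, 1].
func magnification(for radius: Float, table: [Float]) -> Float {
    guard table.count > 1 else { return table.first ?? 0 }
    let scaled = radius * Float(table.count - 1)
    let index = min(Int(scaled), table.count - 2)
    let fraction = scaled - Float(index)
    return table[index] * (1 - fraction) + table[index + 1] * fraction
}
```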
Replies: 0 · Boosts: 0 · Views: 334 · Activity: Feb ’25