Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic


Format of 14-bit RAW Bayer data from a lower-bit camera sensor?
I'm working on an application that uses the iPhone camera for scientific purposes, and as a result I would like to receive the sensor data in as unprocessed a format as possible. I'm using AVCapturePhotoOutput to take Bayer RAW stills and receiving data in the kCVPixelFormatType_14Bayer_RGGB format.

However, I'm puzzled as to the content of the bits. I simply demosaic the image by taking each 2x2 square (RG / GB) and using R, (G+G)/2, B to get 16-bit RGB values, and this indeed works. However, I am puzzled by the values we are getting, as they seem to be approximately in the range 2048-16383. The top value is understandable: it is the maximum that fits in 14 bits (as implied by the pixel format type). But we can't seem to get lower than ~2048 no matter how black/dark we make the sensor.

I'm aware that the sensor is probably not 14-bit (we're using the iPhone 16e camera) and that this may have to do with the way the sensor data is packaged. The Advances in iOS Photography video (https://developer.apple.com/videos/play/wwdc2016/501/) describes it as "10-bit sensor RAW packaged in 14 bits per pixel instead of eight."

Is there any documentation describing what is going on here? It's vital for our use case that we get as close to the raw camera sensor light readings as possible, so any pointers as to the mapping being used (e.g. decompanding?) would be extremely useful. Many thanks in advance for your help.
Replies: 3 · Boosts: 0 · Views: 130 · Last activity: May ’25
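For context, a minimal sketch of the 2x2 demosaic described in the post, assuming each RGGB photosite arrives as one little-endian UInt16 (which is what the 14-bits-in-16 packing implies); this is illustrative only and says nothing about the black-level mapping in question:

```swift
import CoreVideo

func demosaicRGGB(_ pixelBuffer: CVPixelBuffer) -> [(r: UInt16, g: UInt16, b: UInt16)] {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return [] }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)

    var out: [(r: UInt16, g: UInt16, b: UInt16)] = []
    out.reserveCapacity((width / 2) * (height / 2))

    for y in stride(from: 0, to: height - 1, by: 2) {
        let row0 = base.advanced(by: y * rowBytes).assumingMemoryBound(to: UInt16.self)
        let row1 = base.advanced(by: (y + 1) * rowBytes).assumingMemoryBound(to: UInt16.self)
        for x in stride(from: 0, to: width - 1, by: 2) {
            let r = row0[x]            // R
            let g0 = row0[x + 1]       // G on the red row
            let g1 = row1[x]           // G on the blue row
            let b = row1[x + 1]        // B
            // Average the two greens; the observed floor near 2048 is presumably
            // the sensor's black-level offset, which this sketch does not subtract.
            out.append((r: r, g: UInt16((UInt32(g0) + UInt32(g1)) / 2), b: b))
        }
    }
    return out
}
```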
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

```swift
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

// edit them...

try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
```

I also support editing the still photo in Live Photos. To do this, you create a PHLivePhotoEditingContext, set the frameProcessor block (which gives you a CIImage that I edit when frame.type is .photo), then create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between those two images.

How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
Replies: 1 · Boosts: 0 · Views: 726 · Last activity: Nov ’24
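The Live Photo editing flow the post describes looks roughly like the sketch below; the HDR-preserving step is exactly what is in question, so this only shows where the still frame is intercepted. `applyEdit` is a hypothetical placeholder for the app's own filter chain:

```swift
import Photos
import PhotosUI
import CoreImage

// Hypothetical stand-in for the app's real edit.
func applyEdit(to image: CIImage) -> CIImage {
    image.applyingFilter("CIPhotoEffectChrome")
}

func editLivePhoto(input: PHContentEditingInput,
                   completion: @escaping (PHContentEditingOutput?) -> Void) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else {
        completion(nil)
        return
    }
    context.frameProcessor = { frame, _ in
        // Called for every frame; the still arrives with frame.type == .photo.
        frame.type == .photo ? applyEdit(to: frame.image) : frame.image
    }
    let output = PHContentEditingOutput(contentEditingInput: input)
    // (Setting output.adjustmentData is omitted for brevity.)
    context.saveLivePhoto(to: output, options: nil) { success, _ in
        completion(success ? output : nil)
    }
}
```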
Failed to launch Photo Editing Extension from Mac Catalyst app
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst, so you can edit photos in the Photos app on your Mac. This has worked really well, but now I am encountering an error alert when trying to open the photo editing extension:

```
RBSLaunchRequest error trying to launch plugin com.company.TestEditor.TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): ErrorDomain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {ErrorDomain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}
```

Steps to reproduce:

1. Create a new iOS app project in Xcode
2. Create a new target and choose iOS > Photo Editing Extension
3. For both targets in the project, add Mac Catalyst as a supported destination
4. Run the app on My Mac (Mac Catalyst)
5. Open the Photos app, double click a photo, click Edit, click the more-plugins button, and click TestPhotoEditor in the list

Environment: macOS 15.4.1, Xcode 16.3
Replies: 1 · Boosts: 0 · Views: 144 · Last activity: May ’25
Customized IOKit extension does not work
Hi guys,

How can I achieve the following on macOS when a USB device (camera/mic/speaker) is connected?

- When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (camera/mic/speaker).
- When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (camera/mic/speaker).

I want to use an IOKit extension to hide or ignore the built-in camera to realize this requirement. However, the extension can't be loaded due to invalid permissions on macOS 15.4.1 (build 24E263). I also tried running it on macOS 14.4.1, where it can be loaded, but it won't load automatically after restarting the laptop because the KDK version doesn't match.

Could you please give me some suggestions? Is it possible to hide the built-in camera on M-series Macs? Is there any other method to realize this feature? Thanks a lot.
Replies: 7 · Boosts: 0 · Views: 251 · Last activity: Jun ’25
Use Vision framework to detect a graph in Swift
I would like to offer functionality where the user aims the camera at a graph (including axes and scales), the app detects the graph, and the app replicates it from the image.

I have the whole camera setup finished with an AVCaptureSession, VNDetectContoursRequest, VNImageRequestHandler, etc. However, now I get many, many results, so I guess I need to tell the image processing step what I am looking for, i.e. filter the VNContoursObservations.

I think I first need to detect two perpendicular lines (the two axes). How do I do that? If I do not see them, I can just ignore that input and wait for the next VNContoursObservation.

Once I have found the axes of the graph, I will need to find the curve (the graph) that I need to scan. Any tips on how I can find that curve and turn it into a bunch of coordinates?

Thanks!
Wouter
Replies: 1 · Boosts: 0 · Views: 631 · Last activity: Oct ’24
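One way to start filtering the observations, sketched under the assumption that the axes are the longest nearly straight contours in the frame: approximate each contour as a polygon, keep only long segments, then look for two segments whose angles differ by roughly 90 degrees:

```swift
import Foundation
import Vision
import simd

struct AxisSegment {
    let start: SIMD2<Float>
    let end: SIMD2<Float>
    let length: Float
    let angle: Float
}

func axisCandidates(in pixelBuffer: CVPixelBuffer) throws -> [AxisSegment] {
    let request = VNDetectContoursRequest()
    request.contrastAdjustment = 2.0

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    var segments: [AxisSegment] = []
    guard let observation = request.results?.first else { return segments }

    for contour in observation.topLevelContours {
        // Polygon approximation collapses straight runs into single segments.
        let points = try contour.polygonApproximation(epsilon: 0.01).normalizedPoints
        guard points.count > 1 else { continue }
        for i in 1..<points.count {
            let d = points[i] - points[i - 1]
            let length = simd_length(d)
            // Axes should span a large fraction of the (normalized) frame.
            if length > 0.3 {
                segments.append(AxisSegment(start: points[i - 1], end: points[i],
                                            length: length, angle: atan2(d.y, d.x)))
            }
        }
    }
    // Candidates sorted longest-first; pick two with ~90° between their angles.
    return segments.sorted { $0.length > $1.length }
}
```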
Message from com.apple.photos.backend (PhotoKit) in log
Hello,

Our application backs up the user's photos to a back end. When retrieving the asset data from the photo library, we set the isNetworkAccessAllowed flag to true to get assets that might be optimized in iCloud.

In the application logs, we can see the message below, shown as coming from com.apple.photos.backend (PhotoKit):

```
Missing prefetched properties for PHAssetAdjustmentProperties on <PHAsset: 0x160b1ec00> BCF5688F-F7A7-4196-AFC7-A84E8BD95F3E/L0/001 mediaType=1/0, sourceType=1, (5601x3734), creationDate=2022-01-24 23:36:05 +0000, location=0, hidden=0, favorite=0, adjusted=0. Fetching on demand on the main queue, which may degrade performance.
```

In particular, the message says "Fetching on demand on the main queue", but I'm not sure whether that means PhotoKit will fetch on the main queue or that our application is requesting the data on the main queue. Could anyone clarify? Thanks.
Replies: 1 · Boosts: 0 · Views: 741 · Last activity: Dec ’24
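For reference, the usual request pattern looks like the sketch below (assuming PHImageManager is used). Issuing the request off the main queue is a reasonable first step, though the message may also be emitted by PhotoKit's own internal fetching:

```swift
import Photos

func requestOriginalData(for asset: PHAsset,
                         completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // fetch iCloud-optimized originals
    options.deliveryMode = .highQualityFormat

    // Issue the request off the main queue so any synchronous on-demand
    // property fetching does not land on the main thread.
    DispatchQueue.global(qos: .utility).async {
        PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                                options: options) { data, _, _, _ in
            completion(data)
        }
    }
}
```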
Does CMIO support hiding the built-in camera?
Hi guys,

Can I use CMIO to achieve the following on macOS when a USB device (camera/mic/speaker) is connected?

- When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (camera/mic/speaker).
- When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (camera/mic/speaker).
Replies: 2 · Boosts: 0 · Views: 133 · Last activity: Jul ’25
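As far as public API goes, CMIO does not appear to offer a supported way to hide the built-in camera from *other* apps; within your own process you can at least enumerate devices and prefer the external one. A sketch using AVFoundation's discovery session (the `.external` device type requires macOS 14+):

```swift
import AVFoundation

func preferredCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external, .builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    // Prefer a connected USB/external camera, falling back to the built-in one.
    return discovery.devices.first { $0.deviceType == .external }
        ?? discovery.devices.first
}
```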
Camera Calibration Data
To get calibration data during video recording, I use AVCaptureDepthDataOutput together with AVCaptureVideoDataOutput, synchronized via AVCaptureDataOutputSynchronizer (using the dataOutputSynchronizer method of CameraController in the example).

The issue is that AVCaptureDepthDataOutput can only be used with .builtInLiDARDepthCamera, meaning it isn't available for devices without LiDAR.

Is it possible to obtain calibration data during video recording on devices without LiDAR, such as with .builtInWideAngleCamera? Can lensDistortionLookupTable and lensDistortionCenter be used to undistort the sampleBuffer I receive from AVCaptureVideoDataOutput?
Replies: 2 · Boosts: 0 · Views: 603 · Last activity: Nov ’24
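A partial answer can be sketched: even without LiDAR, per-frame camera intrinsics (though not the lens-distortion tables, which come from AVCameraCalibrationData) can be delivered as a sample-buffer attachment once enabled on the video connection. A sketch, assuming the active format supports delivery:

```swift
import AVFoundation
import CoreMedia
import simd

func enableIntrinsicsDelivery(on output: AVCaptureVideoDataOutput) {
    if let connection = output.connection(with: .video),
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

func intrinsicMatrix(from sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let attachment = CMGetAttachment(
            sampleBuffer,
            key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
            attachmentModeOut: nil) as? Data,
          attachment.count >= MemoryLayout<matrix_float3x3>.size else { return nil }
    return attachment.withUnsafeBytes { $0.loadUnaligned(as: matrix_float3x3.self) }
}
```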
Can you add pictures with the camera using the new Photos picker instead of the old UIViewController?
I'm a new app developer and am trying to add a button that adds pictures from the photo library AND the camera. I added the first function (adding pictures from the photo library) using the new-ish PhotosPicker, but I can't find a way to do the same thing for the camera. Should I just tough it out and use the UIViewController approach that I've seen in all of the YouTube tutorials I've come across? I also want the user to be able to crop the picture in the app after they take it. Thanks in advance.
Replies: 1 · Boosts: 0 · Views: 599 · Last activity: Dec ’24
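PHPickerViewController/PhotosPicker only reads the library, so for capture the common answer is still to wrap UIImagePickerController. A minimal sketch (note that allowsEditing gives the system's square crop UI, not a fully custom cropper):

```swift
import SwiftUI
import UIKit

struct CameraPicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera        // capture instead of library browsing
        picker.allowsEditing = true        // system-provided crop step
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate,
                             UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            parent.image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
            picker.dismiss(animated: true)
        }
    }
}
```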
Photo album widget won't work
For a while I had one photo widget (no special app, just the standard Apple one), and it was set to shuffle through an album of pics of my bf. No problems at all. A few weeks later I added one to shuffle through an album of pics of my cat, and that one worked fine, but it made the one of my bf stop working: it just showed a blank white widget, no error message or anything. So I removed the one of my cat hoping the one of my bf would go back to working, and it didn't. Otherwise I only have the widgets for Find My, my bank, and then apps on my home screen.
Replies: 1 · Boosts: 0 · Views: 632 · Last activity: Dec ’24
WorldCaptureKit
Does this library exist in Xcode 16.4? `import WorldCaptureKit` gives the error "No such module 'WorldCaptureKit'", and I do not find any information about the library in the Apple documentation. But AI keeps suggesting that I use the library.
Replies: 2 · Boosts: 0 · Views: 109 · Last activity: Jun ’25
Custom Image Filters
I'm building a camera app using SwiftUI and UIKit (with UIViewControllerRepresentable). My app can already capture photos, but I also want to implement an important feature: applying my own custom image filter to the live preview in the camera and to the image when it is saved to the photo library (like Photographic Styles in the default Apple camera app).

My image filter must be fairly advanced, because I'm a photographer and I am trying to achieve the same colors I get with my custom preset in Lightroom. I want to control image parameters such as the basics (exposure, contrast, shadows, etc.), tone curves for each channel (red, green, and blue separately), HSL (for red, orange, yellow, green, blue, aqua, purple, and magenta), color grading, and more.

Currently I'm struggling with the implementation. I tried to create a custom image filter using Metal (it works for saturation), but I'm not sure it is the best approach. I need help and recommendations on how developers implement something this complex in their apps (which technologies I should use, etc.).
Replies: 2 · Boosts: 0 · Views: 781 · Last activity: Dec ’24
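One common starting point, sketched with Core Image built-ins (not the only valid architecture): chain filters for the basic controls and a luminance tone curve, and drop down to a custom CIKernel/Metal kernel only for what Core Image does not cover, such as per-hue HSL. All parameter values here are arbitrary examples:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

func applyLook(to input: CIImage) -> CIImage {
    var image = input

    // Basic adjustments: exposure, then contrast and saturation.
    let exposure = CIFilter.exposureAdjust()
    exposure.inputImage = image
    exposure.ev = 0.3
    image = exposure.outputImage ?? image

    let controls = CIFilter.colorControls()
    controls.inputImage = image
    controls.contrast = 1.05
    controls.saturation = 0.95
    image = controls.outputImage ?? image

    // A simple luminance tone curve; true per-channel curves need a custom
    // kernel or three masked passes.
    let curve = CIFilter.toneCurve()
    curve.inputImage = image
    curve.point0 = CGPoint(x: 0.00, y: 0.02)
    curve.point1 = CGPoint(x: 0.25, y: 0.22)
    curve.point2 = CGPoint(x: 0.50, y: 0.50)
    curve.point3 = CGPoint(x: 0.75, y: 0.78)
    curve.point4 = CGPoint(x: 1.00, y: 0.98)
    image = curve.outputImage ?? image

    return image
}
```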
Extrinsic matrix
Hi everyone,

I am working on a 3D reconstruction project. Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone. One consideration is that I want this app to run even if there is no LiDAR, as long as there are at least two cameras on the back; if there is a LiDAR, that is something I may work with later in the course of the project.

I am using an AVCaptureSession with two AVCaptureDevice cameras: builtInWideAngleCamera and builtInUltraWideCamera. The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g. builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:

```
Extrinsic Matrix (Ultra-Wide to Wide):
[ 0.9999968,     0.0008149305, -0.0023960583, 0.0]
[-0.0008256607,  0.9999896,    -0.0044807075, 0.0]
[ 0.002392382,   0.0044826716,  0.99998707,   0.0]
[-14.277955,    -8.135408e-10, -0.3359985,    0.0]
```

The extrinsic matrix is of the form [R | t] and seems to be correct in the rotational part, but the translation vector is all zeros, which would suggest that the cameras physically overlap; the last element is also not 1 (homogeneous coordinates).

Has anyone encountered this issue before? Is there a flaw in my reasoning, or something I might be missing? Any comments are very much appreciated.
Replies: 1 · Boosts: 0 · Views: 603 · Last activity: Dec ’24
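A likely explanation, sketched below: the extrinsics Data holds a matrix_float4x3, and simd prints its *columns*, so the last printed line is plausibly the translation column (in millimeters; here roughly -14.3 mm along x), not a row of zeros:

```swift
import AVFoundation
import simd

func extrinsics(from: AVCaptureDevice, to: AVCaptureDevice)
        -> (rotation: matrix_float3x3, translation: SIMD3<Float>)? {
    guard let data = AVCaptureDevice.extrinsicMatrix(from: from, to: to),
          data.count >= MemoryLayout<matrix_float4x3>.size else { return nil }
    let m = data.withUnsafeBytes { $0.loadUnaligned(as: matrix_float4x3.self) }
    // Columns 0-2 hold the rotation; column 3 is the translation in [R | t].
    let rotation = matrix_float3x3(columns: (m.columns.0, m.columns.1, m.columns.2))
    let translation = m.columns.3   // e.g. (-14.28, ~0, -0.34) for the values above
    return (rotation, translation)
}
```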
PHAssetChangeRequest revertAssetContentToOriginal without original asset content
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. However, this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
Replies: 1 · Boosts: 0 · Views: 110 · Last activity: Jun ’25
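Until the behavior is confirmed, a defensive flow that matches the documented contract would download the original resource before reverting; a sketch:

```swift
import Photos

func revertToOriginal(_ asset: PHAsset, completion: @escaping (Bool) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: asset)
            .first(where: { $0.type == .photo }) else {
        completion(false)
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true

    // Stream (and discard) the original data just to force a local download.
    PHAssetResourceManager.default().requestData(for: resource, options: options,
                                                 dataReceivedHandler: { _ in }) { error in
        guard error == nil else { completion(false); return }
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest(for: asset).revertAssetContentToOriginal()
        }) { success, _ in
            completion(success)
        }
    }
}
```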
Is bluetooth available in a "locked screen" camera capture extension?
I created a locked camera capture extension as explained in Apple's documentation. I'm trying to explore the possibility of using a Bluetooth peripheral from that extension; does anybody know if this is possible? The CBCentralManagerDelegate reports .unsupported in centralManagerDidUpdateState, even though I have provided all the permissions in Info.plist.
Replies: 1 · Boosts: 0 · Views: 522 · Last activity: Nov ’24