Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic

dyld[434]: Library not loaded: error when running LockedCameraCapture compatible app on iOS 15
Hello, I am getting the following error while attempting to run my LockedCameraCapture-compatible app on an iOS 15 device:

dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)

Of course iOS 15 doesn't have the LockedCameraCapture library, but I have had no issue including Lock Screen widgets (which require iOS 16), so I am not sure why this error is popping up. Thank you!
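A likely cause, hedged since the post doesn't show build settings: a framework newer than the deployment target must be weak-linked, otherwise dyld tries to load it eagerly at launch on older systems. Lock Screen widgets don't hit this because WidgetKit exists on iOS 15; the widget extension simply never runs there. A minimal sketch of the guard pattern, assuming LockedCameraCapture is marked Optional under "Link Binary With Libraries" (or weak-linked via -weak_framework LockedCameraCapture in Other Linker Flags):

```swift
import Foundation
#if canImport(LockedCameraCapture)
import LockedCameraCapture   // weak-linked: resolved only if present at runtime
#endif

// Sketch: never touch LockedCameraCapture symbols below iOS 18.
func isLockedCaptureAvailable() -> Bool {
    if #available(iOS 18.0, *) {
        return true    // safe to reference LockedCameraCapture API from here on
    }
    return false       // iOS 15-17: the framework is absent on disk
}
```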
Replies: 2 · Boosts: 0 · Views: 485 · Activity: Mar ’25
com.Metal.CompletionQueueDispatch crash in Swift 6
I have a photo editing app which uses a simple Metal render to display CIFilter output images. It works just fine in Swift 5, but in Swift 6 it crashes on starting the Metal command buffer, with an error in the queue com.Metal.CompletionQueueDispatch (serial). The crash occurs before I can debug. I changed the command buffer to report errors via MTLCommandBufferDescriptor's errorOptions = .encoderExecutionStatus, but had no luck getting insight into the source of the crash; likewise, the error happens before any of the usual Metal debug tools are enabled. The Metal render works just fine in Swift 5, and also works fine with almost all of the Swift compiler upcoming-feature flags set to Yes. (The "Default Internal Imports" flag is still No; the number of compile errors with that setting is absolutely scary, but that's another topic.) Do you have any suggestions on debugging, or ideas on why the Metal library is crashing in Swift 6? Everything is on current release versions and hardware.
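One plausible explanation under Swift 6's strict concurrency, offered as an assumption since the crash log isn't shown: Metal invokes addCompletedHandler closures on its internal serial queue (com.Metal.CompletionQueueDispatch), and the Swift 6 runtime now enforces actor isolation, so a completion closure that synchronously touches main-actor state can trap on that queue. A sketch of the explicit-hop pattern:

```swift
import Metal

// Hypothetical renderer sketch: the completion handler runs on Metal's own queue,
// so any main-actor state must be reached via an explicit hop, not directly.
final class Renderer {
    let queue: MTLCommandQueue

    init?(device: MTLDevice) {
        guard let q = device.makeCommandQueue() else { return nil }
        queue = q
    }

    @MainActor func draw() {
        guard let buffer = queue.makeCommandBuffer() else { return }
        // ... encode the render pass here ...
        buffer.addCompletedHandler { _ in
            // Runs on com.Metal.CompletionQueueDispatch, not the main actor.
            Task { @MainActor in
                // Touch UI / main-actor state only from inside this hop.
            }
        }
        buffer.commit()
    }
}
```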
Replies: 1 · Boosts: 0 · Views: 639 · Activity: Dec ’24
Custom Image Filters
I’m building a camera app using SwiftUI and UIKit (with UIViewControllerRepresentable). My app is already able to capture photos, but I also want to implement an important feature: applying my custom image filter, both to the live preview in the camera and when the image is saved to the photo library (like the default Apple Camera app does with Photographic Styles). My image filter must be pretty advanced, because I’m a photographer and I’m trying to achieve the same colours as I get with my custom preset in Lightroom. I want to control image parameters such as the basics (exposure, contrast, shadows, etc.), tone curves for each channel (Red, Green and Blue separately), HSL (for Red, Orange, Yellow, Green, Blue, Aqua, Purple and Magenta), colour grading and more. Currently I’m struggling with the implementation. I tried to create a custom image filter using Metal (it works for saturation), but I’m not sure it is the best approach. I need help and recommendations on how developers implement this complex thing in their apps (what technologies I should use, etc.).
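One common pattern for this kind of preset, sketched here under assumptions rather than as a definitive recipe: bake the whole adjustment stack (exposure, per-channel curves, HSL, grading) into a 3D lookup table once, then apply it per frame with Core Image's CIColorCubeWithColorSpace filter, which is cheap enough for live preview and reusable when writing the final photo. The transfer closure below is a hypothetical stand-in for the preset math:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch: build a dimension^3 RGBA LUT by sampling a colour-transfer function,
// then wrap it in a CIColorCubeWithColorSpace filter.
func makeCubeFilter(dimension: Int = 64,
                    transfer: (Float, Float, Float) -> (Float, Float, Float)) -> CIFilter & CIColorCubeWithColorSpace {
    var table = [Float]()
    table.reserveCapacity(dimension * dimension * dimension * 4)
    for b in 0..<dimension {
        for g in 0..<dimension {
            for r in 0..<dimension {
                let (rr, gg, bb) = transfer(Float(r) / Float(dimension - 1),
                                            Float(g) / Float(dimension - 1),
                                            Float(b) / Float(dimension - 1))
                table.append(contentsOf: [rr, gg, bb, 1])
            }
        }
    }
    let filter = CIFilter.colorCubeWithColorSpace()
    filter.cubeDimension = Float(dimension)
    filter.cubeData = table.withUnsafeBufferPointer { Data(buffer: $0) }
    filter.colorSpace = CGColorSpace(name: CGColorSpace.sRGB)
    return filter
}
```

Usage is then just setting filter.inputImage to each preview frame's CIImage and rendering filter.outputImage; the same filter instance can be applied to the full-resolution capture before saving.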
Replies: 2 · Boosts: 0 · Views: 818 · Activity: Dec ’24
Extrinsic matrix
Hi everyone, I am working on a 3D reconstruction project. Recently I have been able to retrieve the intrinsics from the two cameras on the back of my iPhone. One consideration is that I want this app to run even if there is no LiDAR, as long as there are at least two cameras on the back; if there is a LiDAR, that is something I plan to work on later in the course of the project. I am using an AVCaptureSession with two AVCaptureDevice cameras: builtInWideAngleCamera and builtInUltraWideCamera. The intrinsic matrices seem to be correct. However, when I retrieve the extrinsics, e.g., builtInWideAngleCamera w.r.t. builtInUltraWideCamera, the matrix I get looks like this:

Extrinsic Matrix (Ultra-Wide to Wide):
[0.9999968, 0.0008149305, -0.0023960583, 0.0]
[-0.0008256607, 0.9999896, -0.0044807075, 0.0]
[0.002392382, 0.0044826716, 0.99998707, 0.0]
[-14.277955, -8.135408e-10, -0.3359985, 0.0]

The extrinsic matrix is of the form [R | t]. The rotational part seems correct, but the translational vector is ALL ZEROS, which would suggest that the cameras are physically overlapped; the last element is also not 1 (homogeneous coordinates). Has anyone encountered this 'issue' before? Is there a flaw in my reasoning or something I might be missing? Any comments are very much appreciated.
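A hedged reading of that printout: the extrinsics data holds a 4-column by 3-row simd matrix, and simd prints matrices column by column, so the fourth printed bracket is the translation column, not a fourth row. The vector (-14.28, ~0, -0.34) would then be a plausible inter-lens offset in millimetres, and the apparent zeros are just the last entries of the three rotation columns. A sketch of decoding it that way:

```swift
import AVFoundation
import simd

// Sketch: interpret the extrinsicMatrix Data as a column-major simd_float4x3,
// where the fourth column is the translation between the two camera centres.
func relativePose(from wide: AVCaptureDevice, to ultraWide: AVCaptureDevice) {
    guard let data = AVCaptureDevice.extrinsicMatrix(from: wide, to: ultraWide) else { return }
    let m: simd_float4x3 = data.withUnsafeBytes { $0.load(as: simd_float4x3.self) }
    let rotation = simd_float3x3(m.columns.0, m.columns.1, m.columns.2)
    let translation = m.columns.3   // reportedly expressed in millimetres
    print("R =", rotation, "t (mm) =", translation)
}
```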
Replies: 1 · Boosts: 0 · Views: 619 · Activity: Dec ’24
Message from com.apple.photos.backend (PhotoKit) in log
Hello. Our application backs up the user's photos to a back end. When retrieving the asset data from the photo library, we set the flag isNetworkAccessAllowed to true, to get assets that might be optimized in iCloud. In the application logs, we can see the message below, shown as coming from com.apple.photos.backend (PhotoKit):

Missing prefetched properties for PHAssetAdjustmentProperties on <PHAsset: 0x160b1ec00> BCF5688F-F7A7-4196-AFC7-A84E8BD95F3E/L0/001 mediaType=1/0, sourceType=1, (5601x3734), creationDate=2022-01-24 23:36:05 +0000, location=0, hidden=0, favorite=0, adjusted=0. Fetching on demand on the main queue, which may degrade performance.

In particular, the message says 'Fetching on demand on the main queue', but I'm not sure whether that means that PhotoKit will fetch on the main queue, or that our application is requesting the data on the main queue. Could anyone clarify? Thanks.
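For what it's worth, the wording reads like a warning about the calling code: the properties weren't prefetched by the originating fetch, so PhotoKit fetches them on demand on whichever queue the request was made from. A sketch that moves the request off the main queue (the backup hand-off is left as a comment, since that part is app-specific):

```swift
import Photos

// Sketch: request asset data from a background queue so any on-demand
// property fetching by PhotoKit happens off the main thread.
func backUp(asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true     // allow iCloud download of optimized assets
    options.deliveryMode = .highQualityFormat

    DispatchQueue.global(qos: .utility).async {
        PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                                options: options) { data, _, _, _ in
            guard let data else { return }
            // hand `data` to the backup pipeline here
        }
    }
}
```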
Replies: 1 · Boosts: 0 · Views: 776 · Activity: Dec ’24
Can you add pictures with the camera using the new photos picker instead of the old UI View Controller?
I'm a new app developer and am trying to add a button that adds pictures from both the photo library AND the camera. I added the first function (adding pictures from the photo library) using the new-ish PhotosPicker, but I can't find a way to do the same thing for the camera. Should I just tough it out and use the UIViewControllerRepresentable approach that I've seen in all of the YouTube tutorials I've come across? I also want the user to be able to crop the picture in the app after they take it. Thanks in advance.
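As far as I know, PhotosPicker only browses the library, so camera capture still goes through UIImagePickerController wrapped for SwiftUI. A minimal sketch, with allowsEditing giving the built-in move-and-scale crop step:

```swift
import SwiftUI
import UIKit

// Sketch: a SwiftUI wrapper that presents the system camera and returns
// the (optionally cropped) photo via a binding.
struct CameraPicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.allowsEditing = true          // built-in move/scale crop after capture
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ picker: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: CameraPicker
        init(_ parent: CameraPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            parent.image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
            picker.dismiss(animated: true)
        }
    }
}
```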
Replies: 1 · Boosts: 0 · Views: 622 · Activity: Dec ’24
Photo album widget won’t work
For a while I had one photo widget (no special app, just the standard Apple one), and it was set to shuffle through an album of pics of my bf. No problems at all. A few weeks later I added one to shuffle through an album of pics of my cat; that one worked fine, but it made the one of my bf stop working, and it just showed a blank white widget, no error message or anything. So I removed the one of my cat, hoping the one of my bf would go back to working, and it didn’t. Otherwise I only have the widgets for Find My, my bank, and apps on my home screen.
Replies: 1 · Boosts: 0 · Views: 642 · Activity: Dec ’24
Easy way to output a live camera feed
Hey everyone 😊, I am building an app that includes a live camera feed preview. That's all I need to do, alongside identifying the images with Create ML's image classification; I don't need to capture images at all. I've seen some very complicated tutorials. I just want to use a couple of lines of code.
Replies: 6 · Boosts: 0 · Views: 645 · Activity: Jan ’25
Compatibility Issue Between 360 Cameras and iPad Pro M4
Hello everyone, I need some help with this; if you know anything about it, please comment.

Overview: We are planning to develop an app using the “Support external cameras in your iPadOS app” feature introduced in iPadOS 17. Before implementing this feature, the iPad must recognize external cameras. However, among the iPad models compatible with iPadOS 17, we have found that some of the iPads owned by our development team can recognize external cameras while others cannot. If you have any reports regarding compatibility issues, or information on how to resolve these problems, please share them with us.

Detailed explanation: the results of our investigation are as follows.

External cameras used (360-degree cameras):
- RICOH Theta X, firmware 2.61.0 (latest as of 2024/12/26)
- RICOH Theta Z1

Tested iPad devices:
- 12.9-inch iPad Pro (3rd generation), iPadOS 17.5.1: OK (camera recognized)
- 11-inch iPad Pro (M4), iPadOS 18.2: NG (camera not recognized)

Verification method:
Step 1: Power on the iPad and the external camera, ensuring both are ready for connection.
Step 2: Connect the iPad and the external camera using a USB-C cable.
Step 3: Launch FaceTime on the iPad and check the displayed camera feed. If the external camera is recognized, its feed will be displayed.
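It may also help to check recognition in-app rather than via FaceTime, since that isolates the discovery step from FaceTime's own device handling. A sketch using the external device type added for this feature in iPadOS 17:

```swift
import AVFoundation

// Sketch: enumerate external cameras (iPadOS 17+) and log what the system sees.
func externalCameras() -> [AVCaptureDevice] {
    guard #available(iOS 17.0, *) else { return [] }
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )
    for device in discovery.devices {
        print(device.localizedName, device.uniqueID)
    }
    return discovery.devices
}
```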
Replies: 1 · Boosts: 0 · Views: 500 · Activity: Jan ’25
Is there a way to filter PHPickerViewController by the creation date of the assets?
Our app filters the photo library to a certain date range for ease of picking photos. However, to do this, we have to require full permission to the photo library. We would like to use PHPickerViewController and have it filter the results by the assets' creation date, which would allow us to adopt it. I see other filter options, but not this one. And if it isn't there, is this something that is being thought about or on a roadmap?
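For context, PHPickerFilter currently exposes type-based filters only (images, videos, livePhotos, screenshots and so on), with no creation-date option, so date filtering still seems to require PHAsset fetches under library authorization. A sketch of that fallback:

```swift
import Photos

// Sketch: the authorization-requiring fallback — fetch assets in a date range
// with PHFetchOptions, since PHPickerViewController can't filter by date.
func assets(from start: Date, to end: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate <= %@",
                                    start as NSDate, end as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: .image, options: options)
}
```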
Replies: 1 · Boosts: 0 · Views: 555 · Activity: Jan ’25
How to access HDRGainMap from AVCapturePhoto
Hey, I'm building a camera app and I want to use the captured HDRGainMap alongside the photo to do some processing with a CIFilter chain. How can this be done? I can't find any documentation anywhere on this, only on how to access the HDRGainMap from an existing HEIC file, which I have done successfully. For that I'm doing something like the following:

let gainmap = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap)
let gainDict = NSDictionary(dictionary: gainmap)
let gainData = gainDict[kCGImageAuxiliaryDataInfoData] as? Data
let gainDescription = gainDict[kCGImageAuxiliaryDataInfoDataDescription]
let gainMeta = gainDict[kCGImageAuxiliaryDataInfoMetadata]

However, I'm not sure what the approach is with an AVCapturePhoto output from an AVCaptureDevice. Thanks!
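One bridge worth trying, sketched under the assumption that no direct gain-map accessor exists on AVCapturePhoto: its file data representation can feed the same CGImageSource path already working for HEIC files.

```swift
import AVFoundation
import ImageIO

// Sketch: route AVCapturePhoto through CGImageSource to reuse the
// HEIC-style auxiliary-data extraction for the HDR gain map.
func hdrGainMap(from photo: AVCapturePhoto) -> (data: Data, description: Any?)? {
    guard let fileData = photo.fileDataRepresentation(),
          let source = CGImageSourceCreateWithData(fileData as CFData, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) as? [CFString: Any],
          let gainData = info[kCGImageAuxiliaryDataInfoData] as? Data
    else { return nil }
    return (gainData, info[kCGImageAuxiliaryDataInfoDataDescription])
}
```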
Replies: 2 · Boosts: 0 · Views: 680 · Activity: Jan ’25
PIP Camera in iOS App
I am developing an iOS app with video call functionality and implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible. I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
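If the goal matches what apps like Telegram appear to do, the relevant capability seems to be multitasking camera access (iOS 16+): an AVCaptureSession flag that keeps capture running while the app is in PiP, available to voice/video-call apps or with the com.apple.developer.avfoundation.multitasking-camera-access entitlement. A minimal sketch:

```swift
import AVFoundation

// Sketch: opt the session into continued capture while the app is in
// Picture in Picture / the background, where the system supports it.
func enableBackgroundCapture(for session: AVCaptureSession) {
    if #available(iOS 16.0, *), session.isMultitaskingCameraAccessSupported {
        session.beginConfiguration()
        session.isMultitaskingCameraAccessEnabled = true
        session.commitConfiguration()
    }
}
```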
Replies: 1 · Boosts: 0 · Views: 576 · Activity: Jan ’25
LockedCameraCapture with ARKit based App
Hello, I'm building a camera app around ARKit. I've created a Lock Screen capture extension and added a control to initiate my camera app, but when I launch the extension I see just a black screen, with no hints of any errors; attaching the debugger to the running process shows no logs either. I'm wondering: is LockedCameraCapture supported with ARView and ARSession? ARKit was featured in a WWDC video with a camera-app use case, and the introduction of captureHighResolutionFrame(completion:) made me pick it up as an interesting camera app backbone, but if Lock Screen capture is not possible with it, I will have to refactor my codebase.
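For reference, a minimal extension entry point can help confirm the extension launches at all before layering ARKit on top. This is sketched from the WWDC24 LockedCameraCapture API as I recall it (names like LockedCameraCaptureExtension and LockedCameraCaptureUIScene should be verified against the current headers):

```swift
import SwiftUI
import LockedCameraCapture

// Sketch: the smallest LockedCameraCapture extension body — if even this shows
// a black screen, the problem is extension setup rather than ARKit.
@main
struct CaptureExtension: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureUIScene {
        LockedCameraCaptureUIScene { session in
            Text("Extension launched")   // replace with the real capture UI
                .onAppear { print("LockedCameraCapture session started: \(session)") }
        }
    }
}
```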
Replies: 1 · Boosts: 0 · Views: 503 · Activity: Jan ’25
Improving the transition between ultra wide and wide angle lens
I'm building an app which uses the camera and want to take advantage of the ability of the builtInTripleCamera and builtInDualWideCamera to automatically switch between the ultra wide and wide angle lens to focus on close up shots. It's working fine - except that the transition between the two lenses is a bit jumpy. I looked at what the native Camera app does and it seems to apply a small amount of blurring when the transition happens to help "mask" the jumpiness. How can I replicate this, or is there another way to improve the UX of switching between one lens and another automatically?
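One way to approximate what the built-in Camera app appears to do: watch for the virtual device's active constituent camera changing and briefly cross-fade a blur over the preview to mask the switch. A sketch, assuming the KVO-observable activePrimaryConstituent property on AVCaptureDevice (iOS 15+; the property name is from memory, so verify it):

```swift
import AVFoundation
import UIKit

// Sketch: flash a blur over the preview whenever the virtual device
// switches its active constituent camera (e.g. ultra wide <-> wide).
final class LensSwitchMasker {
    private var observation: NSKeyValueObservation?
    private let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .regular))

    func attach(to device: AVCaptureDevice, over previewView: UIView) {
        blurView.frame = previewView.bounds
        blurView.alpha = 0
        previewView.addSubview(blurView)

        // Assumed KVO property; fires when the constituent camera changes.
        observation = device.observe(\.activePrimaryConstituent, options: [.new]) { [weak self] _, _ in
            DispatchQueue.main.async { self?.flashBlur() }
        }
    }

    private func flashBlur() {
        UIView.animate(withDuration: 0.15, animations: { self.blurView.alpha = 1 }) { _ in
            UIView.animate(withDuration: 0.25, delay: 0.1, options: []) { self.blurView.alpha = 0 }
        }
    }
}
```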
Replies: 1 · Boosts: 0 · Views: 470 · Activity: Feb ’25