Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

"Terminated due to signal 9" when launching app from camera button (iPhone 16 Pro)
Hello, I have followed the Creating a camera experience for the Lock Screen guide, and can now launch my app using the iPhone 16's new camera button. That said, after about 10 seconds the app is force-closed by the OS, with the only message appearing in the console being "Terminated due to signal 9".

This error does not happen when:
- launching the app via the physical camera button while the device is locked
- launching the app by tapping the icon on the Home Screen

It only happens when:
- launching the app via the physical camera button from the Home Screen while the device is unlocked

Any ideas? Thank you!
Replies: 2 · Boosts: 1 · Views: 614 · Activity: Sep ’24
Flag to avoid "shared is unavailable in application extensions" error?
Hello, I am trying to get my camera app to launch from the Lock Screen, and see that calls to UIApplication.shared are not allowed. In my app, I have:

```swift
UIApplication.shared.isIdleTimerDisabled = true
```

which causes this compile-time error:

'shared' is unavailable in application extensions for iOS: Use view controller based solutions where appropriate instead

I do not believe there is a view-controller-based solution for this. Is there a flag I can wrap around the call so the compiler knows it won't be used in an application extension? Thank you!
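For reference, the common workaround is a custom compilation condition rather than a runtime flag. This is a sketch assuming you define a condition named APP_EXTENSION under Active Compilation Conditions in the extension target's build settings (the name is arbitrary):

```swift
// APP_EXTENSION is a custom flag set only on the extension target, so this
// call compiles into the app target but is excluded from the extension build.
#if !APP_EXTENSION
UIApplication.shared.isIdleTimerDisabled = true
#endif
```

An alternative is to route the call through a protocol that the host app injects, so code shared with the extension never references UIApplication.shared directly.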
Replies: 3 · Boosts: 0 · Views: 677 · Activity: Sep ’24
iPhone 16 Pro Camera Preview freeze
Hi all, we are working on an iOS application that includes camera functionality. This week we received a few customer complaints regarding camera usage on iPhone 16/16 Pro: the camera preview (when the camera is open) is frozen, while all other functionality and the UI work as expected. Moreover, the issue happens only with the back camera; the front camera works perfectly. We have tested on iOS 18 with iPhone 14/15/15 Pro/15 Pro Max and all of those devices work without any issues, so we assume the problem is not iOS 18 itself but some breaking change in the new iPhone 16/16 Pro cameras. Unfortunately, we currently can't test directly on an iPhone 16/16 Pro since we don't have these devices. We are using SwiftUI, and here is the implementation of the camera preview.

VideoPreviewLayer:

```swift
final class CameraPreviewView: UIView {
    var previewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Layer expected is of type VideoPreviewLayer")
        }
        return layer
    }

    var session: AVCaptureSession? {
        get { previewLayer.session }
        set { previewLayer.session = newValue }
    }

    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }
}
```

UIKit -> SwiftUI:

```swift
struct CameraRecordingView: UIViewRepresentable {
    @ObservedObject var cameraManager: CameraManager

    func makeUIView(context: Context) -> CameraPreviewView {
        let previewView = CameraPreviewView()
        previewView.session = cameraManager.session // AVCaptureSession
        previewView.previewLayer.videoGravity = .resizeAspectFill
        return previewView
    }

    func updateUIView(_ uiView: CameraPreviewView, context: Context) {}
}
```

Setup camera input:

```swift
private func saveInput(input: AVCaptureDevice) {
    // input is AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
    do {
        let cameraInput = try AVCaptureDeviceInput(device: input)
        if session.canAddInput(cameraInput) {
            session.addInput(cameraInput) // session is AVCaptureSession
        } else {
            sendError(error: .cannotAddInput)
            status = .failed
        }
    } catch {
        if (error as NSError).code == -11852 {
            sendError(error: .microphoneError)
        } else {
            sendError(error: .createCaptureInput(error))
        }
        status = .failed
    }
}
```

Does anybody have similar issues with iPhone 16/16 Pro? We would appreciate any ideas on how to resolve this.
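Not a confirmed fix, but one mitigation sketch worth trying while waiting for test hardware: observe AVCaptureSession's runtime-error notification, since a runtime error can leave the preview layer frozen while the rest of the UI keeps working, and attempt a restart off the main thread:

```swift
import AVFoundation

// Sketch: surface the underlying AVError and attempt an automatic restart
// when the capture session hits a runtime error.
final class SessionRecovery {
    private var observer: NSObjectProtocol?

    func watch(_ session: AVCaptureSession, restartOn queue: DispatchQueue) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionRuntimeError,
            object: session,
            queue: nil
        ) { notification in
            let error = notification.userInfo?[AVCaptureSessionErrorKey] as? AVError
            print("Capture session runtime error: \(String(describing: error))")
            queue.async {
                if !session.isRunning {
                    session.startRunning() // must be called off the main thread
                }
            }
        }
    }
}
```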
Replies: 1 · Boosts: 0 · Views: 944 · Activity: Sep ’24
AVCaptureSessionControlsDelegate Not Being Called From Capture App
I am looking to learn more about the new Capture Button controls for iPhone 16, and am working to adapt the AVCam Sample Code to support the Capture Button. While I believe I've followed the guidance in the Enhancing your app experience with the Camera Control documentation, I'm finding that although my AVCaptureControl items seem to be added to the capture session, the Capture Button never does anything, nor are any of the delegate methods called. After I configure my capture session per the setupSession() method, I call a method I added, configureCameraControls(device:):

```swift
func configureCameraControls(device: AVCaptureDevice) {
    guard captureSession.supportsControls else {
        assertionFailure("App does not support camera control.")
        return
    }

    // Set the controls delegate.
    captureSession.setControlsDelegate(controlsDelegate, queue: sessionQueue)

    // Begin configuring the capture session.
    captureSession.beginConfiguration()

    // Remove previously configured controls, if any.
    for control in captureSession.controls {
        captureSession.removeControl(control)
    }

    // Add a zoom control.
    let systemZoomSlider = AVCaptureSystemZoomSlider(device: device) { zoomFactor in
        // TODO
    }

    // Create a control to adjust the device's exposure bias.
    let systemBiasSlider = AVCaptureSystemExposureBiasSlider(device: device)

    // Add a custom slider.
    let focusSlider = AVCaptureSlider("Focus", symbolName: "scope", in: 0...1)
    focusSlider.setActionQueue(sessionQueue) { focusValue in
        // TODO
    }

    // Add each control to the capture session if possible.
    for control in [systemZoomSlider, systemBiasSlider, focusSlider] {
        if captureSession.canAddControl(control) {
            captureSession.addControl(control)
        } else {
            print("Unable to add control \(control).")
        }
    }

    // Commit the capture session configuration.
    captureSession.commitConfiguration()
}
```

I define the controls delegate like so:

```swift
final class CaptureControlsDelegate: NSObject, AVCaptureSessionControlsDelegate {
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) { }
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) { }
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) { }
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) { }
}
```

I instantiate this earlier in my app's lifecycle and make it available to the CaptureService actor. I'm not sure this snippet provides enough detail, but I can't quite fathom why the camera/capture pipeline works while the Capture Button provides no functionality and the AVCaptureSessionControlsDelegate methods are never called.
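One thing the snippet doesn't show, offered as an assumption rather than a confirmed diagnosis: session controls cover the Camera Control overlay, while actual button presses are delivered through an AVCaptureEventInteraction attached to a visible view in the active scene, and the AVCam sample adds that separately. A minimal sketch, with previewView and onCapture as stand-ins:

```swift
import AVKit
import UIKit

// Sketch: without an AVCaptureEventInteraction on screen, pressing the
// hardware Capture Button may have no visible effect in the app.
func attachCaptureInteraction(to previewView: UIView,
                              onCapture: @escaping () -> Void) {
    let interaction = AVCaptureEventInteraction { event in
        if event.phase == .ended {
            onCapture() // e.g. forward to the CaptureService actor
        }
    }
    previewView.addInteraction(interaction)
}
```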
Replies: 3 · Boosts: 0 · Views: 663 · Activity: Sep ’24
Issue with Sending Live Photos in Messages on iOS 18
Dear Apple Support, I’ve noticed an issue with the Messages app on iOS 18. When I try to send Live Photos, I select the Live Photo icon, but the photo is sent as a still image instead. Despite following the correct steps, the Live Photo feature doesn’t seem to be working properly. I would appreciate it if your team could look into this and resolve the issue in an upcoming update. Thank you for your support and continued innovation! Best regards, Erfan Nateghie
Replies: 5 · Boosts: 1 · Views: 1.5k · Activity: Sep ’24
Is it worth using Accelerate to convert 16 bpc RGB to 8 bpc RGB
I am working on an image processing app that requires 8 bit per channel (bpc) images. Sometimes input images are 16 bpc (e.g. sRGB IEC61966-2.1; extended range). The app already draws the input image into a CGContext, producing an 8 bpc CGImage; this works even when the input is 16 bpc. I am wondering whether image quality would be better if I converted the 16 bpc images to 8 bpc using Accelerate before that CGContext draw, or does the draw essentially do the equivalent?
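For comparison, a color-managed conversion through vImage's any-to-any converter might look like the sketch below; the 8-bit sRGB destination format is an assumption, so swap in your working space. In practice the CGContext draw also performs a color-managed conversion, so output quality should be comparable; the vImage route mainly buys explicit control and speed for batch processing.

```swift
import Accelerate

// Sketch: color-managed 16 bpc -> 8 bpc conversion with vImage.
func makeEightBitImage(from source: CGImage) -> CGImage? {
    guard
        let sourceFormat = vImage_CGImageFormat(cgImage: source),
        let destinationFormat = vImage_CGImageFormat(
            bitsPerComponent: 8,
            bitsPerPixel: 32,
            colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!,
            bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue))
    else { return nil }

    do {
        let converter = try vImageConverter.make(sourceFormat: sourceFormat,
                                                 destinationFormat: destinationFormat)
        let sourceBuffer = try vImage_Buffer(cgImage: source)
        defer { sourceBuffer.free() }
        var destinationBuffer = try vImage_Buffer(width: source.width,
                                                  height: source.height,
                                                  bitsPerPixel: 32)
        defer { destinationBuffer.free() }
        try converter.convert(source: sourceBuffer, destination: &destinationBuffer)
        return try destinationBuffer.createCGImage(format: destinationFormat)
    } catch {
        return nil
    }
}
```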
Replies: 0 · Boosts: 0 · Views: 306 · Activity: Sep ’24
Challenges with Remotely Controlling iPhone Camera from Mac: Need Guidance
Hello everyone, I am working on an iOS app that captures images automatically, and I would like to control the start/stop of the capture process remotely from a Mac app. I explored the iPhone Mirroring feature, which allows some remote control but only functions when the iPhone is locked, and it doesn't permit access to the iPhone's camera from the Mac. Ideally, I am looking for a solution that would allow me to:
- Remotely control the camera capture process in the iOS app from the Mac app.
- Ensure the iPhone's camera remains fully operational and controllable from the Mac during the capture process.
I have considered options like Handoff for communication between the apps, but ran into issues communicating between the iOS and Mac apps. Is there a more optimal solution within Apple's ecosystem, or are there APIs I might have overlooked? Any advice or guidance on how to achieve this functionality would be greatly appreciated! Thanks in advance!
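One option within Apple's ecosystem is MultipeerConnectivity, which links a Mac app and an iOS app over Wi-Fi or peer-to-peer. A sketch under stated assumptions: CaptureCommand and the callback are illustrative names, and peer discovery via MCNearbyServiceAdvertiser/MCNearbyServiceBrowser is omitted for brevity.

```swift
import MultipeerConnectivity

// Illustrative command vocabulary for the capture link.
enum CaptureCommand: String { case start, stop }

final class CaptureLink: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: ProcessInfo.processInfo.hostName)
    private(set) lazy var session = MCSession(peer: peerID)
    var onCommand: ((CaptureCommand) -> Void)?

    override init() {
        super.init()
        session.delegate = self
    }

    // Mac side: send a start/stop command to the connected iPhone.
    func send(_ command: CaptureCommand) throws {
        try session.send(Data(command.rawValue.utf8),
                         toPeers: session.connectedPeers,
                         with: .reliable)
    }

    // iPhone side: react to an incoming command, e.g. start the AVCaptureSession.
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        guard let command = CaptureCommand(rawValue: String(decoding: data, as: UTF8.self)) else { return }
        onCommand?(command)
    }

    // Remaining MCSessionDelegate requirements, unused in this sketch.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```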
Replies: 1 · Boosts: 0 · Views: 570 · Activity: Sep ’24
session.openApplication() -- how to pass data from the extension to the application?
Hello, I apologize if the answer is obvious, but I'm having a hard time figuring this one out. Let's say the user taps an "Edit" button in my LockedCameraCaptureSession. The extension calls:

```swift
activity.userInfo = ["ActivityKey": "ID"]
try await session.openApplication(for: activity)
```

Can I retrieve, in my application, the data stored in activity.userInfo (say, an "open editor" flag), or is data passing handled exclusively via the appContext of CameraCaptureIntent? Thank you!
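For what it's worth, a sketch of the receiving side, assuming a SwiftUI App lifecycle, that the activity arrives with type NSUserActivityTypeLockedCameraCapture, and that userInfo survives the handoff (worth verifying empirically; appContext via CameraCaptureIntent remains the documented channel):

```swift
import SwiftUI
import LockedCameraCapture

struct ContentView: View {
    var body: some View { Text("Camera") } // stand-in for the app's root view
}

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
                .onContinueUserActivity(NSUserActivityTypeLockedCameraCapture) { activity in
                    // Read back what the extension stored before openApplication(for:).
                    if let value = activity.userInfo?["ActivityKey"] as? String {
                        // e.g. value == "ID": route to the editor.
                    }
                }
        }
    }
}
```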
Replies: 3 · Boosts: 0 · Views: 419 · Activity: Oct ’24
dyld[434]: Library not loaded: error when running LockedCameraCapture compatible app on iOS 15
Hello, I am getting the following error while attempting to run my LockedCameraCapture-compatible app on an iOS 15 device:

```
dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)
```

Of course iOS 15 doesn't have the LockedCameraCapture library, but I have had no issue including Lock Screen widgets (which require iOS 16), so I am not sure why the error is popping up. Thank you!
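The usual cause of this crash is that the framework is strongly linked, so dyld demands it at launch on OS versions that predate it. A sketch of the standard fix, assuming an Xcode target: weak-link the framework and guard its uses with availability checks.

```swift
// With the framework weak-linked (Status: Optional in the target's
// "Link Binary With Libraries" build phase, or -weak_framework
// LockedCameraCapture in Other Linker Flags), iOS 15 never resolves
// the missing symbols as long as every use is availability-guarded.
#if canImport(LockedCameraCapture)
import LockedCameraCapture
#endif

func setUpLockScreenCapture() {
    guard #available(iOS 18.0, *) else { return }
    // Safe to use LockedCameraCapture API here.
}
```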
Replies: 2 · Boosts: 0 · Views: 463 · Activity: Oct ’24
'You don’t have permission. - The AVPlayerItem instance has failed with the error code 257 and domain "NSCocoaErrorDomain".'
```objective-c
[[PHImageManager defaultManager] requestAVAssetForVideo:asset
                                                options:videoOptions
                                          resultHandler:^(AVAsset *_Nullable avAsset, AVAudioMix *_Nullable audioMix, NSDictionary *_Nullable info) {
    if ([avAsset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)avAsset;
        NSURL *videoURL = urlAsset.URL;
        mediaInfo[@"path"] = videoURL.absoluteString;
    } else {
        // Failed to get video asset
        completion(nil);
    }
}];
```

Before iOS 18, I was able to access the AVAsset video using the method above with the URL, but starting from iOS 18 the following error appears: 'You don’t have permission. - The AVPlayerItem instance has failed with the error code 257 and domain "NSCocoaErrorDomain".'
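Not an official fix, but one workaround sketch (in Swift, though the post's code is Objective-C): copy the video into an app-owned file with PHAssetResourceManager instead of persisting the AVURLAsset's URL, which on iOS 18 appears not to remain accessible. The destination parameter is a placeholder.

```swift
import Photos

// Sketch: export the video resource to an app-owned file, avoiding reliance
// on the Photos-internal URL that fails with NSCocoaErrorDomain code 257.
func exportVideo(for asset: PHAsset, to destination: URL,
                 completion: @escaping (Error?) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: asset)
            .first(where: { $0.type == .video }) else {
        completion(CocoaError(.fileNoSuchFile)) // no video resource found
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true // allow iCloud download if needed
    PHAssetResourceManager.default().writeData(for: resource,
                                               toFile: destination,
                                               options: options,
                                               completionHandler: completion)
}
```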
Replies: 2 · Boosts: 0 · Views: 704 · Activity: Oct ’24
PHPickerViewControllerDelegate didCancel
Hello. In my app I support selection of photos and videos, and also selection of PDFs, so I use PHPickerViewController for picking photos and videos and UIDocumentPickerViewController for picking documents. I found that there is no equivalent of documentPickerWasCancelled in the PHPickerViewController delegate. When a user presses Cancel, the delegate's picker function fires, the dialog is dismissed, and the system returns a selected value of nil. But when I swipe the dialog down, no event is generated, so my app can't tell whether the user selected a photo or cancelled the dialog. UIDocumentPickerViewController has no such problem, since it has didCancel as a separate function. Is there any way to work around this?
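One workaround sketch: the swipe-down dismissal can be observed through UIAdaptivePresentationControllerDelegate on the picker's presentation controller, independent of the PHPicker delegate. A minimal example; the class and method names here are illustrative:

```swift
import PhotosUI
import UIKit

// Sketch: catch interactive (swipe-down) dismissal of a PHPicker sheet.
final class PickerPresenter: NSObject, PHPickerViewControllerDelegate,
                             UIAdaptivePresentationControllerDelegate {
    func present(from viewController: UIViewController) {
        var config = PHPickerConfiguration()
        config.filter = .any(of: [.images, .videos])
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        picker.presentationController?.delegate = self
        viewController.present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        // results.isEmpty means the user tapped Cancel.
    }

    // Called when the sheet is dismissed by swiping down.
    func presentationControllerDidDismiss(_ presentationController: UIPresentationController) {
        // Treat as a cancel.
    }
}
```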
Replies: 1 · Boosts: 0 · Views: 377 · Activity: Oct ’24
What format for writeHEIFRepresentation preserves HDR?
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR.

SDR + HDR:

```swift
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])
```

SDR + Gain:

```swift
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
```

This won't compile because the format argument is missing. What format should be used? In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably taken on an iPhone, but perhaps not. Thanks!
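A sketch of what may be intended, assuming .RGBA8 is an acceptable format for an 8-bit HEIC; note that the 10-bit variant, writeHEIF10Representation(of:to:colorSpace:options:), takes no format parameter at all, which may be what the session slide was showing.

```swift
import CoreImage

// Sketch: save an SDR image plus gain map as an 8-bit HEIC. The .RGBA8
// format choice is an assumption; sdrImage, gainImage, and url come from
// the original post.
func saveAdaptiveHDR(sdrImage: CIImage, gainImage: CIImage, to url: URL) throws {
    let context = CIContext()
    let p3Space = CGColorSpace(name: CGColorSpace.displayP3)!
    try context.writeHEIFRepresentation(of: sdrImage,
                                        to: url,
                                        format: .RGBA8,
                                        colorSpace: p3Space,
                                        options: [.hdrGainMapImage: gainImage])
}
```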
Replies: 1 · Boosts: 0 · Views: 672 · Activity: Oct ’24
Get View Full HDR state from Settings > Photos to properly set preferredImageDynamicRange in editing extension
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But the option to view HDR photos in their full dynamic range ("View Full HDR") can be turned off in Settings > Photos. When that option is off and you open a photo and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the full dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that state from my PHContentEditingController.
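There doesn't appear to be a public API that reads the View Full HDR switch directly. One untested heuristic, offered purely as an assumption rather than a documented mapping to that setting, is to check the screen's current EDR headroom, which should stay near 1.0 when the system is not rendering extended range:

```swift
import UIKit

// Untested heuristic sketch: infer a preferred dynamic range from the
// screen's EDR headroom. This is an assumption, not a documented contract.
func preferredDynamicRange(for view: UIView) -> UIImage.DynamicRange {
    let headroom = view.window?.screen.currentEDRHeadroom ?? 1.0
    return headroom > 1.0 ? .high : .standard
}
```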
Replies: 1 · Boosts: 0 · Views: 619 · Activity: Oct ’24
Compatibility Between ARKit and Optical Zoom
Hello, I am a developer currently working on an AR application using ARKit. I aim to implement a zoom feature that allows users to enlarge and reduce objects within the AR scene while simultaneously measuring the distance to those objects. Specifically, I want to incorporate optical zoom to provide a more natural and precise user experience. I have considered several approaches and would appreciate your advice on the most effective methods.

Approaches being considered:
- Using UIPinchGestureRecognizer to adjust the camera's field of view
- Modifying the scale property of an SCNNode to enlarge/reduce specific objects
- Leveraging AVFoundation to control the camera's optical zoom

Questions:
1. Compatibility between ARKit and optical zoom: Is it feasible to control the camera's optical zoom using AVFoundation while utilizing ARKit's features? What should be considered when integrating these two frameworks?
2. Integrating object distance measurement with zoom functionality: What is the most effective approach to measure and display the distance to an object in real time when a user zooms in on it?
3. User experience considerations: Do you have any UI/UX design tips for implementing optical zoom so it feels natural and intuitive? For example, how can visual feedback for zoom actions and distance measurements be presented effectively?
4. Performance optimization: What optimization strategies can minimize potential performance issues when implementing both optical zoom and distance measurement simultaneously?
5. Example code and reference materials: Could you share any example code, tutorials, or other resources that demonstrate similar functionality, particularly the combined use of ARKit and AVFoundation?

Thank you.
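On the first question, a partial answer as a sketch: since iOS 16, ARKit can expose the AVCaptureDevice it drives (on supported devices and configurations), so zoom can be adjusted through AVFoundation while the ARSession runs. The clamping below is a hedge against out-of-range zoom factors:

```swift
import ARKit

// Sketch: adjust zoom on the camera ARKit is using. Returns false when the
// device/configuration doesn't allow direct capture-device access.
@discardableResult
func setZoomFactor(_ factor: CGFloat) -> Bool {
    guard let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera else {
        return false
    }
    do {
        try device.lockForConfiguration()
        device.videoZoomFactor = min(max(factor, device.minAvailableVideoZoomFactor),
                                     device.maxAvailableVideoZoomFactor)
        device.unlockForConfiguration()
        return true
    } catch {
        print("Failed to lock camera for configuration: \(error)")
        return false
    }
}
```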
Replies: 1 · Boosts: 0 · Views: 528 · Activity: Oct ’24