Discuss using the camera on Apple devices.

Posts under Camera tag

170 Posts

Info.plist
Hello, I'm trying to get my app to ask the device for permission to access the camera. To do so I created an Info.plist and turned off "Generate Info.plist File" in the Packaging build settings. I then added what I believe are all the necessary keys. However, when I try to build and test on my phone, I keep getting an error saying my app has a missing or invalid CFBundleExecutable in its Info.plist. I tried to fix it by adding Key: CFBundleExecutable, Type: String, Value: $(EXECUTABLE_NAME), but this isn't working. I have already added a bundle identifier (com.name.appname), the bundle version string, and the bundle version. I'm not sure what to add to fix this issue. Is there another way to get the camera to work without having to create an Info.plist, or is this the only way?
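For reference, camera permission itself only needs the NSCameraUsageDescription key, which can be added in the target's Info tab without replacing the generated Info.plist. A minimal sketch of the runtime request (the helper name `requestCameraAccess` is hypothetical; it assumes the usage-description key is present):

```swift
import AVFoundation

// Minimal sketch: request camera access at runtime. Assumes
// NSCameraUsageDescription exists in the target's Info.plist;
// without it, the system terminates the app when capture starts.
func requestCameraAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
    default:
        completion(false) // .denied or .restricted
    }
}
```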
0 replies · 0 boosts · 51 views · 3d
input type="file"でアップロードした画像のGPS情報
input type="file"でアップロードした画像データからGPS情報が除去されます。 こちらはiPadOS16.5.1で発生しておりました。 iPadOS17.4.1、iPadOS18.3では正常にGPS情報が保持されます。 iOS、iPadOSのアップデートでGPS情報を保持するよう修正されたと見受けられますが、リリースノートを参照しても上記修正についての記事を見つけられませんでした。 お手数ですが、上記修正に該当する記事をご教示頂けませんでしょうか。 どうぞよろしくお願い致します。
0 replies · 0 boosts · 82 views · 1w
Torch Freezes Ultra-Wide Camera When Switching Between Wide & Ultra-Wide Lenses (AVFoundation Bug?)
I'm developing an iOS app using AVFoundation for real-time video capture and object detection. While implementing torch functionality with camera switching (between the Wide and Ultra-Wide lenses), I encountered a critical issue where the camera freezes when toggling the torch while the Ultra-Wide camera is active.

Issue:
- If the torch is ON and I switch from Wide to Ultra-Wide, the camera freezes.
- If the Ultra-Wide camera is active and I try to turn the torch ON, the camera freezes.

The iPhone Camera app allows using the torch while recording video with the Ultra-Wide lens, so this should be possible via AVFoundation as well.

Code snippet:

```swift
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    guard let self = self else { return }

    let isSwitchingToUltraWide = !self.isUsingFisheyeCamera
    let cameraType: AVCaptureDevice.DeviceType = isSwitchingToUltraWide ? .builtInUltraWideCamera : .builtInWideAngleCamera
    let cameraName = isSwitchingToUltraWide ? "Ultra Wide" : "Wide"

    guard let selectedCamera = AVCaptureDevice.default(cameraType, for: .video, position: .back) else {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "\(cameraName) camera is not available on this device.")
        }
        return
    }

    do {
        let currentInput = self.videoCapture.captureSession.inputs.first as? AVCaptureDeviceInput
        self.videoCapture.captureSession.beginConfiguration()

        if isSwitchingToUltraWide && self.isFlashlightOn {
            self.forceEnableTorchThroughWide()
        }

        if let currentInput = currentInput {
            self.videoCapture.captureSession.removeInput(currentInput)
        }

        let videoInput = try AVCaptureDeviceInput(device: selectedCamera)
        self.videoCapture.captureSession.addInput(videoInput)
        self.videoCapture.captureSession.commitConfiguration()
        self.videoCapture.updateVideoOrientation()

        DispatchQueue.main.async {
            // `sender` comes from the enclosing action method.
            if let barButton = sender as? UIBarButtonItem {
                barButton.title = isSwitchingToUltraWide ? "Wide" : "Ultra Wide"
                barButton.tintColor = isSwitchingToUltraWide ? UIColor.systemGreen : UIColor.white
            }
            print("Switched to \(cameraName) camera.")
        }

        self.isUsingFisheyeCamera.toggle()
    } catch {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "Failed to switch to \(cameraName) camera: \(error.localizedDescription)")
        }
    }
}
```

Expected behavior:
- The torch should work while the Ultra-Wide camera is active, just as it does in the iPhone Camera app.
- The camera should not freeze when switching between Wide and Ultra-Wide with the torch ON.
- The AVCaptureSession should not crash when toggling the torch while Ultra-Wide is active.

Questions:
- Is this a known issue with AVFoundation?
- How does the iPhone Camera app allow using the torch while recording in Ultra-Wide?
- What is the correct way to switch between the Wide and Ultra-Wide cameras without freezing when the torch is active?

Info:
- Devices tested: iPhone 13 Pro / iPhone 15 Pro / iPhone 15
- iOS versions: iOS 17.3 / iOS 18.0
- Xcode version: 16.2
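One direction that may be worth testing (a sketch, not a confirmed fix): configure the torch on the newly added device only after commitConfiguration() returns, and guard on availability, rather than forcing it while the session is being rebuilt. The helper name `reapplyTorch` is hypothetical:

```swift
import AVFoundation

// Hedged sketch: re-apply the torch on the newly selected device after the
// session configuration is committed. Guarding on hasTorch/isTorchAvailable
// avoids driving a device whose torch is not currently usable.
func reapplyTorch(on device: AVCaptureDevice, enabled: Bool) {
    guard device.hasTorch, device.isTorchAvailable else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = enabled ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}
```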
0 replies · 0 boosts · 140 views · 1w
Camera USB
I am new to using Swift for a Mac application. I am trying to control the focus and other capabilities of an external UVC-compliant camera. However, I'm having trouble with this and don't know where to start. I have downloaded an application from the App Store that can control the focus and other capabilities. I've tried IOKit, but it seems complicated and does not return any capabilities or control the camera. I also tried AVFoundation and was able to open the camera, but the following code did not work for me: device.isFocusPointOfInterestSupported returns false, and without that check the app crashes.

```swift
@IBAction func focusChanged(_ sender: NSSlider) {
    do {
        guard let device = videoDevice else { return }
        try device.lockForConfiguration()

        // Check if focus mode and point of interest are supported
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
        if device.isFocusPointOfInterestSupported {
            // Map the slider value (0.0 to 1.0) to the focus point's X coordinate
            let focusX = CGFloat(sender.doubleValue)
            let focusPoint = CGPoint(x: focusX, y: 0.5) // Y is typically 0.5 (centered vertically)
            device.focusPointOfInterest = focusPoint
        } else {
            print("Focus point of interest is not supported on this device.")
        }

        device.unlockForConfiguration()

        // Log focus settings
        print("Focus point: \(device.focusPointOfInterest)")
        print("Focus mode: \(device.focusMode.rawValue)")
    } catch {
        print("Error adjusting focus: \(error)")
    }
}
```

Any help or advice is much appreciated.
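A small sketch for enumerating external cameras and checking what AVFoundation actually reports for each. Note that AVFoundation on macOS exposes only a limited subset of UVC controls, which is likely why the focus properties report unsupported; apps with full focus/zoom control over UVC cameras generally issue UVC requests over USB themselves, which AVFoundation does not surface:

```swift
import AVFoundation

// Sketch: list external (e.g. UVC) cameras and the focus support
// AVFoundation reports. `.external` requires macOS 14; on earlier
// systems `.externalUnknown` plays the same role.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
for device in discovery.devices {
    print(device.localizedName,
          "| focus POI:", device.isFocusPointOfInterestSupported,
          "| locked focus:", device.isFocusModeSupported(.locked))
}
```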
0 replies · 0 boosts · 161 views · 2w
Use Metal to convert HDR pixel buffers to SDR pixel buffers
I have seen demos that convert HDR video to SDR pixel buffers using AVAssetReader, AVVideoComposition, AVComposition, and other AVFoundation APIs. But in some cases I want to render HDR pixel buffers from a live capture session and record video:

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([videoDevice isVideoHDRSupported]) {
    NSError *error = nil;
    if ([videoDevice lockForConfiguration:&error]) {
        videoDevice.automaticallyAdjustsVideoHDREnabled = NO;
        videoDevice.videoHDREnabled = YES; // enable HDR
        [videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Error: %@", error.localizedDescription);
    }
}
```

Real-time processing of the HDR data requires processing the video frame data (for example, applying filters) while ensuring the processing chain supports 10-bit color depth and HDR metadata, and using the image buffers for object tracking, etc. How can I solve this problem?
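One possible route, rather than a custom Metal shader: VideoToolbox's VTPixelTransferSession (available on iOS since iOS 16) can convert between pixel formats and color spaces per frame. A hedged sketch (the helper name `convertToSDR` is hypothetical, and whether the transfer is properly tone-mapped or a naive matrix conversion may depend on OS version, so verify the output):

```swift
import VideoToolbox

// Hedged sketch: convert a 10-bit HDR CVPixelBuffer into an 8-bit
// BT.709 (SDR) buffer with VTPixelTransferSession (iOS 16+).
var transferSession: VTPixelTransferSession?
VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                             pixelTransferSessionOut: &transferSession)

func convertToSDR(_ hdrBuffer: CVPixelBuffer) -> CVPixelBuffer? {
    guard let transferSession else { return nil }
    var sdrBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(hdrBuffer),
                        CVPixelBufferGetHeight(hdrBuffer),
                        kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                        nil, &sdrBuffer)
    guard let sdrBuffer else { return nil }
    // Tag the destination as BT.709 so the transfer performs the conversion.
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferColorPrimariesKey,
                          kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferTransferFunctionKey,
                          kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
    CVBufferSetAttachment(sdrBuffer, kCVImageBufferYCbCrMatrixKey,
                          kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
    let status = VTPixelTransferSessionTransferImage(transferSession,
                                                     from: hdrBuffer, to: sdrBuffer)
    return status == noErr ? sdrBuffer : nil
}
```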
1 reply · 0 boosts · 162 views · 3w
How can an app use the 'back facing' iOS camera at the same time an Eye Tracking app needs the 'front facing' camera?
While using my Christmas present of a new iPhone and iOS 18.2, I figured I'd try the Eye Tracker app. I've been working with clients successfully using Tobii and other existing eye trackers. In my limited tests, Apple has room for improvement. My main issue is that the Camera app cannot be used at the same time as the Eye Tracker app. I get an error popup from Apple: "Camera in use by another app". The image below is from my app showing that popup message, but the same error occurs in the built-in Camera app. This error is from Apple, not my app. For terminology: the 'front' camera is the one pointing at the user (the selfie camera), while the 'back' camera is the main one with multiple lenses. Eye tracking needs the 'front' camera. It seems that when an app uses the camera, it takes over both the front- and back-facing cameras (since you might swap between them). Thus another app, especially Eye Tracking, cannot use just the front-facing camera at the same time. That limits the use of Eye Tracking; in particular, one cannot take pictures or click any buttons in an app that uses the camera. Does anyone know of a way for an app not to take over both the front and back cameras at the same time? If I could separate them, the Eye Tracker could use the front camera while the camera app uses the back camera.
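For what it's worth, simultaneous front/back capture is possible within a single app on supported hardware, but capture access across separate apps is exclusive, so one app cannot hold the front camera while a different app holds the back. A sketch of the single-app case (the helper name `makeDualCameraSession` is hypothetical):

```swift
import AVFoundation

// Sketch: within ONE app, AVCaptureMultiCamSession can run the front and
// back cameras simultaneously on supported devices. It does not let two
// separate apps share the cameras.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    for position in [AVCaptureDevice.Position.front, .back] {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { continue }
        session.addInput(input)
    }
    session.commitConfiguration()
    return session
}
```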
1 reply · 0 boosts · 169 views · 3w
LockedCameraCapture with ARKit based App
Hello, I'm building a camera app around ARKit. I've created a Lock Screen Capture Extension and added a control to launch my camera app, but when I launch the extension I see just a black screen with no hints of any errors. Attaching the debugger to the running process also shows no logs. I'm wondering: is LockedCameraCapture supported with ARView and ARSession? ARKit was featured in a WWDC video with a camera-app use case, and the introduction of captureHighResolutionFrame(completion:) made me pick it up as an interesting camera-app backbone, but if Lock Screen capture is not possible with it, I will have to refactor my codebase.
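One way to isolate the problem before suspecting ARKit is a minimal extension skeleton. The sketch below is reconstructed from memory of the WWDC24 introduction of the LockedCameraCapture framework, so verify the type names against the current documentation before relying on them:

```swift
import SwiftUI
import LockedCameraCapture

// Hedged sketch of a Locked Camera Capture extension entry point.
// Type names are from memory of the WWDC24 session and should be
// checked against the LockedCameraCapture docs.
@main
struct MyCaptureExtension: LockedCameraCaptureExtension {
    var body: some LockedCameraCaptureExtensionScene {
        LockedCameraCaptureUIScene { session in
            // Start with a trivial view; if this renders, the black screen
            // points at the ARKit setup rather than the extension wiring.
            Text("Capture UI running")
        }
    }
}
```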
1 reply · 0 boosts · 213 views · 3w
Slow Auto Focus on iPhone 16 Pro with ARKit camera
I have recently started testing ARKit on an iPhone 16 Pro, and I have noticed that the AutoFocus reaction on this device is much slower than on other devices. For example, if I point the camera at a close object, AutoFocus takes 4-5 seconds to stabilize; the focal length is adjusted very, very slowly. In some cases (although this is rare) AutoFocus seems almost stuck and requires a bit of device movement to trigger. This is quite problematic when using ARKit features like image and object detection, as the detection algorithms struggle with out-of-focus images. The problem is limited to ARKit: AutoFocus is significantly more responsive when the standard AVFoundation camera API is used. This behavior is easy to reproduce with any of the ARKit samples, like https://developer.apple.com/documentation/arkit/arkit_in_ios/content_anchors/tracking_and_visualizing_planes Is anybody else experiencing this problem?
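ARKit only exposes a coarse focus switch, shown in the sketch below; toggling it off and back on across session runs is sometimes used to nudge a stuck autofocus, though that is a workaround rather than a fix. `sceneView` is a hypothetical ARSCNView owned by the hosting view controller:

```swift
import ARKit

// Sketch: ARKit's only public focus control. There is no per-frame
// focus API comparable to AVCaptureDevice's focus modes.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
configuration.isAutoFocusEnabled = true // defaults to true on supported devices
sceneView.session.run(configuration, options: [])
```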
3 replies · 0 boosts · 297 views · 3w
Macro mode in AVCaptureDevice (custom camera)
Hi, I would like to use macro mode in a custom camera using AVCaptureDevice in my project. This feature might help to automatically adjust and switch between lenses to get a close-up, clear image. It looks like this feature is not available, and there are no public APIs from Apple to achieve macro mode. Is there a way to get this functionality in a custom camera without losing image quality? Please let me know if this is possible. Thank you, Adil Thamarasseri
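One avenue to explore (a sketch under the assumption that automatic lens switching is acceptable; the helper name `makeMacroCapableDevice` is hypothetical): the virtual multi-lens devices can switch constituent cameras automatically, which appears to be how close-up switching reaches third-party capture on macro-capable hardware:

```swift
import AVFoundation

// Sketch: use a virtual device so the system may switch to the lens best
// suited for close focus. On non-Pro devices, .builtInDualWideCamera is
// the analogous virtual device.
func makeMacroCapableDevice() -> AVCaptureDevice? {
    guard let device = AVCaptureDevice.default(.builtInTripleCamera,
                                               for: .video, position: .back) else { return nil }
    do {
        try device.lockForConfiguration()
        // Allow automatic constituent-lens switching (iOS 15+).
        device.setPrimaryConstituentDeviceSwitchingBehavior(
            .auto, restrictedSwitchingBehaviorConditions: [])
        device.unlockForConfiguration()
    } catch {
        print("Configuration failed: \(error)")
    }
    return device
}
```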
7 replies · 1 boost · 391 views · 1w
Combining ARKit Face Tracking with High-Resolution AVCapture and Perspective Rendering on Front Camera
Hello Apple Developer Community,

We're developing an application using the front camera that requires both real-time ARKit face tracking/guidance and the capture of high-resolution still images via AVCaptureSession. Our goal is to leverage ARKit's depth and face data to render a captured image from another perspective post-capture, maintaining high image quality.

Our approach:
- Real-time ARKit guidance: use ARKit (e.g., ARFaceTrackingConfiguration) for continuous face tracking, depth, and scene understanding to guide the user in real time.
- High-resolution capture transition: at the moment of capture, we plan to pause the ARKit session and switch to an AVCaptureSession to take a high-resolution image. We assume that for a front-facing image the subject's face is directly front-on and that the relative pose between the face and camera remains the same during the transition; the only variation we expect is a change in distance. Our intention is to minimize the delay between the last ARKit frame and the high-res capture to maintain temporal consistency.
- Post-processing perspective rendering: using the last ARKit face data (depth, pose, and landmarks) along with the high-resolution 2D image, we aim to render the scene from another perspective using SceneKit or RealityKit, leveraging the collected ARKit scene information to achieve a natural, high-quality rendering from a different viewpoint. The rendering should match the quality of a normally captured high-resolution image, adjusting for the difference in distance.

Our questions:
1. Session transition best practices: what are the recommended practices for seamlessly pausing ARKit and switching to a high-resolution AVCaptureSession on the front camera? How can we minimize user movement or other issues during this brief transition, given our assumption that the face-camera pose remains largely consistent except for distance changes?
2. Data integration for perspective rendering: how can we effectively integrate stored ARKit face, depth, and pose data with the high-res image to perform accurate perspective correction or rendering from another viewpoint? Given that we assume the relative pose is constant except for distance, are there strategies or APIs that leverage this assumption to simplify the perspective transformation?
3. Perspective correction with SceneKit/RealityKit: what techniques or workflows using SceneKit or RealityKit are recommended for correcting the perspective of a captured 2D image based on ARKit scene data, while maintaining image quality and fidelity?
4. Pitfalls and guidelines: what common pitfalls should we be aware of when combining ARKit tracking data with high-res capture and post-processing for perspective rendering? Are there performance considerations, recommended thresholds for acceptable temporal consistency, or validation techniques to ensure the ARKit data remains applicable at the moment of high-res capture?

We appreciate any advice, sample code references, or documentation pointers that could assist us in implementing this workflow effectively. Thank you!
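One note on the transition question: since iOS 16, ARKit can deliver a high-resolution still from the running session, which may remove the need to switch to AVCaptureSession at all. A sketch (the helper name `captureStill` is hypothetical, and it assumes the session's active video format supports high-resolution frame capture):

```swift
import ARKit

// Hedged sketch: capture a full-resolution still from the live ARKit
// session (iOS 16+), avoiding the ARKit -> AVCaptureSession handoff.
func captureStill(from session: ARSession) {
    session.captureHighResolutionFrame { frame, error in
        guard let frame else {
            print("High-res capture failed: \(error?.localizedDescription ?? "unknown")")
            return
        }
        let pixelBuffer = frame.capturedImage // high-resolution CVPixelBuffer
        // Pair with frame.camera.transform and the face/depth anchors from
        // the same frame for the later perspective re-rendering step.
        _ = pixelBuffer
    }
}
```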
2 replies · 0 boosts · 317 views · 3w
PIP Camera in iOS App
I am developing an iOS app with video-call functionality and am implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible. I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
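The capability those apps rely on is likely multitasking camera access, which AVFoundation exposes on iOS 16+; whether it is available depends on the device and app type (video-calling apps are the intended case, and some apps may require an entitlement). A sketch:

```swift
import AVFoundation

// Sketch: opt in to background (multitasking) camera access, iOS 16+.
// isMultitaskingCameraAccessSupported reflects device/app eligibility.
let session = AVCaptureSession()
if session.isMultitaskingCameraAccessSupported {
    session.beginConfiguration()
    session.isMultitaskingCameraAccessEnabled = true
    session.commitConfiguration()
}
```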
1 reply · 0 boosts · 231 views · Jan ’25
Debug View Hierarchy not showing AVCaptureVideoPreviewLayer
I have an iOS application view that contains an AVCaptureSession, an AVCaptureVideoPreviewLayer (created with the AVCaptureSession), and a UIImageView (on the back end, the app takes the output of the AVCaptureSession, runs it through a semantic segmentation model, and displays the output in the UIImageView). When I pause the app and run "Debug View Hierarchy", it shows the UIImageView and the relevant buttons and labels; however, it does not show the AVCaptureVideoPreviewLayer that I set up in my application. Is there some special setup needed to be able to view camera-related features? The following is the part of the view code that renders the AVCaptureVideoPreviewLayer (not sure if this is enough, please let me know if it's not):

```swift
class CameraViewController: UIViewController {
    var session: AVCaptureSession?
    var frameRect: CGRect = CGRect()
    var rootLayer: CALayer! = nil
    private var previewLayer: AVCaptureVideoPreviewLayer! = nil

    init(session: AVCaptureSession) {
        self.session = session
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        setUp(session: session!)
    }

    private func setUp(session: AVCaptureSession) {
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        previewLayer.frame = self.frameRect

        DispatchQueue.main.async { [weak self] in
            self!.view.layer.addSublayer(self!.previewLayer)
            //self!.view.layer.addSublayer(self!.detectionLayer)
        }
    }
}

struct HostedCameraViewController: UIViewControllerRepresentable {
    var session: AVCaptureSession!
    var frameRect: CGRect

    func makeUIViewController(context: Context) -> CameraViewController {
        let viewController = CameraViewController(session: session)
        viewController.frameRect = frameRect
        return viewController
    }

    func updateUIViewController(_ uiView: CameraViewController, context: Context) {}
}
```
3 replies · 0 boosts · 281 views · Jan ’25
Camera settings at calibration time
Hi everyone, I am wondering which settings the camera(s) were set to at the time they were calibrated. For instance, one aspect that is easy to find is the reference resolution of the images taken when calibrating the intrinsics; this is retrieved via intrinsicMatrixReferenceDimensions, making sure the principal point is referenced to the resolution used when the calibration was performed. However, I recently saw that there are focusing modes that potentially displace the lens' physical position. Settings like: AutoFocusRangeRestriction (none, near, far) and setFocusModeLocked (locks the lens position at the specified value and sets the focus mode to a locked state). My concern lies in the impact these focusing lens displacements have on the intrinsic matrix parameters: if the lens is displaced, these parameters no longer describe the camera, since the lens position has changed with respect to the lensPosition set when they were calibrated [0-1]. If my understanding is correct, AutoFocusRangeRestriction is just a range within which the system is allowed to auto-focus, not a specific lens position. Conversely, setFocusModeLocked does indeed fix the lensPosition to a certain value [0-1]. In simple words: what focus lensPosition were the cameras set to when they were calibrated for intrinsics?
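One way to probe this empirically is to compare the per-frame intrinsics AVFoundation delivers against the calibration-time values while the lens position varies. A sketch, assuming `videoOutput` is a configured AVCaptureVideoDataOutput and the helper name `intrinsics(from:)` is hypothetical:

```swift
import AVFoundation
import simd

// Sketch: opt in to per-frame camera intrinsics delivery.
if let connection = videoOutput.connection(with: .video),
   connection.isCameraIntrinsicMatrixDeliverySupported {
    connection.isCameraIntrinsicMatrixDeliveryEnabled = true
}

// In captureOutput(_:didOutput:from:), read the attached 3x3 matrix.
func intrinsics(from sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let data = CMGetAttachment(sampleBuffer,
                                     key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                     attachmentModeOut: nil) as? Data else { return nil }
    return data.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
}
```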
0 replies · 0 boosts · 229 views · Jan ’25
Camera settings at intrinsic calibration time
Hi everyone, I am wondering which settings the camera(s) were set to at the time they were calibrated. For instance, one aspect that is easy to find is the reference resolution of the images taken when calibrating the intrinsics; this is retrieved via intrinsicMatrixReferenceDimensions, making sure the principal point is referenced to the resolution used when the calibration was performed. However, I recently saw that there are focusing modes that potentially displace the lens' physical position. Settings like: AutoFocusRangeRestriction (none, near, far) and setFocusModeLocked (locks the lens position at the specified value and sets the focus mode to a locked state). My concern lies in the impact these focusing lens displacements can have on the intrinsic matrix parameters, in that those parameters may no longer describe the camera once the lens position has changed. In simple words, what focus mode/range were the cameras set to when they were calibrated for intrinsics?
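To see when the lens actually moves (and therefore when calibration-time intrinsics may stop applying), lensPosition can be observed directly, and the lens can be pinned so it stays put. A sketch, assuming `device` is the active AVCaptureDevice:

```swift
import AVFoundation

// Sketch: KVO on lensPosition (0.0 = nearest, 1.0 = farthest).
// Keep the returned observation alive for as long as you want callbacks.
let observation = device.observe(\.lensPosition, options: [.new]) { device, change in
    print("lensPosition:", change.newValue ?? device.lensPosition)
}

// To pin the lens (keeping the focus-dependent geometry stable), lock it:
try device.lockForConfiguration()
device.setFocusModeLocked(lensPosition: 0.5) { _ in
    print("Lens locked")
}
device.unlockForConfiguration()
```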
0 replies · 0 boosts · 280 views · Jan ’25
Can't swap camera with AVCaptureMultiCamSession
I have a PIP camera that streams from the front and back cameras based on AVCaptureMultiCamSession. It works fine, but when I go to swap the camera, it crashes. This is code that works with a single camera, so I'm not sure what is wrong; the object also appears valid in the debugger. This is the snippet where the camera is swapped:

```swift
private func updateSessionConfiguration() {
    guard isCaptureSessionConfigured else { return }

    captureSession.beginConfiguration()
    defer { captureSession.commitConfiguration() }

    // Remove all current inputs
    for input in captureSession.inputs {
        if let deviceInput = input as? AVCaptureDeviceInput {
            captureSession.removeInput(deviceInput)
            app_log("removing input for \(input)")
        }
    }

    // Add the primary device input
    if let deviceInput = deviceInputFor(device: captureDevice) {
        app_log("device input \(deviceInput)")
        if !captureSession.inputs.contains(deviceInput), captureSession.canAddInput(deviceInput) {
            captureSession.addInput(deviceInput)
        }
    }

    if let secondaryDeviceInput = deviceInputFor(device: secondaryCaptureDevice) {
        app_log("Secondary device input \(secondaryDeviceInput)")
        if !captureSession.inputs.contains(secondaryDeviceInput), captureSession.canAddInput(secondaryDeviceInput) {
            captureSession.addInput(secondaryDeviceInput)
        }
    }

    updateVideoOutputConnection()
}
```

It crashes at captureSession.addInput(deviceInput) with: Thread 10: EXC_BAD_ACCESS (code=1, address=0xcaeb36b964f0), which is strange because canAddInput is checked prior to this call. Totally stumped here; please help. Not sure if this is an AVCaptureMultiCamSession issue or something else.
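One thing that may be worth trying (a sketch, not a confirmed fix): with AVCaptureMultiCamSession, inputs are typically added without implicit connections and then wired up explicitly, so the session never tries to auto-form connections mid-swap. The helper name `attach(_:to:feeding:)` is hypothetical:

```swift
import AVFoundation

// Hedged sketch: add a multicam input with no implicit connections,
// then connect its video port to the output explicitly.
func attach(_ device: AVCaptureDevice,
            to session: AVCaptureMultiCamSession,
            feeding videoOutput: AVCaptureVideoDataOutput) throws {
    let input = try AVCaptureDeviceInput(device: device)
    guard session.canAddInput(input) else { return }
    session.addInputWithNoConnections(input)

    guard let port = input.ports(for: .video,
                                 sourceDeviceType: device.deviceType,
                                 sourceDevicePosition: device.position).first else { return }
    let connection = AVCaptureConnection(inputPorts: [port], output: videoOutput)
    if session.canAddConnection(connection) {
        session.addConnection(connection)
    }
}
```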
4 replies · 0 boosts · 257 views · Jan ’25
How to confidently select one type of camera on iOS
We have a web application that uses high-resolution images to validate the authenticity of products. For this purpose we want to use the best camera to capture those high-resolution images; on iPhone Pro devices this is the ultra-wide camera. The issue we have is how to confidently select that camera from the list returned by navigator.mediaDevices.enumerateDevices. We can't use the device ID, as it changes every time (and for every user). We could use the camera name, but the string is translated into the device language, which is very problematic. We could also just select a specific item in the list, but we are not sure the order is preserved, and it makes it hard to deal with other iPhone models that don't have an ultra-wide camera. Selecting a specific camera looks like an essential feature, and not only for us. What is the best option? We are looking for something that is future-proof and easily scalable.
0 replies · 0 boosts · 268 views · Dec ’24
Camera feed access issue from web content in Autofill extension
I am working on a task to add a WKWebView to an Autofill extension. The web view presents web content that accesses the camera feed. I have added the camera permission entries to both the main app's and the Autofill extension's Info.plist. The camera feed is accessed properly from the main app; however, doing the same in the Autofill extension does not show the camera stream in the web content. I receive the camera permissions alert and allow permission, but it just gets stuck on a black screen, and in the console I see these logs:

```
0x116000a00 - GPUProcessProxy::didClose:
0x116000a00 - GPUProcessProxy::gpuProcessExited: reason=Crash
0x1150180c0 - [PID=1523] WebProcessProxy::gpuProcessExited: reason=Crash
Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}>
0x115020360 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'GPUProcess Background Assertion' for process with PID=1524, error: Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}
0x1160012a0 - GPUProcessProxy::didClose:
0x1160012a0 - GPUProcessProxy::gpuProcessExited: reason=Crash
0x1150180c0 - [PID=1523] WebProcessProxy::gpuProcessExited: reason=Crash
Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}>
0x115020300 - ProcessAssertion::acquireSync Failed to acquire RBS assertion 'GPUProcess Background Assertion' for process with PID=1525, error: Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}
```

It looks like the WKWebView crashes. Here is my configuration for the WKWebView:

```swift
let webConfiguration = WKWebViewConfiguration()
webConfiguration.allowsInlineMediaPlayback = true
webConfiguration.mediaTypesRequiringUserActionForPlayback = []

let webView = WKWebView(frame: .zero, configuration: webConfiguration)
webView.navigationDelegate = self
webView.uiDelegate = self
webView.scrollView.isScrollEnabled = false
webView.contentMode = .scaleAspectFit
view.addSubview(webView)
```

Does anyone know what the problem might be? Is it even possible to access the camera from web content in an Autofill extension?
0 replies · 0 boosts · 282 views · Dec ’24