Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.


Get cameraCalibrationData with .builtInDualWideCamera when isGeometricDistortionCorrectionEnabled = true
Hi there, I am building a camera application to capture an image with the wide and ultra wide cameras simultaneously (or as close to simultaneously as possible), with the intrinsics and extrinsics for each camera delivered as well. We are able to achieve this with an AVCaptureMultiCamSession and AVCaptureVideoDataOutput, setting up the .builtInWideAngleCamera and .builtInUltraWideCamera manually. Doing this, we are able to enable delivery of the intrinsics via the AVCaptureConnection of each camera, and geometric distortion correction is enabled for the ultra wide camera (by default). However, we are exploring whether it is possible to move the application over to the .builtInDualWideCamera with AVCapturePhotoOutput and AVCaptureSession, to simplify our application and get access to depth data. We are using the isVirtualDeviceConstituentPhotoDeliveryEnabled = true property to allow for simultaneous capture. Functionally, everything works fine, except that unless isGeometricDistortionCorrectionEnabled is set to false, photoOutput.isCameraCalibrationDataDeliverySupported returns false. From this thread and the docs, it appears that we cannot get the intrinsics when isGeometricDistortionCorrectionEnabled = true (only applicable to the ultra wide) unless we use an AVCaptureVideoDataOutput. Is there any way to get access to the intrinsics for the wide and ultra wide while enabling geometric distortion correction for the ultra wide?

guard let captureDevice = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) else {
    throw InitError.error("Could not find builtInDualWideCamera")
}
self.captureDevice = captureDevice
self.videoDeviceInput = try AVCaptureDeviceInput(device: captureDevice)
self.photoOutput = AVCapturePhotoOutput()
self.captureSession = AVCaptureSession()
self.captureSession.beginConfiguration()
captureSession.sessionPreset = AVCaptureSession.Preset.hd1920x1080
captureSession.addInput(self.videoDeviceInput)
captureSession.addOutput(self.photoOutput)

try captureDevice.lockForConfiguration()
captureDevice.isGeometricDistortionCorrectionEnabled = false // <- NB line
captureDevice.unlockForConfiguration()

/// configure photoOutput
guard self.photoOutput.isVirtualDeviceConstituentPhotoDeliverySupported else {
    throw InitError.error("Dual photo delivery is not supported")
}
self.photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled = true
print("isCameraCalibrationDataDeliverySupported", self.photoOutput.isCameraCalibrationDataDeliverySupported) // false when distortion correction is enabled

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))
if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}

self.videoPreviewLayer.setSessionWithNoConnection(self.captureSession)
self.videoPreviewLayer.videoGravity = AVLayerVideoGravity.resizeAspect
let cameraVideoPreviewLayerConnection = AVCaptureConnection(inputPort: self.videoDeviceInput.ports.first!, videoPreviewLayer: self.videoPreviewLayer)
self.captureSession.addConnection(cameraVideoPreviewLayerConnection)

self.captureSession.commitConfiguration()
self.captureSession.startRunning()
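For reference, a minimal sketch (untested, names taken from the snippet above) of the settings-level calibration request; as described in the post, it only becomes available once isGeometricDistortionCorrectionEnabled is false, because the setting requires photoOutput.isCameraCalibrationDataDeliverySupported to be true:

// Request per-constituent-camera calibration at capture time.
let settings = AVCapturePhotoSettings()
settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = captureDevice.constituentDevices
if photoOutput.isCameraCalibrationDataDeliverySupported {
    settings.isCameraCalibrationDataDeliveryEnabled = true
}
photoOutput.capturePhoto(with: settings, delegate: self)

// Each constituent photo then carries its own calibration in the delegate:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    if let calibration = photo.cameraCalibrationData {
        print(calibration.intrinsicMatrix, calibration.extrinsicMatrix)
    }
}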
Replies: 1 · Boosts: 0 · Views: 590 · Nov ’23
Request for Access to Apple Photos API
Hello, I'm currently investigating the possibility of accessing my photos stored in iCloud via a dedicated API, in order to create a photo portfolio. However, after extensive research, I haven't found any documentation or public API allowing such access. I wonder if there are any future plans to make such an API available to third-party developers. I would be grateful if you could provide me with information regarding the possibility of accessing an API for Apple Photos, or any other solution you might suggest. Thank you for your attention and assistance. Yours sincerely, Owen
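As the post notes, no public iCloud Photos web API appears to exist; the closest approach today is on-device access through PhotoKit. A minimal sketch, assuming the portfolio tool can run on an Apple device:

import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    guard status == .authorized else { return }
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    // Fetches image assets available to this device (not a web API).
    let assets = PHAsset.fetchAssets(with: .image, options: options)
    print("Fetched \(assets.count) images")
}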
Replies: 0 · Boosts: 0 · Views: 541 · Nov ’23
API for taking panorama & stitching ios 14 style?
A few versions of iOS ago, the stitching algorithm for panoramas was updated, and in my opinion it produces results that look worse for what I use the panoramas for. I was exploring developing a custom panorama app but couldn't find an API for taking panoramic photos, much less for stitching them. Is there an API in AVFoundation or elsewhere for capturing a panoramic photo and stitching it?
Replies: 1 · Boosts: 0 · Views: 553 · Nov ’23
PHLivePhoto.request using custom targetSize parameter cause memory leaks
My app uses PHLivePhoto.request to generate Live Photos, but memory leaks if I pass a custom targetSize:

PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl],
                    placeholderImage: nil,
                    targetSize: targetSize,
                    contentMode: .aspectFit) { [weak self] livePhoto, info in
    // ...
}

Changing targetSize to CGSize.zero resolves the problem:

PHLivePhoto.request(withResourceFileURLs: [imageUrl, videoUrl],
                    placeholderImage: nil,
                    targetSize: .zero,
                    contentMode: .aspectFit) { [weak self] livePhoto, info in
    // ...
}
Replies: 2 · Boosts: 0 · Views: 544 · Nov ’23
how to get the proraw image output with 1:1, 16:9
Now I use the AVFoundation framework to get the photo output, but the image aspect ratio is 4:3. However, the Camera app on the iPhone 13 Pro offers several aspect ratios when taking a ProRAW image: 4:3, 16:9, and 1:1. So how can I get a 1:1 or 16:9 aspect ratio ProRAW image? After doing some research, I find that no matter which camera you use on the iPhone 11, 12, 13, 14, or 15 (Pro or otherwise), the resulting image is always 4:3, and the 1:1 and 16:9 variants come from cropping the 4:3 image. If that is true, how can I crop the ProRAW file without any data loss?

My development environment:
iPhone 13 Pro
iOS 16.7
Xcode 14.3.1

This is the session configuration code for the camera device:

session.beginConfiguration()

/*
 Do not create an AVCaptureMovieFileOutput when setting up the session because
 Live Photo is not supported when AVCaptureMovieFileOutput is added to the session.
*/
session.sessionPreset = .photo

// Add video input.
do {
    var defaultVideoDevice: AVCaptureDevice?
    if let backCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
        // If a rear dual camera is not available, default to the rear wide angle camera.
        defaultVideoDevice = backCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the rear wide angle camera isn't available, default to the front wide angle camera.
        defaultVideoDevice = frontCameraDevice
    }
    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if session.canAddInput(videoDeviceInput) {
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
    } else {
        print("Couldn't add video device input to the session.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
} catch {
    print("Couldn't create video device input: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

// Check the lens list.
let camerasOptions = videoDeviceDiscoverySession.devices
var availableCameras: [AVCaptureDevice.DeviceType] = []
if camerasOptions.isEmpty {
    print("no camera devices")
} else {
    for camera in camerasOptions {
        if camera.deviceType == .builtInUltraWideCamera || camera.deviceType == .builtInWideAngleCamera || camera.deviceType == .builtInTelephotoCamera {
            if !availableCameras.contains(camera.deviceType) {
                availableCameras.append(camera.deviceType)
            }
        }
    }
}
DispatchQueue.main.async {
    self.lensList = availableCameras
}

// Add the photo output.
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    photoOutput.isHighResolutionCaptureEnabled = true
    photoOutput.maxPhotoQualityPrioritization = .quality
    print(photoOutput.isAppleProRAWSupported)
    // Use the Apple ProRAW format when the environment supports it.
    photoOutput.isAppleProRAWEnabled = photoOutput.isAppleProRAWSupported
    DispatchQueue.main.async {
        self.isSupportAppleProRaw = self.photoOutput.isAppleProRAWSupported
    }
} else {
    print("Could not add photo output to the session")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

session.commitConfiguration()
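If the sensor output really is always 4:3, one hedged post-capture sketch is to develop the DNG with CIRAWFilter and crop at render time. Note this yields a developed (demosaiced) image, not a smaller DNG, so it is not strictly lossless; a truly lossless RAW crop would mean rewriting DNG metadata. The function name is hypothetical:

import CoreImage

// Develop a ProRAW DNG and center-crop it to a target aspect ratio
// (e.g. 1.0 for 1:1, 16.0/9.0 for 16:9).
func developCropped(dngURL: URL, aspect: CGFloat) -> CGImage? {
    guard let rawFilter = CIRAWFilter(imageURL: dngURL),
          let fullImage = rawFilter.outputImage else { return nil }
    let extent = fullImage.extent
    let cropWidth = min(extent.width, extent.height * aspect)
    let cropHeight = cropWidth / aspect
    let cropRect = CGRect(x: extent.midX - cropWidth / 2,
                          y: extent.midY - cropHeight / 2,
                          width: cropWidth,
                          height: cropHeight)
    let cropped = fullImage.cropped(to: cropRect)
    return CIContext().createCGImage(cropped, from: cropped.extent)
}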
Replies: 1 · Boosts: 0 · Views: 479 · Oct ’23
AVCaptureMultiCamSession and unstable nominalFrameRate of videoAssetTracks
Why, when I record and mix videos from two cameras simultaneously using the AVMultiCamPiP sample app as my guide, do the nominalFrameRates of the videoAssetTracks I record not have a fixed value (e.g. 30), but instead float between 20 and 30? The active format I load on both AVCaptureDevices supports that rate (e.g. 'vide'/'420v' 1920x1080, { 1-30 fps }, ...), and I set both min and max frame duration to 30 fps (1, 30).
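For reference, a minimal sketch of pinning the frame rate, assuming device is one of the two AVCaptureDevices; note that multi-cam sessions may still drop frames under load, which would lower the recorded track's nominalFrameRate:

// Pinning min and max frame duration to the same value requests a fixed rate.
let frameDuration = CMTime(value: 1, timescale: 30) // 30 fps
try device.lockForConfiguration()
device.activeVideoMinFrameDuration = frameDuration
device.activeVideoMaxFrameDuration = frameDuration
device.unlockForConfiguration()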
Replies: 0 · Boosts: 0 · Views: 313 · Oct ’23
iPhone 14 pro blurry image issue
We are developing an image classification iOS app that uses a TensorFlow model. The model's predictions depend on the quality of the captured image, and we are facing an issue on iPhone 14 Pro models: captured images are blurry. According to this link, https://www.pcmag.com/news/apple-promises-fix-for-iphone-14-pro-camera-shake-bug, the blurry-image issue was fixed by Apple, but we are still seeing it on iPhone 14 Pro models running iOS 16.2 and above. We also see the issue in WhatsApp (and other third-party applications). Is there any official documentation on how to open the camera on the iPhone 14 Pro programmatically? Note: the issue does not occur in Apple's Camera app. Is this a hardware or a software issue? Is it fixed on the iPhone 15? Can you please guide us on this issue? Thanks, Jay
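There is no iPhone 14 Pro-specific camera API; one hedged guess is that close-range shots stay blurry when an app pins a single physical lens, since the system can only auto-switch lenses (as the built-in Camera app does) when given a virtual device. A minimal device-selection sketch under that assumption:

// Prefer the virtual triple camera so the system can switch lenses
// automatically for close-up shots (assumes a Pro-model device).
let device = AVCaptureDevice.default(.builtInTripleCamera, for: .video, position: .back)
    ?? AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)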
Replies: 0 · Boosts: 0 · Views: 324 · Oct ’23
how to execute photoLibraryDidChange(_:) when the app is in background?
The goal is to get/save a captured photo (from the default Camera app) in my app immediately, while my app is in the background. When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that moment. When I reopen my app, the function executes and delivers the images that were captured in the meantime. How can I get photoLibraryDidChange(_:) to execute while the app is in the background?
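For reference, a minimal sketch of change-observer registration, using a hypothetical helper class; note that iOS suspends ordinary apps in the background, so PhotoKit generally delivers the change notification when the app returns to the foreground rather than immediately:

import Photos

final class LibraryWatcher: NSObject, PHPhotoLibraryChangeObserver {
    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }
    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Diff against a previously stored PHFetchResult here.
    }
}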
Replies: 1 · Boosts: 0 · Views: 382 · Oct ’23
Accessing "From my mac" in PhotoKit
Is it possible to access "From my mac" photos/PHAssetCollection through PhotoKit in iOS? "From my mac" photos/videos are media synced from a mac where iCloud Photos are turned off on the iOS device, like what we did in the ole' days before iCloud Photos. I have set up an iOS device with "From my mac" albums present in Photos.app, but in my own app I don't seem to be able to access those collections/photos through PhotoKit using all the defined PHAssetCollectionTypes. Are these directly synced photos simply not available through PhotoKit and you would have to revert to the deprecated ALAssetLibrary?
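A minimal enumeration sketch for checking what PhotoKit actually exposes on the device; if "From my mac" collections are visible at all, they would have to show up under one of these types and subtypes:

import Photos

let types: [PHAssetCollectionType] = [.album, .smartAlbum]
for type in types {
    let collections = PHAssetCollection.fetchAssetCollections(with: type, subtype: .any, options: nil)
    collections.enumerateObjects { collection, _, _ in
        print(type.rawValue,
              collection.assetCollectionSubtype.rawValue,
              collection.localizedTitle ?? "untitled")
    }
}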
Replies: 4 · Boosts: 0 · Views: 613 · Oct ’23
Crash in photolibraryd Process During Import in Photos.app & PHAsset.fetchAssets() Call
Problem: While calling PHAsset.fetchAssets() and iterating over its results, if Photos.app is simultaneously running an import operation (File | Import), the photolibraryd process crashes. I have already flagged this issue to Apple (FB13178379) but wanted to check if anyone else has encountered this behavior.

Steps to Reproduce:
1. Initiate an import in Photos.app.
2. Run the following code snippet in a Playground:

import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { authorizationStatus in
    guard authorizationStatus == .authorized else { return }
    let fetchResult = PHAsset.fetchAssets(with: nil)
    print(fetchResult)
    for i in 0..<fetchResult.count {
        print(fetchResult[i])
    }
}

Upon doing this, I consistently receive the error "Connection to assetsd was interrupted - photolibraryd exited, died, or closed the photo library" in the Console, causing my code to hang.

Environment:
macOS Version: 13.5.2 (22G91)
Xcode Version: 14.3.1 (14E300c)

Additional Info: After the crash, a report pops up in the Console, and typically the Photos.app import operation freezes too. I've noticed that after terminating all processes and/or rebooting, Photos.app displays "Restoring from iCloud…" and the recovery process lasts overnight.

Seeking Suggestions: I'm exploring potential workarounds for this issue. I've attempted to use fetchLimit to obtain assets in batches, but the problem persists. Is there a method to detect that Photos.app is executing an import, allowing my process to wait until completion? Alternatively, can I catch the photolibraryd crash and delay until it's restored? I'm operating in a batch-processing mode for photos, so pausing and retrying later in the event of a Photos.app import isn't an issue. Any guidance or shared experiences would be greatly appreciated! Cheers and happy coding!
Replies: 1 · Boosts: 0 · Views: 526 · Sep ’23
AVFoundation low-light mode to match native camera's
I'm trying to get AVFoundation to provide the same low-light boost as the native camera app. I have tried:
- isLowLightBoostSupported (which always returns false, regardless of my configuration)
- setExposureTargetBias set to max
- AVCaptureSessionPreset set to various options

all without success. Is automaticallyEnablesLowLightBoostWhenAvailable outdated? The documentation claims a "capture device that supports this feature may only engage boost mode for certain source formats or resolutions", but with no explanation of which formats or resolutions. I can't tell if the option is no longer supported, or if there is some configuration permutation I haven't yet found.
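A probing sketch under the documented caveat that support depends on the active format: this iterates formats to find one where the boost is supported. It is heavy-handed, and changing activeFormat overrides the session preset, so treat it as diagnostic only:

// Assumes `device` is the active AVCaptureDevice.
try device.lockForConfiguration()
for format in device.formats {
    device.activeFormat = format
    if device.isLowLightBoostSupported {
        // Only safe to set when isLowLightBoostSupported is true.
        device.automaticallyEnablesLowLightBoostWhenAvailable = true
        break
    }
}
device.unlockForConfiguration()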
Replies: 1 · Boosts: 0 · Views: 397 · Sep ’23
setExposureTargetBias handler problem
Hello everybody! I use a video session, and in the delegate method didOutputSampleBuffer I get a CMSampleBuffer. That all works. But at some point I want to change the exposure to -2 or +2 and make images from the buffer. I use the setExposureTargetBias method and everything changes successfully, but there is one problem: I need to know exactly when a buffer with the changed value (-2.0, +2.0, etc.) will arrive. The documentation says to use the CMTime syncTime from the setExposureTargetBias completion handler, which should be the timestamp of the first buffer with the changes applied, but I see a complete mismatch: when I compare that syncTime with the buffers from didOutputSampleBuffer, the exposure is still in the process of changing. How can I determine that a given buffer (by its timestamp) is one to which the exposure bias change has already been applied?
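A minimal sketch of the intended comparison, assuming device is the capture device and frames arrive in captureOutput(_:didOutput:from:); both timestamps should be on the session's synchronization clock:

var exposureAppliedTime = CMTime.invalid

try device.lockForConfiguration()
device.setExposureTargetBias(2.0) { syncTime in
    // syncTime is documented as the timestamp of the first frame
    // with the new bias applied.
    exposureAppliedTime = syncTime
}
device.unlockForConfiguration()

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if exposureAppliedTime.isValid && pts >= exposureAppliedTime {
        // This buffer should reflect the new exposure bias.
    }
}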
Replies: 0 · Boosts: 0 · Views: 312 · Aug ’23
Understanding PHPickerConfiguration.AssetRepresentationMode.current
The documentation for this API mentions: The system uses the current representation and avoids transcoding, if possible. What are the scenarios in which transcoding takes place? The reason for asking is that we've had a user reaching out saying they selected a video file from their Photos app, which resulted in a decrease in size from ~110MB to 35MB. We find it unlikely it's transcoding-related, but we want to gain more insights into the possible scenarios.
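For context, a minimal sketch of opting into the current representation. One documented transcoding scenario is the .compatible mode converting HEVC video to H.264 (and HEIC to JPEG), which could explain a large size change if the mode isn't set to .current:

import PhotosUI

var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.filter = .videos
// Ask for the asset as stored; the system avoids transcoding "if possible".
configuration.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: configuration)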
Replies: 3 · Boosts: 1 · Views: 652 · Aug ’23