Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

Post | Replies | Boosts | Views | Created

WWDC25 Camera & Photos group lab summary (Part 1 of 3)
(Note: this is part 1 of a 3-part posting. See Part 2 or Part 3.)

At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Camera & Photos, which ran for one hour at 6 PM PST on Tuesday, June 10th, 2025.

Introductory kick-off questions

Question 1: Tell us a little about the new AVFoundation Capture APIs we've made available in the new iOS 26 developer preview?
- Cinematic Capture API (strong/weak focus, tracking focus, scene monitoring, simulated aperture, dog/cat heads/groupIDs)
- Camera Controls and AirPod stem clicks
- Spatial Audio and studio-quality AirPod mics in Camera
- Lens Smudge Detection
- Exposure and Focus Rect of Interest

Question 2: I built QR code scanning into my app, but on newer iPhones I have to hold the phone very far away from the QR code, otherwise the image is blurry and it doesn't scan. Why is this happening and how can I fix it?
- Every year the cameras get better and better as we push the state of the art on iPhone photography and videography. This sometimes results in changes to the characteristics of the lenses: minimum focus distance, multiple lenses on newer phones, and automatic switching behavior.
- Use a virtual device like builtInDualWideCamera or builtInTripleCamera, rather than just builtInWideAngleCamera, set the videoZoomFactor to 2, and you're done.

Question 3: Last year, we saw some exciting new APIs introduced in AVFoundation in the health space. With Constant Color photography, developers can take pictures that have constant color regardless of ambient lighting. There are some further advancements this year. Davide, could you tell us about them?
- Constant Color photography is meant to remove the "tone mapping" applied to photographs captured with the Camera app, which usually includes artistic intent, and instead be as close as possible to the real color of the scene, regardless of the illumination.
- Constant Color images could be captured in HEIF and JPEG last year. This year we are adding support for the DICOM medical imaging photo format, a format used by the health industry to store images related to medical subjects like MRI scans, skin conditions, X-rays, and so on.
- DICOM is a writable and readable format on all OS 26 releases, supported through the AVCapturePhotoOutput APIs and through the Core Graphics API. For Core Graphics there is a new DICOM entry in the property dictionary which includes all the available DICOM properties defined in a file. Finder will also display those in the Info panel.
- Why would a developer want to use it? Not for regular picture-taking apps - for those, HEIF and JPEG are the preferred delivery formats. Use DICOM if your app produces health-related output that you may also want to share with health providers or your doctors.

Main session developer questions

Question 1: LiDAR vs. Dual Camera depth generation: Which resolution does the LiDAR sensor natively have (iPhone 16 Pro), and when should you prefer LiDAR over Dual Camera?
- Both report formats with output resolutions (we don't advertise sensor resolution).
- LiDAR: best for absolute depth, real-world scale, and computer vision.
- Dual and similar cameras: relative, disparity-based depth; less power; photo effects.
- Also see the 2022 WWDC session "Discover advancements in iOS camera capture: Depth, focus and multitasking".

Question 2: Can TrueDepth and LiDAR cameras run at 60 fps?
- LiDAR can do 30 fps; the front TrueDepth camera can do 60 fps.

Question 3: What's the first-class way to use PhotoKit to reimplement a high-performance photo grid? We've been using a LazyVGrid and the photos caching manager, but are never able to hit the holy trinity (60 Hz, efficient memory footprint, minimal flashes of placeholder/empty cells).
- Use the PHCachingImageManager to get media content delivered before you need to display it.
- Specify the size you need for grid-sized display.
- Set the options PHVideoRequestOptionsDeliveryModeFastFormat, PHImageRequestOptionsDeliveryModeFastFormat and PHImageRequestOptionsResizeModeFast. (See the sketch after this summary.)

Question 4: For rendering a live preview of the video stream, is there performance overhead from using async and SwiftUI for image updates vs. UIViewRepresentable + AVCaptureVideoPreviewLayer?
- AVCaptureVideoPreviewLayer is the most efficient display path.
- Use a video data output (VDO) + AVSampleBufferDisplayLayer if you need to modify the image data.
- SwiftUI Image is optimized for static image content.

Question 5: Is there a way to configure the AVFoundation builtInLiDARDepthCamera mode to provide a depth map as accurate as ARKit at close range?
- AVCaptureDepthDataOutput supports filtering that reduces noise and fills in invalid values. Consider using this for smoother depth maps.

Question 6: Pyramid-based photo editing in Core Image (such as Adobe Camera Raw highlights and shadows)?
- First off, you may want to look at the built-in filter called CIHighlightShadowAdjust.
- The noise reduction in CIRAWFilter also uses a pyramid-based algorithm.
- You can also write your own pyramid-based algorithms: take an input image, downsample it by two multiple times using imageByApplyingAffineTransform, apply additional CIKernels to each downsampled image as needed, then use a custom CIKernel to combine the results.

Question 7: Is the best way to integrate an in-app camera for a "non-camera" app UIImagePickerController?
- Yes, UIImagePickerController provides system-provided UI for capturing photos and movies.

Question 8: My question is on Deferred Photo Processing. Say I have a photo capture app that adds a CIFilter to the capture. How can I take advantage of Deferred Photo Processing, since I don't know how to detect when the deferred captured photo is ready?
- The CIFilter can be applied to the final image at that point.
- The photo will have to be re-inserted into the Photos library as an adjustment.

Question 9: For shipping photo-style assets in the app that need transparency, what is the best format to use? JPEG 2000? Will moving to this save a lot of space compared to PNG or other options?
- If you want lossless compression, PNG is good and supports unpremultiplied alpha.
- If you want lossy compression, HEIF supports premultiplied or unpremultiplied alpha.

(Note: this is part 1 of a 3-part posting. See Part 2 or Part 3.)
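As referenced in the Question 3 answer above, here is a minimal sketch of the PHCachingImageManager approach for a photo grid. The cell size, class name, and surrounding grid wiring are assumptions for illustration, not part of the lab answer.

```
import Photos
import UIKit

// Sketch of pre-warming and requesting grid-sized thumbnails with PHCachingImageManager.
final class GridImageLoader {
    private let cachingManager = PHCachingImageManager()
    private let options: PHImageRequestOptions = {
        let options = PHImageRequestOptions()
        options.deliveryMode = .fastFormat   // PHImageRequestOptionsDeliveryModeFastFormat
        options.resizeMode = .fast           // PHImageRequestOptionsResizeModeFast
        return options
    }()
    // Request at the grid-cell size (in pixels), not full resolution.
    private let targetSize = CGSize(width: 300, height: 300)

    // Pre-warm the cache for assets that are about to scroll on screen.
    func startCaching(_ assets: [PHAsset]) {
        cachingManager.startCachingImages(for: assets,
                                          targetSize: targetSize,
                                          contentMode: .aspectFill,
                                          options: options)
    }

    // Deliver a grid-sized image for a visible cell; the returned ID can be
    // used with cancelImageRequest(_:) when the cell is reused.
    func requestImage(for asset: PHAsset,
                      completion: @escaping (UIImage?) -> Void) -> PHImageRequestID {
        return cachingManager.requestImage(for: asset,
                                           targetSize: targetSize,
                                           contentMode: .aspectFill,
                                           options: options) { image, _ in
            completion(image)
        }
    }

    // Stop caching assets that scrolled far off screen to bound memory use.
    func stopCaching(_ assets: [PHAsset]) {
        cachingManager.stopCachingImages(for: assets,
                                         targetSize: targetSize,
                                         contentMode: .aspectFill,
                                         options: options)
    }
}
```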
0
0
407
Jul ’25
WWDC25 Camera & Photos group lab summary (Part 2 of 3)
(Note: this is part 2 of a 3-part posting. See Part 1 or Part 3.)

At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Camera & Photos, which ran for one hour at 6 PM PST on Tuesday, June 10th, 2025.

Question 10: Can we directly integrate auto-capture triggers (e.g., when the image is steady or text is detected) using Vision and AVFoundation?
- Yes. Apps can use AVCaptureSession's video data output together with AVCapturePhotoOutput, run Vision on the video data output buffers, and capture a photo when a certain scene or text is detected. Just be careful to run Vision on the buffers asynchronously so it doesn't cause frame drops. (A sketch of this pattern follows after this summary.)

Question 11: What Camera or Photos framework features support working with images from external media, like connected cameras or SD cards? Any best practices?
- The ImageCaptureCore framework supports camera devices, memory cards, and scanners, for read and write where supported.
- Check out the docs to see how to browse connected devices, folders, files, etc.

Question 12: Hi Brad, to follow up on your SwiftUI cautionary note: using AVCaptureVideoPreviewLayer inside a UIViewRepresentable is okay, right? Thanks all for the great info!
- Yes, this is totally fine. AppKit or UIKit views inside the appropriate SwiftUI representables should give equivalent performance.

Question 13: What's the "right" way to transition media in my photos app between HDR modes? When I'm in a one-up view, we use HDR, but in other contexts (like thumbnails) we don't want HDR. Is there a nice way to tone map?
- There's a suite of new System Tone Mapper APIs in this year's OSes: CoreImage, ImageKit, CoreAnimation, and CoreGraphics.
- For example: CoreImage has the new CISystemToneMap filter; CoreAnimation has layer.preferredDynamicRange = CADynamicRangeConstrainedHigh.
- Image views (NSImageView/UIImageView/SwiftUI Image/CALayer) support animations on preferredDynamicRange and can go from high to constrained to standard.
- Tone mapping is provided by the system (CISystemToneMap for a controllable example).

Question 14: What is your recommendation to preprocess and upscale a depth map in order to render a realistic portrait mode image?
- One way to do this: the CIEdgePreserveUpsample CIFilter can be used to upsample a lower-resolution depth map by using a higher-resolution RGB image as a guide.

Question 15: For buffering frames for later processing from real-time camera output, should we prefer an AVSampleBufferDisplayLayer-centered approach or an AVCaptureVideoDataOutputSampleBufferDelegate-centered approach? When would we use each?
- AVSampleBufferDisplayLayer and AVCaptureVideoDataOutputSampleBufferDelegate are used hand in hand for custom camera preview.
- For buffering for later processing, make copies of the video data output buffers so you don't drop frames from the output.

Question 16: My question is on Deferred Photo Processing. Say I have a photo capture app that adds a CIFilter to the capture. How can I take advantage of Deferred Photo Processing, since I don't know how to detect when the deferred captured photo is ready?
- The CIFilter can be applied to the final image at that point.
- The photo will have to be re-inserted into the Photos library as an adjustment.

Question 17: Is digital zoom (e.g., 1.5x) before taking a photo the same as cropping the photo afterward?
- Digital zoom upscales the image to the output dimensions, while cropping yields a smaller output image. In other words, digital zoom crops and then upscales.

Question 18: How do you design camera interfaces that work for both casual users and photography enthusiasts?
- Progressive disclosure: put the most common controls up front, and make it easy for pros to drill down.
- Sensible defaults: choose defaults that work well for casual users, but allow those defaults to be modified for photography enthusiasts.
- A good philosophy is: keep the simple things easy, make the hard things possible.

Question 19: Recent iPhone models introduced a macro mode which automatically switches between lenses to account for the focal distance difference. Is there an official API to implement this, or should I implement it myself using LiDAR values?
- Using builtInTripleCamera and builtInDualWideCamera will automatically switch to macro when available.

Question 20: A couple of years ago at WWDC, the option of replacing a camera with a virtual camera was mentioned. How does one do that - make the "physical" camera effectively disappear, so only the virtual camera is accessible to the user?
- You can't prevent the built-in camera from being available to the user.

Question 21: Can developers now integrate custom Core ML models with Vision for on-device photo analysis more seamlessly?
- Yes they can: use CoreMLRequest and provide their model container.
- This has been supported for a while (iOS 18/macOS 15).
- For more details, go to the Machine Learning & AI group lab on Thursday.
- Use smaller images for better performance.

Question 22: What would you recommend for capture of the new immersive and spatial formats?
- To capture Spatial Video, use AVCaptureMovieFileOutput's spatialVideoCaptureEnabled property.
- Not all device formats support spatial capture; check AVCaptureDevice.activeFormat.spatialVideoCaptureSupported.
- See the WWDC 2024 talk "Build compelling spatial photo and video experiences" for more details.

Question 23: You mentioned JPEG-XL. What is the current status of support on iOS and macOS for encoding and decoding?
- For decoding, we support JPEG-XL files in all our OSes - regular SDR files as well as ISO HDR files.
- For encoding, we only support JPEG-XL for ProRAW DNG capture in the Camera app or via third-party AVFoundation APIs.
- If you have any requests for improvement or new features related to JPEG-XL, please file a Feedback request using Feedback Assistant.

(Note: this is part 2 of a 3-part posting. See Part 1 or Part 3.)
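A minimal sketch of the Question 10 pattern described above: run Vision on video data output buffers off the delegate queue and trigger a still capture when text is detected. The class name is illustrative, `photoOutput` and `photoDelegate` are assumed to be configured elsewhere in your capture pipeline, and the state flags are kept deliberately simple.

```
import AVFoundation
import Vision

// Sketch of an auto-capture trigger driven by Vision text detection.
final class TextTriggeredCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let photoOutput: AVCapturePhotoOutput
    private let photoDelegate: AVCapturePhotoCaptureDelegate
    private let analysisQueue = DispatchQueue(label: "capture.vision.analysis")
    // Simplified flags; production code should synchronize cross-queue access.
    private var isAnalyzing = false
    private var hasTriggered = false

    init(photoOutput: AVCapturePhotoOutput, photoDelegate: AVCapturePhotoCaptureDelegate) {
        self.photoOutput = photoOutput
        self.photoDelegate = photoDelegate
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Skip frames while a previous analysis is still running so the
        // video data output never waits on Vision (avoids frame drops).
        guard !isAnalyzing, !hasTriggered,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        isAnalyzing = true

        analysisQueue.async { [weak self] in
            defer { self?.isAnalyzing = false }
            let request = VNRecognizeTextRequest()
            request.recognitionLevel = .fast
            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right)
            try? handler.perform([request])

            guard let results = request.results, !results.isEmpty,
                  self?.hasTriggered == false else { return }
            self?.hasTriggered = true
            DispatchQueue.main.async {
                guard let self else { return }
                // Text was detected in the scene: trigger the still capture.
                self.photoOutput.capturePhoto(with: AVCapturePhotoSettings(),
                                              delegate: self.photoDelegate)
            }
        }
    }
}
```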
0
0
258
Jul ’25
WWDC25 Camera & Photos group lab summary (Part 3 of 3)
(Note: this is part 3 of a 3-part posting. See Part 1 or Part 2.)

At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Camera & Photos, which ran for one hour at 6 PM PST on Tuesday, June 10th, 2025.

Question 24: What's the best approach for optimizing barcode scanning using AVFoundation or Vision in low-light or angled scenarios?
- Turn on the flash in low-light scenarios.
- Lower the frame rate to improve exposure and reduce noise.
- Wait until the capture is in focus, and notify your user that they need to get closer. (A sketch of the first two adjustments follows after this summary.)

Question 25: Recent iPhone models introduced a macro mode which automatically switches between lenses to account for the focal distance difference. Is there an official API to implement this, or should I implement it myself using LiDAR values?
- Using builtInTripleCamera and builtInDualWideCamera will automatically switch to macro when available.

Question 26: Is there a way to quickly create a thumbnail after the user selects an image with PhotosPicker?
- Use the File Provider API.

Additional questions from the WWDC25 in-person labs that occurred later in the WWDC week

Question 1: When should I build my custom photo picker instead of using the system one?
- Always start with the system picker, then try the embeddable customization APIs, and fall back to a custom picker only for very special needs.

Question 2: I'm building a new camera app for pros and I want to give my users the most un-processed image possible, and the most control over the capture as possible. How can I do that with AVCapture?
- For stills: Bayer RAW capture (a brief overview was given), or ProRAW if you want Apple's processing and dynamic range.
- For video: ProRes Log.
- Custom exposure settings are available through the APIs.
- Possibly also a global/local tone mapping discussion.
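A minimal sketch of the Question 24 low-light adjustments named above (torch on, lower frame rate), assuming `device` is the active AVCaptureDevice and the session is already configured; the 15 fps target is an illustrative choice, not a recommendation from the lab.

```
import AVFoundation

// Sketch: configure the capture device for better exposure in low light.
func configureForLowLightScanning(device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Turn on the torch so the barcode is illuminated.
    if device.hasTorch, device.isTorchModeSupported(.on) {
        device.torchMode = .on
    }

    // Lower the frame rate so each frame gets a longer exposure and less noise.
    if let range = device.activeFormat.videoSupportedFrameRateRanges.first {
        let fps = min(max(15.0, range.minFrameRate), range.maxFrameRate)
        let duration = CMTime(value: 1, timescale: CMTimeScale(fps))
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration
    }
}
```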
0
0
288
Jul ’25
AVCaptureVideoPreviewLayer layoutSublayers invoked on background thread
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here. We've been noticing some crashes reported in App Store Connect caused by layoutSublayers being called on a background thread. After debugging the issue a bit, we found that all calls which modified the AVCaptureSession or the preview layer were indeed made on the main thread. It would be useful to see what causes AVCaptureVideoPreviewLayer.updateFormatDescription to be called. I've attached the crash log below. Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
1
1
816
Jun ’20
iOS 14: App crashes when calling presentLimitedLibraryPickerFromViewController
iPhone 7: iOS 14.0 Beta 5, Xcode beta; macOS 10.15.5 (19F101)

Crash info: *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:]: unrecognized selector sent to instance xxxxxx' terminating with uncaught exception of type NSException

My code:

    - (void)viewDidAppear:(BOOL)animated {
      [super viewDidAppear:animated];
      if (@available(iOS 14, *)) {
        [[PHPhotoLibrary sharedPhotoLibrary] presentLimitedLibraryPickerFromViewController:self];
      }
    }
9
0
4.3k
Aug ’20
PHPicker fails to load RAW images
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios, and it also seems to be related to iCloud. Here is the setup: The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding. The image is loaded from the item provider like this:

    if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage as String) { url, error in
            // work
        }
    }

This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:

[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}

Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}

We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail. We think it is related to iCloud Photos: on the device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process is doing some processing (?) of the image that will later enable the picker to load it successfully, but that's just guessing.

Additional observations:
- This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the picker is able to load the image.
- When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.
- The picker also can't load RAW images that were AirDropped from another device, unless they synced to iCloud first.

This is reproducible using the Selecting Photos and Videos in iOS sample code project. We observed this happening in other apps that use the PHPicker, not just ours. Is this a bug, or is there something that we are missing?
7
1
2.8k
Mar ’23
iPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g. using guvcview) allows enabling automatic exposure for the camera. The iOS API isExposureModeSupported unfortunately returns false for all of the exposure modes. Is this a bug? Or perhaps AVFoundation doesn't support UVC exposure yet?
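For context, a minimal sketch of the check the post describes, assuming the external camera was obtained via an AVCaptureDevice.DiscoverySession using the .external device type (available on iPadOS 17). This only illustrates the API call that is reported to return false; it does not resolve the question.

```
import AVFoundation

// Sketch: enable continuous auto exposure only if the device reports support.
func enableAutoExposureIfPossible(on device: AVCaptureDevice) {
    guard device.isExposureModeSupported(.continuousAutoExposure) else {
        print("Auto exposure not supported by \(device.localizedName)")
        return
    }
    do {
        try device.lockForConfiguration()
        device.exposureMode = .continuousAutoExposure
        device.unlockForConfiguration()
    } catch {
        print("Could not lock configuration: \(error)")
    }
}
```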
1
2
638
Oct ’23
I need a way to permanently disable Reactions from my app, ideally the universe too
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out and I find myself consuming 40% CPU for the exact same workload. So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps - the CPU consumption stays at 40%. "Reactions" is nothing but a useless toy to please some WWDC keynote fanboys; I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery. Now, how do I make it go away, forever? Best regards, Jacob
10
6
1.4k
Dec ’23
Generating Live Photo from JPG and MOV fails
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file to a Live Photo. I am utilizing the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to download the Live Photo to the gallery and generate the Live Photo. Here is the relevant code and the errors I am encountering.

Console prints:
Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo

I have verified that the app has the necessary permissions to access the Photo Library. The JPEG and MOV files are successfully downloaded and can be displayed in the app. The issue seems to occur when generating the Live Photo from the downloaded files.

    struct WallpaperDetailView: View {
        var wallpaper: Wallpaper

        @State private var isLoading = false
        @State private var isImageSaved = false
        @State private var imageURL: URL?
        @State private var livePhotoVideoURL: URL?
        @State private var player: AVPlayer?
        @State private var playerViewController: AVPlayerViewController?
        @State private var isVideoReady = false
        @State private var showBuffering = false

        var body: some View {
            ZStack {
                if let imageURL = imageURL {
                    GeometryReader { geometry in
                        KFImage(imageURL)
                            .resizable()
                            ...
                    }
                }
                if let playerViewController = playerViewController {
                    VideoPlayerViewController(playerViewController: playerViewController)
                        .frame(maxWidth: .infinity, maxHeight: .infinity)
                        .clipped()
                        .edgesIgnoringSafeArea(.all)
                }
            }
            .onAppear {
                PHPhotoLibrary.requestAuthorization { status in
                    if status == .authorized {
                        loadImage()
                    } else {
                        print("User denied access to photo library")
                    }
                }
            }
        }

        private func loadImage() {
            isLoading = true
            if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
                self.imageURL = imageURL
                if imageURL.scheme == "file" {
                    self.isLoading = false
                    print("Local image URL set: \(imageURL)")
                } else {
                    fetchDownloadURL(from: imageURLString) { url in
                        self.imageURL = url
                        self.isLoading = false
                        print("Image URL fetched and set: \(String(describing: url))")
                    }
                }
            }
            if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
                self.livePhotoVideoURL = livePhotoVideoURL
                preloadAndPlayVideo(from: livePhotoVideoURL)
            } else {
                self.isLoading = false
                print("No valid image or video URL")
            }
        }

        private func preloadAndPlayVideo(from url: URL) {
            self.player = AVPlayer(url: url)
            let playerViewController = AVPlayerViewController()
            playerViewController.player = self.player
            self.playerViewController = playerViewController
            let playerItem = AVPlayerItem(url: url)
            playerItem.preferredForwardBufferDuration = 1.0
            self.player?.replaceCurrentItem(with: playerItem)
            ...
            print("Live Photo Video URL set: \(url)")
        }

        private func saveWallpaperToPhotos() {
            if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
                saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
            } else if let imageURL = imageURL {
                saveImageToPhotos(url: imageURL)
            }
        }

        private func saveImageToPhotos(url: URL) {
            ...
        }

        private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
            isLoading = true
            downloadVideo(from: videoURL) { localVideoURL in
                guard let localVideoURL = localVideoURL else {
                    print("Failed to download video for Live Photo")
                    DispatchQueue.main.async { self.isLoading = false }
                    return
                }
                print("Video downloaded to: \(localVideoURL)")
                self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
            }
        }

        private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
            LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
                print("Progress: \(percent)")
            }, completion: { livePhoto, resources in
                guard let resources = resources else {
                    print("Failed to generate Live Photo")
                    DispatchQueue.main.async { self.isLoading = false }
                    return
                }
                print("Live Photo generated with resources: \(resources)")
                self.saveLivePhotoToLibrary(resources: resources)
            })
        }

        private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
            LivePhoto.saveToLibrary(resources) { success in
                DispatchQueue.main.async {
                    if success {
                        self.isImageSaved = true
                        print("Live Photo saved successfully")
                    } else {
                        print("Failed to save Live Photo")
                    }
                    self.isLoading = false
                }
            }
        }

        private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
            let storageRef = Storage.storage().reference(forURL: gsURL)
            storageRef.downloadURL { url, error in
                if let error = error {
                    print("Failed to fetch image URL: \(error)")
                    completion(nil)
                } else {
                    completion(url)
                }
            }
        }

        private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
            let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
                guard let localURL = localURL, error == nil else {
                    print("Failed to download video: \(String(describing: error))")
                    completion(nil)
                    return
                }
                completion(localURL)
            }
            task.resume()
        }
    }
1
1
832
Jul ’24
dyld[434]: Library not loaded: error when running LockedCameraCapture compatible app on iOS 15
Hello, I am getting the following error while attempting to run my LockedCameraCapture compatible app on an iOS 15 device:

dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)

Of course iOS 15 doesn't have the library for LockedCameraCapture, but I have had no issue including Lock Screen Widgets (which require iOS 16), so I am not sure why the error is popping up. Thank you!
2
0
513
Oct ’24
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'.
I have an app that edits photos in your library. When I call

try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)

the following is logged to the console:

writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.

What does that mean and how can I resolve it? Xcode Version 16.0 (16A242d), iOS 18.1 (22B82)
5
9
2.9k
Oct ’24
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}} I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically, not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
1
0
589
Nov ’24
Macro-mode in AVCaptureDevice(custom camera)
Hi, I would like to use macro mode for the custom camera using AVCaptureDevice in my project. This feature might help to automatically adjust and switch between lenses to get a close-up, clear image. It looks like this feature is not exposed directly and there are no public APIs from Apple to achieve macro mode. Is there a way to get this functionality in a custom camera without losing image quality? Please let me know if this is possible. Thank you, Adil Thamarasseri
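For reference, the WWDC25 Camera & Photos group lab summary earlier on this page notes that the builtInTripleCamera and builtInDualWideCamera virtual devices switch to macro automatically. A minimal sketch of selecting such a device follows; the fallback ordering is an assumption for illustration.

```
import AVFoundation

// Sketch: prefer a virtual back camera so the system can switch to the
// macro-capable lens automatically when the subject gets close.
func makeMacroCapableBackCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInTripleCamera, .builtInDualWideCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    // Devices are returned in the order of the deviceTypes list above,
    // so the first match is the most capable available option.
    return discovery.devices.first
}
```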
7
1
1.3k
Jan ’25
LockedCameraCapture with ARKit based App
Hello, I'm building a camera app around ARKit. I've created a Lockscreen Capture Extension and added a control to initiate my camera app, but when I launch the extension I see just a black screen with no hints at any errors. Also, attaching the debugger to the running process shows no logs. I'm wondering: is LockedCameraCapture supported with ARView and ARSession? ARKit was featured in a WWDC video with a camera app use case, and the introduction of captureHighResolutionFrame(completion:) made me pick it up as an interesting camera app backbone - but if lock screen capture is not possible with it, I have to refactor my codebase.
1
0
522
Jan ’25
VNDocumentCameraViewController not working on macOS
I have an iPad app that I want to run on Apple Silicon Macs. Everything works fine except for VNDocumentCameraViewController. According to the docs, this class is available on iOS 13.0+, iPadOS 13.0+, Mac Catalyst 13.1+, and visionOS 1.0+, yet when I try using it I get "Document camera is not available" on my Mac Studio running macOS 15.2. Is this expected behaviour? Thanks
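A minimal sketch of guarding the scanner behind VNDocumentCameraViewController.isSupported, which is the runtime availability check VisionKit provides; the presentation wiring and function name here are illustrative assumptions.

```
import VisionKit
import UIKit

// Sketch: only present the document camera where the device supports it
// (the post above suggests it is unsupported on this Mac Catalyst setup).
func presentDocumentScanner(from presenter: UIViewController,
                            delegate: VNDocumentCameraViewControllerDelegate) {
    guard VNDocumentCameraViewController.isSupported else {
        print("Document camera is not available on this device.")
        return
    }
    let scanner = VNDocumentCameraViewController()
    scanner.delegate = delegate
    presenter.present(scanner, animated: true)
}
```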
0
0
386
Jan ’25
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting the maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

    // Objective-C
    // Setup
    [self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
    CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
    [self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

    // Capturing
    AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
    photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
    photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
    [self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
    ...
2
0
677
Jan ’25
Camera USB
I am new to using Swift for a Mac application. I am trying to control an external UVC-compliant camera's focus and other capabilities. However, I'm having trouble with this and don't know where to start. I have downloaded an application from the App Store and it can control the focus and other capabilities. I've tried IOKit, but this seems complicated and does not return any capabilities or control the camera. I also tried AVFoundation and was able to open the camera, but the following code did not work for me: device.isFocusPointOfInterestSupported returns false, and without that check the app crashes.

    @IBAction func focusChanged(_ sender: NSSlider) {
        do {
            guard let device = videoDevice else { return }
            try device.lockForConfiguration()

            // Check if focus mode and point of interest are supported
            if device.isFocusModeSupported(.locked) {
                device.focusMode = .locked
            }
            if device.isFocusPointOfInterestSupported {
                // Map the slider value (0.0 to 1.0) to the focus point's X coordinate
                let focusX = CGFloat(sender.doubleValue)
                let focusPoint = CGPoint(x: focusX, y: 0.5) // Y coordinate is typically 0.5 (centered vertically)
                device.focusPointOfInterest = focusPoint
            } else {
                print("Focus point of interest is not supported on this device.")
            }
            device.unlockForConfiguration()

            // Log focus settings
            print("Focus point: \(device.focusPointOfInterest)")
            print("Focus mode: \(device.focusMode.rawValue)")
        } catch {
            print("Error adjusting focus: \(error)")
        }
    }

Any help or advice is much appreciated.
0
0
493
Jan ’25
Best `AVMediaType` for depth data.
Dear Apple Developer Forum,

I have a question regarding the AVCaptureDevice on iOS. We're trying to capture photos in the best quality possible along with depth data with the highest accuracy possible. We were delighted when we saw AVCaptureDevice could be initialized with the AVMediaType=.depthData which works as expected (depthData is a part of the AVCapturePhoto). When setting to AVMediaType=.video, we still receive depth data (of same quality according to our own internal tests). That confused us. Mind you, we set the device format and depth format as well:

    private func getDeviceFormat() throws -> AVCaptureDevice.Format {
        // Ensures high video format and an appropriate color profile.
        let format = camera?.formats.first(where: {
            $0.isHighPhotoQualitySupported &&
            $0.supportedDepthDataFormats.count > 0 &&
            $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        })

        // Check and see if it's available.
        guard format != nil else {
            throw CaptureDeviceError.necessaryFormatNotAvailable
        }

        return format!
    }

    private func getDepthDataFormat(for format: AVCaptureDevice.Format) throws -> AVCaptureDevice.Format {
        // Access the depth format.
        let depthDataFormat = format.supportedDepthDataFormats.first(where: {
            $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_DepthFloat32
        })

        // Check if it exists
        guard depthDataFormat != nil else {
            throw CaptureDeviceError.necessaryFormatNotAvailable
        }

        // Returns it.
        return depthDataFormat!
    }

We're wondering, what steps we can take to ensure the best quality photo, along with the most accurate depth data? What properties are the most important, which have an effect, which don't? Are there any ways we can optimize our current configuration? We find it difficult as there's very limited guides and explanations on the media subtypes, for example kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. Is it the best? Is it the best for our use case of high quality photo + most accurate depth data?

Important comment: Our App only runs on iPhone 14 Pro, iPhone 15 Pro, iPhone 16 Pro on the latest iOS versions.

We hope someone with greater knowledge at Apple can help us and guide us on how we can have the photos of best quality and depth data with most accuracy. Thank you very much!

Kind regards.
0
0
411
Jan ’25
photos-navigation://album scheme
In my app SexyPeri (https://apps.apple.com/fr/app/id6738291985), I create an album with some pics. The album is called SexyPeri. Now, I wish to redirect the user from my app SexyPeri DIRECTLY to the album SexyPeri. There is no documentation about your photos-navigation scheme, or I didn't see it. Some people have reverse-engineered it, but I couldn't make it work: photos-navigation://album?name=SexyPeri doesn't work. So my question is: how can I redirect to the album directly?
1
0
407
Jan ’25