Hey, I'm building a portrait mode into my camera app, but I'm having trouble matching the quality of Apple's native camera implementation. I'm streaming the depth data and applying a CIMaskedVariableBlur to the video stream, which works quite well, but the definition of the object in focus looks quite bad in some scenarios. See the comparison below with Apple's UI + depth data.
What I don't quite understand is how Apple is able to do such a good cutout around my hand, assuming it has similar depth data to what I'm receiving. You can see in the depth image that my hand is essentially the same colour as parts of the background, and this shows in the blur preview, but Apple gets around this.
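For reference, the blur stage itself is essentially the following (a minimal sketch; depthMask stands in for the grayscale mask I derive from the depth map, and the radius value is arbitrary):

import CoreImage

// Sketch: blur the frame wherever the depth-derived mask is white.
func applyPortraitBlur(to frame: CIImage, depthMask: CIImage, radius: Float = 12) -> CIImage? {
    guard let filter = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    filter.setValue(frame, forKey: kCIInputImageKey)
    filter.setValue(depthMask, forKey: "inputMask")
    filter.setValue(radius, forKey: kCIInputRadiusKey)
    // The blur expands the image extent, so crop back to the original.
    return filter.outputImage?.cropped(to: frame.extent)
}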
Does anyone have any ideas?
Thanks!
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
I updated to the iOS 18 developer beta yesterday, and since then all of my 4K photos display as 1080p when I view them in the Photos app. I need help; this is very annoying. I've spent multiple hours trying to figure it out.
I have downloaded the iOS 18 beta update, but the Clean Up option for photos is not present.
How can I programmatically open the camera in spatial mode on iOS for capturing spatial videos? Is there an API for opening the camera in spatial mode?
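For reference, a minimal sketch of the route I'd expect to work, assuming the spatial video capture flags on AVCaptureMovieFileOutput (added in recent iOS releases) and a device format that supports them:

import AVFoundation

// Sketch: configure a capture session for spatial (MV-HEVC) video recording.
func makeSpatialSession() throws -> (AVCaptureSession, AVCaptureMovieFileOutput)? {
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) else {
        return nil
    }
    let session = AVCaptureSession()
    let output = AVCaptureMovieFileOutput()
    session.beginConfiguration()
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(output) { session.addOutput(output) }
    if output.isSpatialVideoCaptureSupported {
        output.isSpatialVideoCaptureEnabled = true // opt in to spatial recording
    }
    session.commitConfiguration()
    return (session, output)
}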
Hello everyone, with the release of Apple's new Final Cut Camera app, we see the possibility of overlaying a Focus Peaking indicator on the camera feed, showing focused areas.
We have already had a contrast based autofocus system for some time via the AVCaptureDevice.Format.AutoFocusSystem.contrastDetection, but I haven't found a way to actually present contrast areas to the user.
Given that Apple now natively has such an algorithm for the Final Cut Camera App, I wonder if we devs now also get access to this. If not, does anybody know of implementations of focus peaking out there?
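In the meantime, the closest approximation I can think of is rolling your own overlay from an edge detector; a rough sketch using Core Image's built-in CIEdges filter (the intensity value is an arbitrary starting point, and this is not Apple's algorithm):

import CoreImage

// Sketch: highlight high-contrast (likely in-focus) areas of a frame.
func peakingOverlay(for frame: CIImage) -> CIImage {
    // Edge map: bright where local contrast is high, black elsewhere.
    let edges = frame.applyingFilter("CIEdges", parameters: [kCIInputIntensityKey: 4.0])
    // Lighten-composite the edge map over the original so it reads as peaking.
    return edges.applyingFilter("CIMaximumCompositing", parameters: [kCIInputBackgroundImageKey: frame])
}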
Thanks and with best regards
Well, I collect a lot of memes from the Internet and save them on my iPhone, naming and classifying them. But when I tap a photo in "All Photos", its info does not show which album I added it to, which is very frustrating. If I had this function, I could easily manage the memes that I did not correctly add to the corresponding album.
I've had an app that edits photos in your library since the PhotoKit API was released in iOS 8. I know that if you preserve photo metadata, you were required to change the value of Orientation to 1 (up); otherwise PhotoKit would fail to perform the asset change request. When I remove this code, I'm seeing that Orientation is changed to 1 automatically, both at the root and in the TIFF dictionary (tested with iOS 18). I wanted to confirm this is expected behavior: does the system do this for us now? If so, can I remove this code for iOS 15+, or did this behavior start in a more recent iOS version? Thanks!
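For context, the code in question amounts to this (a sketch of the long-standing workaround; the ImageIO keys are the real constants):

import ImageIO

// Sketch: reset Orientation to 1 (up) at the root and in the TIFF dictionary
// before committing the PhotoKit change request.
func uprightMetadata(_ properties: [CFString: Any]) -> [CFString: Any] {
    var props = properties
    props[kCGImagePropertyOrientation] = 1
    if var tiff = props[kCGImagePropertyTIFFDictionary] as? [CFString: Any] {
        tiff[kCGImagePropertyTIFFOrientation] = 1
        props[kCGImagePropertyTIFFDictionary] = tiff
    }
    return props
}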
Is it possible to sort the user library assets by date captured? The Photos app in iOS 18 lets you choose between Date Captured and Recently Added, and I want to offer that same choice in my app. The following seems to always sort them by creation date (which I believe is the same as Recently Added):
let assetCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil).firstObject!
let fetchResult = PHAsset.fetchAssets(in: assetCollection, options: PHFetchOptions.imageMediaType())
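For reference, here's the same fetch with an explicit sort descriptor (a sketch; note that PHFetchOptions.imageMediaType() above is my own helper, not a PhotoKit API, so the predicate is spelled out here):

let options = PHFetchOptions()
// creationDate is a valid PHFetchOptions sort key.
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
options.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
let byCreationDate = PHAsset.fetchAssets(in: assetCollection, options: options)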
I generated an asset in the photo library by adding the unedited image, adjustmentData, and edited image with PHAssetCreationRequest.addResource(). The image is saved in the photo library as HEIF (verified in the photo library).
Then, when I save the image generated with PHAssetCreationRequest.addResource() to the Files app, it becomes JPEG.
On the other hand, if I edit an image taken with the camera and save it to the Files app, it remains HEIF.
Do you know why this happens?
Also, how can I keep it as HEIF even when saving it to the Files app?
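One possible route (a sketch, not something I've confirmed): write the stored resource bytes out directly with PHAssetResourceManager, which should avoid any re-encode on the way to the Files app:

import Photos

// Sketch: copy an asset's stored bytes (HEIF stays HEIF) to a destination URL.
func exportOriginalData(of asset: PHAsset, to url: URL, completion: @escaping (Error?) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    // Prefer the edited rendition if present, otherwise the original photo.
    guard let resource = resources.first(where: { $0.type == .fullSizePhoto })
            ?? resources.first(where: { $0.type == .photo }) else {
        completion(nil)
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true
    PHAssetResourceManager.default().writeData(for: resource, toFile: url, options: options, completionHandler: completion)
}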
Thanks.
This issue occurs with very high probability when retrieving a newly taken Live Photo.
It leads to a failure in determining the Live Photo type. How can this issue be resolved?
When importing photos, I have on several occasions selected Delete instead of Import. My hand just hovers above the two options, which sit too close together, and the pictures are instantly deleted from the card. Can you please separate the two?
Hello,
I tried to build the AVCam sample application for iOS 17 and run it on a MacBook (Designed for iPad) with macOS 14.3 (Sonoma).
https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app?language=objc
When building and testing with Xcode 15.2, the AVCam application crashes systematically when choosing the target "My Mac (Designed for iPad)".
In fact, a SIGABRT signal is received in a thread dealing with the "portrait effect":
Thread 19 Queue : com.apple.portrait.effect_init (serial)
Is this a known bug? Is there a workaround for this case?
Best regards
An external webcam is detected by AVCam, but the preview and capture are systematically upside down (possibly the same for FaceTime HD cameras).
Is this a known bug? Is there a workaround for this case?
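The only workaround that comes to mind is forcing the rotation on the connection; a sketch using the iOS 17 rotation API (previewLayer is assumed to be the AVCaptureVideoPreviewLayer, and 180 is a guess for this camera):

// Sketch: flip the preview/capture connection by 180 degrees (iOS 17+).
if let connection = previewLayer.connection, connection.isVideoRotationAngleSupported(180) {
    connection.videoRotationAngle = 180
}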
IIRC, the new Photos highlight videos pick your best photos using Apple Intelligence.
Is there an API or metadata available to use AI and get the best photo out of e.g. 5 similar photos?
From what I understood from watching the LockedCameraCapture session, the extension doesn't have access to the App Group user defaults, so I'm wondering how I can synchronize preferences between the extension and the main app.
From what I can see, the built-in iOS Camera app is able to synchronize its preferences. For example, the "Live Photo" mode toggle state is preserved whether in the main app or on the lock screen.
Is there anything I'm not aware of?
Hello,
When I use iOS 17 to save videos downloaded from the network to my album, it fails with Domain=PHPhotosErrorDomain Code=3302. Searching the official documentation, I found that 3302 means "An error that indicates the asset resource validation failures," but the specific reason is unknown. Is there an article that explains this?
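For reference, the save path in question is the standard one (a sketch; fileURL is assumed to point at the fully downloaded local file, which is presumably what the validation inspects):

import Photos

// Sketch: save a downloaded video file into the photo library.
PHPhotoLibrary.shared().performChanges({
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
}) { success, error in
    if let error { print("Save failed: \(error)") } // Code=3302 surfaces here
}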
The app crashes when creating a new album. This crash did not occur in our own testing, but after publishing to the App Store, the probability of occurrence seems to be very high.
When I call requestAVAssetForVideo to retrieve a video for upload, the system appends a string of unknown characters to the returned path, like this:
/var/mobile/Media/DCIM/101APPLE/IMG_1034.MOV#YnBsaXN0MDDRAQJfEBtSZxxxx1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr
PS: I encountered a similar issue before when retrieving spatial videos on systems below iOS 18.
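A workaround sketch that sidesteps the returned path entirely: ask PhotoKit for an export session and write to a clean temporary URL (asset is the PHAsset in question; the passthrough preset and .mov output are assumptions):

import Photos
import AVFoundation

// Sketch: export the video to a known-clean URL instead of using the raw path.
PHImageManager.default().requestExportSession(forVideo: asset,
                                              options: nil,
                                              exportPreset: AVAssetExportPresetPassthrough) { session, _ in
    guard let session else { return }
    session.outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("upload.mov")
    session.outputFileType = .mov
    session.exportAsynchronously {
        // Upload from session.outputURL here.
    }
}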
I have an app that uses ImageCaptureCore's ICDeviceBrowser to find and connect to external digital cameras. Prior to iOS 18 this worked just fine: the device browser would start up and find any cameras connected via USB. Since the update, however, the device browser never detects any connected device or triggers any delegate events at all after the browser starts.
I noticed that the Contents authorization in iOS 18 is undetermined, whereas in previous iOS versions it would default to authorized. I tried to resolve this by requesting authorization; however, this immediately returns denied without ever prompting the app user for permission. I do have the Camera privacy usage description set up, and I am also able to request permission for the iOS camera successfully.
How can I successfully request Contents authorization via ImageCaptureCore or otherwise?
Or are there alternative Apple libraries I can use for finding and connecting to external digital cameras on iOS?
I got a slow-motion video asset from the camera roll, but I can't get the URL from that asset.
Do you know how to get the URL?
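My understanding (an assumption worth verifying) is that slow-motion assets come back as an AVComposition rather than an AVURLAsset, so there is no single file URL; a sketch of telling the two cases apart (phAsset is the asset from the camera roll):

import Photos
import AVFoundation

// Sketch: request the AVAsset and check whether a file URL even exists.
let options = PHVideoRequestOptions()
options.isNetworkAccessAllowed = true
PHImageManager.default().requestAVAsset(forVideo: phAsset, options: options) { avAsset, _, _ in
    if let urlAsset = avAsset as? AVURLAsset {
        print(urlAsset.url)                // plain videos expose a direct URL
    } else if avAsset is AVComposition {
        // Slow-mo: no URL; export with AVAssetExportSession to get a file.
    }
}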
I am trying to use the AVCamFilter Apple sample project discussed in this WWDC session to get depth data using the dual camera. The project has built-in features to get depth data from the dual camera.
When the sample project was written, builtInDualWideCamera didn't exist yet, and the project only tries to get builtInDualCamera and builtInWideAngleCamera. When I run the project on my iPad Pro, it doesn't show any of the depth-related UI because the device doesn't have a builtInDualCamera device. So I added builtInDualWideCamera to the videoDeviceDiscoverySession, and it seems to get that device properly, but isDepthDataDeliverySupported still returns false.
Is there some reason why isDepthDataDeliverySupported is false even though I seem to be using a dual camera device?
I know the device has a builtInLiDARDepthCamera, but I wanted to try out the dual-camera depth data to see how it performs at shorter distances. I wouldn't have expected dual-camera depth data delivery to be unavailable on the device just because the LiDAR sensor is also available.
Using iPadOS 17.5.1, iPad Pro 11-inch 4th generation.
The depth feature of this sample app works fine on an iPhone 15 I tested. I also tried it on an iPhone 15 Pro and it worked even though that device also has a LiDAR sensor, so the issue is presumably not related to the fact that the iPad Pro has a LiDAR sensor.
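In case it helps others debug the same thing: depth delivery depends on the active format, so a diagnostic sketch worth running is checking whether the dual-wide camera reports any depth-capable formats at all:

import AVFoundation

// Sketch: list the dual-wide camera's formats that support depth delivery.
if let device = AVCaptureDevice.default(.builtInDualWideCamera, for: .video, position: .back) {
    for format in device.formats where !format.supportedDepthDataFormats.isEmpty {
        print(format, format.supportedDepthDataFormats)
    }
}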