TrueDepth camera on iPhone 13 products - lower accuracy of depth data?

Hi,

I've just been experimenting with the Apple demo app for TrueDepth depth capture on different devices and noticed significant differences in the quality of the provided data.

Data captured on iPhones before the iPhone 13 lineup shows quite smooth surfaces - as you may know from the many 3D scanner apps that display data from the front-facing TrueDepth camera.

Data captured on, e.g., an iPhone 13 Pro now shows some kind of wavy overlaid structure and - visually at least - much lower accuracy compared to older iPhones.

iPhone 12 Pro: data as point cloud, object about 25 cm from the phone:

iPhone 13 Pro: data as point cloud, same setup

I tried it on several different iPhone 13 devices with the same result, all running the latest iOS, and with images captured using the same code. Capturing with some of the standard 3D scanner apps for the TrueDepth camera produces similarly lower-quality images and point clouds.
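For reference, the capture path is essentially the standard AVFoundation depth-streaming setup, roughly like the sketch below (the function name and details are mine, error handling is trimmed, and Apple's sample additionally selects a depth-capable activeDepthDataFormat on the device):

import AVFoundation

// Minimal TrueDepth capture setup, close to what Apple's demo app does.
// `setupTrueDepthSession` is an illustrative name; error handling is trimmed.
func setupTrueDepthSession(delegate: AVCaptureDepthDataOutputDelegate,
                           queue: DispatchQueue) -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .vga640x480

    // Front-facing TrueDepth camera as the input
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    // Depth data output; depth frames arrive in the delegate callback
    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)
    depthOutput.setDelegate(delegate, callbackQueue: queue)

    session.commitConfiguration()
    return session
}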

Is this due to downgraded hardware (the smaller TrueDepth camera module) in the new iPhone release, or a software issue within iOS, e.g. in the driver for the TrueDepth camera?


Are there any improvements or solutions foreseen or already announced by Apple?


Best

Holger


Replies

I can confirm that we have seen exactly the same behavior.

Furthermore, beyond the noisy depth data returned (likely due to switching to a ToF sensor, whose output then appears to be upsampled, which might explain the "noise"), there seems to be a very concerning bug in how depth data is handled on one side of the face. On all previous devices with the older TrueDepth module (prior to iPhone 13), there is clear separation of the z-depth data. However, on iPhone 13 variants there is "blending" of the z-depth along the edge of one side of the face. This is a huge red flag (tested on iOS 15.x). The blending is apparent if you run Apple's unaltered sample code on any iPhone 13 variant (iPhone 13, 13 Pro, 13 Pro Max, 13 mini): https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/streaming_depth_data_from_the_truedepth_camera
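To make the edge behavior concrete: one rough way (not part of Apple's sample; the function name is mine) is to dump a single row of depth values across the face edge inside the depth delegate callback and look for intermediate "blended" values between foreground and background:

import AVFoundation
import CoreVideo

// Dump one row of depth values (in meters) so the transition across an object edge
// can be inspected. A clean edge shows an abrupt jump between neighboring pixels;
// gradual in-between values indicate the "blending" described above.
// `dumpDepthRow` is an illustrative name, not part of Apple's sample.
func dumpDepthRow(_ depthData: AVDepthData, row: Int) {
    // Convert to 32-bit float depth regardless of the native (disparity) format
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let buffer = converted.depthDataMap

    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    guard row >= 0, row < height,
          let base = CVPixelBufferGetBaseAddress(buffer) else { return }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let rowPtr = base.advanced(by: row * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)

    print((0..<width).map { rowPtr[$0] })
}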

Enabling filtering/smoothing of the data does NOT fix this concerning bug. The filtering is temporal, so while it does help reduce the "bumpy" noise returned from the sensor, the incorrect z-depth on one side of the face (or of any scanned object) is not fixed. Temporal smoothing amounts to averaging across multiple depth buffers (it's nice that this is done for you), but I want clean data in each individual depth frame.
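For reference, the built-in smoothing mentioned above is the filtering flag on the depth output; a minimal sketch of toggling it:

import AVFoundation

// Apple's built-in smoothing is the filtering flag on the depth output.
// true  -> temporally smoothed depth (less bumpy, holes filled across frames)
// false -> raw, unfiltered depth per frame
// Neither setting repairs a systematic z-offset along one side of an edge.
func setTemporalFiltering(_ output: AVCaptureDepthDataOutput, enabled: Bool) {
    output.isFilteringEnabled = enabled
}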

Huge red-flag bug in the depth data returned from iPhone 13 devices (aside from the noisy/bumpy/inaccurate depth). Highlighting, in the previously posted image, the blending issue I mentioned in the depth data returned by any iPhone 13 variant.

  • Thank you so much for posting this information! I am very interested in seeing this comparison with the iPhone 15 since I am deciding which device to buy for this specific purpose alone.

    Thanks in advance.

    PS - I would also be interested in knowing how to generate these kinds of tests.


I'm not an Apple engineer or anything, but I'm pretty sure the issue is caused by how close together the sensors are now. If you think about it, the dot projector and IR camera had to be squeezed closer together, which would decrease the parallax between the dots and the camera. So I think it's a hardware fault rather than software, unfortunately.
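To illustrate the baseline argument with a back-of-the-envelope triangulation model (all numbers below are made-up placeholders, not real TrueDepth specs): depth from structured light / stereo is roughly z = f * b / d, so the depth error scales as z^2 * delta_d / (f * b), i.e. halving the projector-to-camera baseline b roughly doubles the error at the same distance.

// Back-of-the-envelope model only: z = f * b / d, so delta_z ≈ z^2 * delta_d / (f * b).
// All numbers are hypothetical placeholders, NOT real TrueDepth specifications.
func depthError(atDistance z: Double,          // meters
                focalLengthPixels f: Double,
                baselineMeters b: Double,      // projector-to-IR-camera spacing
                disparityErrorPixels dd: Double) -> Double {
    return z * z * dd / (f * b)
}

let f = 600.0   // hypothetical focal length in pixels
let dd = 0.1    // hypothetical disparity uncertainty in pixels
// Same 25 cm object distance, baseline halved -> roughly double the depth error:
print(depthError(atDistance: 0.25, focalLengthPixels: f, baselineMeters: 0.02, disparityErrorPixels: dd))
print(depthError(atDistance: 0.25, focalLengthPixels: f, baselineMeters: 0.01, disparityErrorPixels: dd))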

Any news from an Apple Developer?

I have also noticed this issue. From the iPhone 13 on, the devices capture a point cloud that appears uneven and wavy, like ripples in water, when scanning the same wall. Do you have any solutions for this?
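One rough way to put a number on the waviness (my own approach, not an official metric, and assuming the wall points are already in camera space as SIMD3<Float> values) is to fit a plane to the wall's point cloud by least squares and look at the RMS deviation of the points from that plane:

import simd

// Least-squares fit of z = a*x + b*y + c to the wall points, then RMS of the residuals.
// A flat wall from a good sensor gives a small RMS; ripple-like waviness shows up as a
// larger value. `wallFlatnessRMS` is an illustrative name.
func wallFlatnessRMS(points: [SIMD3<Float>]) -> Float? {
    guard points.count >= 3 else { return nil }

    // Normal equations for the design matrix rows [x, y, 1]
    var ata = simd_float3x3()            // A^T A (starts as the zero matrix)
    var atb = SIMD3<Float>(repeating: 0) // A^T b
    for p in points {
        let row = SIMD3<Float>(p.x, p.y, 1)
        ata += simd_float3x3(columns: (row * row.x, row * row.y, row * row.z))
        atb += row * p.z
    }

    let coeffs = ata.inverse * atb       // (a, b, c); assumes a non-degenerate point set
    var sumSq: Float = 0
    for p in points {
        let residual = p.z - (coeffs.x * p.x + coeffs.y * p.y + coeffs.z)
        sumSq += residual * residual
    }
    return (sumSq / Float(points.count)).squareRoot()
}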

I can confirm your findings, too. Unfortunately, it seems like Apple chose cost reduction and a smaller sensor size over data quality when they switched the sensor manufacturer to LG starting with the iPhone 13.

It was a killer 3D sensor in many, many users' hands, with a lot of fans in the medical/body-scanning field. The newer sensor will scare away anybody who looks at the raw data. Sad story :-(

  • Do you know if it's the same on the iPhone 14?

  • It is the same for the iPhone 14, and the iPhone 15 appears to give even worse results :|. Besides the noise, we also discovered that on the iPhone 13, 14 and 15 the distances returned by the AVCaptureSession are wrong, and that the focal length from the camera calibration data is also wrong (also on the iPad Pro 12.9" 5th and 6th gen and iPad Pro 11" 3rd and 4th gen). See the sketch after this list for one way to check this yourself.

  • Thank you, GeertB. I've been looking for this specific information for... too long. I would appreciate any visuals to help substantiate this claim, like the ones Holger posted. BTW: if you install Live Link Face (free, from Epic Games) and go to Metahuman Animator > Settings > Preview Depth, you can display the captured depth and take a screenshot of it. Of course, there are probably easier ways to get that information via programming.
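For anyone who wants to check the distance/focal-length claims on their own device: a rough sketch (the function name is mine, and whether calibration data is attached to streamed depth frames depends on how the session is configured) that back-projects one depth pixel using the per-frame camera calibration, so the result can be compared against a tape measure:

import AVFoundation
import CoreGraphics
import simd

// Back-project one depth pixel into camera space using the calibration data attached
// to the depth frame, so the computed distance can be compared to a tape measure.
// `backProject` is an illustrative name; cameraCalibrationData may be nil depending
// on how the capture session is configured.
func backProject(depthData: AVDepthData, x: Int, y: Int) -> SIMD3<Float>? {
    guard let calibration = depthData.cameraCalibrationData else { return nil }

    let depth32 = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = depth32.depthDataMap
    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    guard x >= 0, x < width, y >= 0, y < height,
          let base = CVPixelBufferGetBaseAddress(map) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)
    let z = base.advanced(by: y * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)[x]      // depth in meters

    // Intrinsics refer to the full-resolution reference frame; scale to the depth map size
    let k = calibration.intrinsicMatrix
    let ref = calibration.intrinsicMatrixReferenceDimensions
    let sx = Float(width) / Float(ref.width)
    let sy = Float(height) / Float(ref.height)
    let fx = k.columns.0.x * sx, fy = k.columns.1.y * sy
    let cx = k.columns.2.x * sx, cy = k.columns.2.y * sy

    // Pinhole back-projection into the camera coordinate frame
    return SIMD3<Float>((Float(x) - cx) * z / fx,
                        (Float(y) - cy) * z / fy,
                        z)
}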


I can confirm this, too. When using Live Link Face with MetaHuman Animator (which utilizes the depth data) for high-quality facial motion capture, you get better results in Unreal Engine with the iPhone 11 and 12 than with the iPhone 13. While using it with the iPhone 13 is still far superior to ARKit-based capture, the iPhone 11 and 12 provide much greater fidelity. This is crazy. It makes the 12 an absolute gem for anyone doing face motion capture (as the 11 is officially unsupported).

What is clear is that if you can solve a problem in software, you can always save on hardware costs. Even if the raw data quality is low, if the desired results can be achieved through software, inexpensive hardware can be used. That is where the role of mathematics stands out.

Here is an example (from 2019) of processing 3D face point cloud data with CurvSurf FindSurface.

Nose extraction - iPhone X, SR300 https://youtu.be/eCmVYl3GIEY

Thanks, JoonAhn. I tried to check out that video, but it shows as private.

  • Sorry... The YouTube video is now visible; i.e., you only need to know the URL to view it.

  • The video "Nose extraction - iPhone X, SR300" https://youtu.be/eCmVYl3GIEY is really old one. Since 2020 Apple LiDAR and depthMap are available, there new technical possibilities are realized and demonstrated.

    YouTube CurvSurf https://www.youtube.com/CurvSurf

    GitHub CurvSurf https://github.com/CurvSurf
