Objective and steps
Use the device's front TrueDepth camera (iPhone 12 Pro Max) to capture image data, Live Photo data, and metadata (e.g. depth data and the portrait effects matte) through the AVFoundation capture pipeline into an AVCapturePhoto object. Save this captured object, with its metadata, to the PHPhotoLibrary using the PHAssetCreationRequest API.
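For context, the capture configuration looks roughly like this (a simplified sketch; the session and front-camera input setup are omitted). Matte delivery has to be enabled on the output before it can be requested per capture in the settings:

```swift
import AVFoundation

// Sketch: enable depth and portrait effects matte delivery, assuming
// `output` has already been added to a session whose input is the
// front TrueDepth camera.
func makePhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // These only take effect when the active device supports them.
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    output.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliverySupported

    let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliveryEnabled
    return settings
}
```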
Result
Image data, Live Photo data, disparity depth data (640x480 px), and some metadata are stored with the image through the PHPhotoLibrary API, but the high-quality portrait effects matte is lost.
Notes
Upon receiving the AVCapturePhoto object from the AVFoundation capture delegate API, I can verify that the AVCapturePhoto object contains a high-quality portrait effects matte member object. Using the object's fileDataRepresentation() to obtain a Data blob, writing that to a test file URL, and reading it back, I can see that the flattened data API writes and restores the portrait effects matte.
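The round-trip check described above can be sketched as follows (the temporary file name is illustrative):

```swift
import AVFoundation
import ImageIO

// Sketch: flatten the photo to disk, re-open it with ImageIO, and look
// for the portrait effects matte's auxiliary data dictionary.
func matteSurvivesRoundTrip(_ photo: AVCapturePhoto) -> Bool {
    guard let data = photo.fileDataRepresentation() else { return false }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("matte-test.heic") // illustrative name
    guard (try? data.write(to: url)) != nil,
          let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return false
    }
    let matteInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte)
    return matteInfo != nil // non-nil here: the matte survives the file round trip
}
```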
However, the matte gets stripped from the data when writing through the PHPhotoLibrary asset creation request. When later picking the image, e.g. with PHPickerViewController + PHPickerResult, and peeking into the object's data with CGImageSourceCopyAuxiliaryDataInfoAtIndex(), I can see that there is an auxiliary data dictionary only for the key kCGImageAuxiliaryDataTypeDisparity; both kCGImageAuxiliaryDataTypeDepth and kCGImageAuxiliaryDataTypePortraitEffectsMatte are missing.
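For reference, the save path is roughly the following (a minimal sketch of my code; error handling omitted). The matte is present in `data` going into the request, but absent from the saved asset:

```swift
import AVFoundation
import Photos

// Sketch: save the flattened AVCapturePhoto via PHAssetCreationRequest.
func save(_ photo: AVCapturePhoto) {
    guard let data = photo.fileDataRepresentation() else { return }
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, data: data, options: nil)
    }, completionHandler: { success, error in
        // Reports success, yet the saved asset's portrait effects matte is gone.
    })
}
```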
Does anyone have more detailed information on whether this is possible at all? Thanks!