AVFoundation with LiDAR and this year's RealityKit Object Capture.

With AVFoundation's builtInLiDARDepthCamera, if I save photo.fileDataRepresentation() as HEIC, the file only contains Exif and TIFF metadata.
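Roughly, this is how I save the capture and check what metadata the file ends up with (a minimal sketch; the session setup is omitted, and the class and output names are just placeholders I made up):

import AVFoundation
import ImageIO

final class PhotoSaver: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }

        // Write the HEIC data to disk (example path).
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.heic")
        try? data.write(to: url)

        // Inspect which metadata dictionaries the file actually contains.
        if let source = CGImageSourceCreateWithData(data as CFData, nil),
           let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
            // On my device this shows only the Exif and TIFF dictionaries plus the
            // basic pixel keys; no camera model or extrinsics entries.
            print(props.keys)
        }
    }
}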

But the HEIC images produced by RealityKit's Object Capture contain not only Exif and TIFF, but also HEIC-level metadata including the camera calibration data.

What should I do so that the image exported from AVFoundation has the same metadata?

Replies

Hi - Did you find out how to overcome the issue?

I noticed that the Object Capture API is embedding rotation, position, and intrinsic information into the HEIC metadata.... I know it is possible to get this information if you use SceneKit and ARKit on iOS, so what I am actually interested in is: how do you insert this metadata under the correct metadata dictionary key so it gets embedded into the HEIC image?

ARKit is amazing.... it lets you access other things like the LiDAR depth maps (not disparity... actual float data).
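In case it helps anyone, this is roughly how you get those values out of ARKit (a minimal sketch; the usual session/view setup is omitted and the class name is just a placeholder):

import ARKit
import simd

final class DepthGrabber: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Needs a LiDAR device; .sceneDepth gives metric depth in meters, not disparity.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Pinhole intrinsics (fx, fy, cx, cy) as a 3x3 matrix.
        let intrinsics: simd_float3x3 = frame.camera.intrinsics
        // Camera pose in world space: the rotation + position ("extrinsics").
        let extrinsics: simd_float4x4 = frame.camera.transform

        // Float32 depth map (kCVPixelFormatType_DepthFloat32), one value per pixel, in meters.
        if let depthMap = frame.sceneDepth?.depthMap {
            print(CVPixelBufferGetWidth(depthMap), CVPixelBufferGetHeight(depthMap),
                  intrinsics, extrinsics)
        }
    }
}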

Hi HeoJin,

I think there might be some information here... https://developer.apple.com/documentation/imageio/image_i_o_constants

I noticed that since iOS 16 Apple added these extra constants to ImageIO:

kIIOCameraExtrinsics_CoordinateSystemID: CFString
kIIOCameraExtrinsics_Position: CFString
kIIOCameraExtrinsics_Rotation: CFString
kIIOCameraModelType_GenericPinhole: CFString
kIIOCameraModelType_SimplifiedPinhole: CFString
kIIOCameraModel_Intrinsics: CFString
kIIOCameraModel_ModelType: CFString
kIIOMetadata_CameraExtrinsicsKey: CFString
kIIOMetadata_CameraModelKey: CFString

I think these are the same names that appear in the metadata when you view one of the images from the guided capture sample app in Photos on macOS....

I cannot figure out how to write these fields into the metadata... There doesn't seem to be a dedicated dictionary for them like there is for Exif or GPS....
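The closest I've come up with is passing those keys in the properties dictionary given to CGImageDestinationAddImage, the same way you would pass an Exif dictionary. To be clear, this is an untested guess: I haven't confirmed that Image I/O actually serializes these keys into the HEIC, and the value shapes (flat Float arrays for the matrices, the simplified-pinhole model type) are my own assumptions:

import Foundation
import ImageIO
import UniformTypeIdentifiers

// Untested sketch: attempt to attach camera model / extrinsics metadata while
// encoding a HEIC. Whether Image I/O honors these keys here is unverified.
func writeHEIC(cgImage: CGImage, to url: URL,
               intrinsics: [Float], rotation: [Float], position: [Float]) -> Bool {
    guard let dest = CGImageDestinationCreateWithURL(url as CFURL,
                                                     UTType.heic.identifier as CFString,
                                                     1, nil) else { return false }

    // Assumed value shapes: row-major 3x3 intrinsics, 3x3 rotation, xyz position.
    let cameraModel: [CFString: Any] = [
        kIIOCameraModel_ModelType: kIIOCameraModelType_SimplifiedPinhole,
        kIIOCameraModel_Intrinsics: intrinsics
    ]
    let cameraExtrinsics: [CFString: Any] = [
        kIIOCameraExtrinsics_Rotation: rotation,
        kIIOCameraExtrinsics_Position: position
    ]
    let properties: [CFString: Any] = [
        kIIOMetadata_CameraModelKey: cameraModel,
        kIIOMetadata_CameraExtrinsicsKey: cameraExtrinsics
    ]

    CGImageDestinationAddImage(dest, cgImage, properties as CFDictionary)
    return CGImageDestinationFinalize(dest)
}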

apple engineers... help us!