Vision Pro: Access "People Awareness" in my own XR apps

I was wondering if through the "People Awareness" functionality we can access the approximate position of people when creating our own XR apps. For example, this could be used to adjust the position of virtual objects so they do not occlude or collide with the person.

You might want to check whether ARKit exposes a generic "human" model. ARKit can detect real-world objects, but only the generic object categories Apple provides... maybe a human model is among them.

If not, then the answer is no. Why? On Vision Pro, developers cannot access the camera feed, the passthrough, or the LiDAR data... Maybe the upcoming WWDC will change some things!

Thanks for the suggestion. I think "Face Tracking" and "Body Position Tracking" would do the trick. Unfortunately, they appear not to be available on visionOS; they are currently listed only under iOS: https://developer.apple.com/documentation/arkit
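For reference, on iOS the body-tracking feature mentioned above is exposed through `ARBodyTrackingConfiguration` and `ARBodyAnchor`. A minimal sketch of how a person's approximate position could be read there (again, this API is not available on visionOS, and the class below is a hypothetical wrapper, not Apple sample code):

```swift
import ARKit

// Minimal sketch: iOS-only body tracking via ARKit.
// Assumes an app that owns an ARSession; not available on visionOS.
final class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking requires an A12 chip or later.
        guard ARBodyTrackingConfiguration.isSupported else {
            print("Body tracking is not supported on this device")
            return
        }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    // ARBodyAnchor carries the tracked person's root transform in world
    // space, which could be used to keep virtual objects from occluding
    // or colliding with them.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            let position = bodyAnchor.transform.columns.3
            print("Person root at x:\(position.x) y:\(position.y) z:\(position.z)")
        }
    }
}
```

If something equivalent ever ships on visionOS, one would expect it to surface as a data provider on `ARKitSession` rather than through `ARSession` as above.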
