Turn a physical surface into a touchscreen in visionOS

In visionOS, is it possible to detect when a user is touching a physical surface in the real world, and also to project 2D graphics onto that surface? Imagine a windowless 2D app projected onto a surface, essentially turning a physical wall, table, etc. into a giant touchscreen.

So kinda like this:

https://appleinsider.com/articles/23/06/23/vision-pro-will-turn-any-surface-into-a-display-with-touch-control

But I want every surface in the room to be touchable and be able to display 2D graphics on the face of that surface and not floating in space. So essentially turning every physical surface in the room into a UIView.

Thanks!

Yes. Surface geometry can be retrieved, textures on that geometry can be drawn and animated, and hand positions can be retrieved. It's all possible.
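For the "touch" part, visionOS ARKit exposes hand-joint transforms through `HandTrackingProvider`. A minimal sketch (the touch-detection logic itself is left as a comment, since it depends on your surface representation):

```swift
import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

func trackFingertips() async throws {
    try await session.run([handTracking])

    // Receive a stream of HandAnchor updates for both hands.
    for await update in handTracking.anchorUpdates {
        let hand = update.anchor
        guard let indexTip = hand.handSkeleton?.joint(.indexFingerTip),
              indexTip.isTracked else { continue }

        // World-space transform of the index fingertip:
        // anchor-to-world * joint-to-anchor.
        let fingertipTransform =
            hand.originFromAnchorTransform * indexTip.anchorFromJointTransform

        // To detect a "touch", compare the fingertip position against the
        // detected surface geometry (e.g. distance to the plane below a
        // small threshold), or raycast against collision shapes built from
        // the scene mesh.
        _ = fingertipTransform
    }
}
```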

ARKit provides information about the position, orientation, and size of real horizontal or vertical planes (ARPlaneAnchor on iOS; on visionOS the equivalent is PlaneAnchor, delivered through a PlaneDetectionProvider). https://developer.apple.com/documentation/arkit/arplaneanchor
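On visionOS, plane detection looks roughly like this (a sketch; plane detection requires an ImmersiveSpace and world-sensing permission):

```swift
import ARKit

let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

func detectSurfaces() async throws {
    try await session.run([planeDetection])

    // Each update describes a detected plane: its pose, size, and
    // classification (wall, table, floor, ceiling, ...).
    for await update in planeDetection.anchorUpdates {
        let plane = update.anchor
        print(plane.classification,
              plane.geometry.extent,            // plane dimensions
              plane.originFromAnchorTransform)  // world-space pose
    }
}
```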

In addition, LiDAR 3D measurement information (ARDepthData on iOS) is processed into mesh information (ARMeshAnchor; on visionOS, MeshAnchor delivered through a SceneReconstructionProvider). https://developer.apple.com/documentation/arkit/armeshanchor
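On visionOS, the reconstructed scene mesh can also be turned into RealityKit collision geometry, which is useful for detecting fingertip contact with arbitrary (not just planar) surfaces. A sketch:

```swift
import ARKit
import RealityKit

let session = ARKitSession()
let sceneReconstruction = SceneReconstructionProvider()

func buildCollisionMeshes() async throws {
    try await session.run([sceneReconstruction])

    for await update in sceneReconstruction.anchorUpdates {
        let meshAnchor = update.anchor  // MeshAnchor

        // Convert the reconstructed mesh into a static collision shape,
        // so entities (or fingertip raycasts) can interact with it.
        let shape = try await ShapeResource.generateStaticMesh(from: meshAnchor)

        let entity = Entity()
        entity.components.set(CollisionComponent(shapes: [shape]))
        entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
        // Add `entity` to your RealityKit scene here.
    }
}
```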

So, in theory, app developers can render 2D graphics, photos, and videos onto the surfaces described by these plane and mesh anchors.
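Displaying 2D content "on the face" of a detected surface can be done in RealityKit by sizing a flat quad to the plane's extent and posing it with the anchor's transform. A minimal sketch (`makeSurfaceView` is a hypothetical helper name; a real app would replace the plain color with a `TextureResource`, e.g. one driven by a `DrawableQueue`, to show live 2D content):

```swift
import ARKit
import RealityKit

// Hypothetical helper: build an entity that covers a detected plane.
func makeSurfaceView(for plane: PlaneAnchor) -> Entity {
    // Quad matching the detected plane's dimensions.
    let mesh = MeshResource.generatePlane(
        width: plane.geometry.extent.width,
        depth: plane.geometry.extent.height)

    // Placeholder material; swap in a textured material for real content.
    let material = UnlitMaterial(color: .white)

    let entity = ModelEntity(mesh: mesh, materials: [material])

    // Pose the quad exactly where ARKit detected the plane.
    entity.transform = Transform(matrix: plane.originFromAnchorTransform)
    return entity
}
```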

In addition, by analyzing the depth or mesh data, an app can accurately determine the shape, size, position, and orientation of real curved surfaces in real time, enabling a wide range of AR applications.

https://www.youtube.com/watch?v=BmKNmZCiMkw
https://www.youtube.com/watch?v=9QkSPkLIfWU
