[VisionPro] Placing virtual entities around my arm

Hi everyone, I'm developing an MR Vision Pro app where I'd like to anchor virtual objects (such as UI elements) around the user's arm. However, I've noticed that Vision Pro seems to mask out the region where the user's real arm is, hiding virtual content in that area so that the real arm stays visible in passthrough.

Is there a way to render virtual elements on the user's arm—so that it looks like the object is placed directly on the arm despite the real-world passthrough? I was hoping there might be a way to adjust the depth or behavior of this masked-out region. Any insights or workarounds would be greatly appreciated! Thanks :)

Hi, we can use ARKit hand anchors to add items to the user's hands. There are a ton of joints for each finger, plus the palm, wrist, etc., but I don't think we get access to much of the arm beyond the hands. If you need more information on hand tracking, check out these two posts on the forum (there's also a rough code sketch below the links):

https://developer.apple.com/forums/thread/774522?answerId=825213022#825213022

https://developer.apple.com/forums/thread/776079?answerId=828469022#828469022
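Here's a rough sketch of what reading hand joints with ARKit's HandTrackingProvider can look like, so you can position entities yourself. This is just a sketch, assuming you're running inside an ImmersiveSpace with hand-tracking authorization; startHandTracking and contentEntity are placeholder names for your own setup.

import ARKit
import RealityKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

// Moves `contentEntity` so it follows the user's left wrist.
func startHandTracking(contentEntity: Entity) async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        // Only track the left hand in this example.
        guard anchor.chirality == .left,
              anchor.isTracked,
              let skeleton = anchor.handSkeleton else { continue }
        // .wrist is the joint closest to the arm; there are also forearm joints
        // (.forearmWrist, .forearmArm) if you need something slightly further up.
        let wrist = skeleton.joint(.wrist)
        // World-space transform = origin -> hand anchor -> joint.
        let worldTransform = anchor.originFromAnchorTransform * wrist.anchorFromJointTransform
        contentEntity.setTransformMatrix(worldTransform, relativeTo: nil)
    }
}

If you just need an entity to follow the hand without reading joints yourself, a RealityKit AnchorEntity with a hand target (e.g. AnchorEntity(.hand(.left, location: .palm))) is a simpler route.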

As for the rendering issue you're describing, where the passthrough arms occlude virtual content: can you try the upperLimbVisibility(_:) modifier on your ImmersiveSpace scene? Try setting it to .hidden to see if that helps.

.upperLimbVisibility(.automatic) // default; the system decides when to show the hands/arms
.upperLimbVisibility(.hidden) // hides the passthrough hands/arms so virtual content can render over them
.upperLimbVisibility(.visible) // always shows the passthrough hands/arms
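
For context, the modifier is applied at the scene level, so it usually goes on the ImmersiveSpace itself. A minimal sketch, where MyImmersiveApp, ImmersiveView, and the space id are placeholders for your own app:

import SwiftUI

@main
struct MyImmersiveApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "armUI") {
            ImmersiveView()
        }
        // Hide the passthrough hands/arms so virtual content can draw over them.
        .upperLimbVisibility(.hidden)
    }
}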