In WWDC24, visionOS hand tracking gained a new option that lets an entity track the hand with lower latency (at the expense of some accuracy). The session video only explains how to implement this with ARKit, so could someone explain how to implement it with an AnchorEntity in a RealityView?
Quickly track your hands
Hi @lijiaxu ,
If you mean how to use an AnchorEntity in an immersive space to add entities that track your hand, here's a code snippet showing how to do that. One thing to note: anchor entities cannot receive collision events from other entities, since they're in different coordinate spaces.
import SwiftUI
import RealityKit

struct HandAnchoredEntityView: View {
    var body: some View {
        RealityView { content, attachments in
            // Create an `AnchorEntity` anchored to the right hand.
            let handAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip)) // choose any location here

            // Add the anchor to the scene.
            content.add(handAnchor)

            // Find the entity for the attachment. You can also use an entity loaded from Reality Composer Pro.
            if let entity = attachments.entity(for: "handAttachment") {
                handAnchor.addChild(entity)
            }
        } attachments: {
            Attachment(id: "handAttachment") {
                VStack {
                    Text("hello world")
                }
                .padding()
                .glassBackgroundEffect()
            }
        }
    }
}
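If you need the tracked entity to participate in collisions, one alternative (a sketch, assuming the visionOS ARKit APIs shown in the WWDC session: ARKitSession, HandTrackingProvider, and HandAnchor) is to update an entity's transform manually from hand-anchor updates. The entity then lives in the scene's own coordinate space, so collision events work. The model class name and sphere entity here are illustrative, not from the original thread.

```swift
import ARKit
import RealityKit

// Sketch: manually driving an entity from HandTrackingProvider updates.
// Unlike a hand-target AnchorEntity, this entity stays in the scene's
// coordinate space, so it can generate collision events.
@MainActor
final class HandTrackingModel {
    let session = ARKitSession()
    let provider = HandTrackingProvider()

    // Hypothetical entity to place at the fingertip; give it a
    // CollisionComponent if you want collision events.
    let fingertipEntity = ModelEntity(mesh: .generateSphere(radius: 0.01))

    func run() async {
        do {
            try await session.run([provider])
        } catch {
            print("Failed to start hand tracking: \(error)")
            return
        }
        for await update in provider.anchorUpdates {
            let handAnchor = update.anchor
            guard handAnchor.chirality == .right,
                  let fingertip = handAnchor.handSkeleton?.joint(.indexFingerTip)
            else { continue }
            // Compose joint space -> hand anchor space -> world space.
            let worldTransform = handAnchor.originFromAnchorTransform
                * fingertip.anchorFromJointTransform
            fingertipEntity.setTransformMatrix(worldTransform, relativeTo: nil)
        }
    }
}
```

Add `fingertipEntity` to your RealityView content and kick off `run()` from a `.task` modifier; hand tracking also requires the NSHandsTrackingUsageDescription entry in your Info.plist.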
Got it, sorry for the misunderstanding! You can use the predicted tracking mode: https://developer.apple.com/documentation/realitykit/anchoringcomponent/trackingmode-swift.struct/predicted
Try this:
let handAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip),
                              trackingMode: .predicted)

// Add the anchor to the scene.
content.add(handAnchor)

// Find the entity for the attachment.
if let entity = attachments.entity(for: "handAttachment") {
    handAnchor.addChild(entity)
}