I am a newbie to spatial computing, and I am using ARKit and RealityKit to develop a Vision Pro app.
I want to accomplish the following goal: when the user's hand touches an object (an entity in a RealityView) on the table, the app presents a window. But I do not know how to handle the event "the user's hand touches the object". Should I use the hand tracking feature and do the computation myself, or is there an API I can use directly?
Thank you!
We can use ARKit hand tracking, or use an AnchorEntity with a SpatialTrackingSession. Here is an example with SpatialTrackingSession. It adds anchors to the user's hands, then enables those anchors to collide with other entities in the scene. Once you detect a collision, you can execute some code to show your window or attachment.
Important: make sure to set this value to .none, or the anchor will not be able to interact with other entities.
leftIndexAnchor.anchoring.physicsSimulation = .none
This example uses trigger collisions instead of physics. The entities were created in Reality Composer Pro, then loaded in the RealityView.
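If you build your entities in code instead of Reality Composer Pro, you have to give them their trigger collision shapes yourself. A minimal sketch, assuming a small placeholder sphere (the radius is arbitrary):

// Hypothetical entity built in code rather than Reality Composer Pro.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.01))
// A .trigger collision shape reports CollisionEvents but produces no physical response.
sphere.components.set(CollisionComponent(
    shapes: [.generateSphere(radius: 0.01)], // keep the shape in sync with the visual mesh
    mode: .trigger
))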
import SwiftUI
import RealityKit
import RealityKitContent

struct Example021: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "HandTrackingLabs", in: realityKitContentBundle) {
                content.add(scene)

                // 1. Set up a Spatial Tracking Session with hand tracking.
                // This adds ARKit features to our Anchor Entities, enabling collisions.
                let configuration = SpatialTrackingSession.Configuration(tracking: [.hand])
                let session = SpatialTrackingSession()
                await session.run(configuration)

                if let subject = scene.findEntity(named: "StepSphereRed"),
                   let stepSphereBlue = scene.findEntity(named: "StepSphereBlue"),
                   let stepSphereGreen = scene.findEntity(named: "StepSphereGreen") {
                    content.add(subject)

                    // 2. Create an anchor for the left index finger.
                    let leftIndexAnchor = AnchorEntity(.hand(.left, location: .indexFingerTip), trackingMode: .continuous)

                    // 3. Disable the default physics simulation on the anchor.
                    leftIndexAnchor.anchoring.physicsSimulation = .none

                    // 4. Add the sphere to the anchor, and add the anchor to the scene graph.
                    leftIndexAnchor.addChild(stepSphereBlue)
                    content.add(leftIndexAnchor)

                    // Repeat the same steps for the right index finger.
                    let rightIndexAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip), trackingMode: .continuous)
                    rightIndexAnchor.anchoring.physicsSimulation = .none
                    rightIndexAnchor.addChild(stepSphereGreen)
                    content.add(rightIndexAnchor)

                    // Example 1: Any entity can collide with any entity. Fire a particle burst.
                    // This allows collisions between the hand anchors,
                    // and between a hand anchor and the subject.
                    _ = content.subscribe(to: CollisionEvents.Began.self) { collisionEvent in
                        print("Collision unfiltered \(collisionEvent.entityA.name) and \(collisionEvent.entityB.name)")
                        collisionEvent.entityA.components[ParticleEmitterComponent.self]?.burst()
                    }

                    // Example 2: Only track collisions on the subject.
                    // Swap the color of the material based on left or right hand.
                    _ = content.subscribe(to: CollisionEvents.Began.self, on: subject) { collisionEvent in
                        print("Collision Subject Color Change \(collisionEvent.entityA.name) and \(collisionEvent.entityB.name)")
                        if collisionEvent.entityB.name == "StepSphereBlue" {
                            swapColorEntity(subject, color: .stepBlue)
                        } else if collisionEvent.entityB.name == "StepSphereGreen" {
                            swapColorEntity(subject, color: .stepGreen)
                        }
                    }
                }
            }
        }
    }

    func swapColorEntity(_ entity: Entity, color: UIColor) {
        if var mat = entity.components[ModelComponent.self]?.materials.first as? PhysicallyBasedMaterial {
            mat.baseColor = .init(tint: color)
            entity.components[ModelComponent.self]?.materials[0] = mat
        }
    }
}
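To get from the collision back to the original goal of presenting a window, you can call SwiftUI's openWindow environment action from the collision subscription. A hedged sketch, not a drop-in implementation: TouchToOpenWindow and the "InfoWindow" id are placeholder names, and it assumes your App declares a matching WindowGroup(id: "InfoWindow").

import SwiftUI
import RealityKit
import RealityKitContent

struct TouchToOpenWindow: View {
    // Standard SwiftUI environment action that opens a WindowGroup by its id.
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        RealityView { content in
            // Build the scene and hand anchors exactly as in the example above.
            guard let scene = try? await Entity(named: "HandTrackingLabs", in: realityKitContentBundle),
                  let subject = scene.findEntity(named: "StepSphereRed") else { return }
            content.add(scene)
            // When a hand anchor touches the subject, present the window.
            _ = content.subscribe(to: CollisionEvents.Began.self, on: subject) { _ in
                openWindow(id: "InfoWindow") // requires WindowGroup(id: "InfoWindow") in your App
            }
        }
    }
}

Note that CollisionEvents.Began can fire repeatedly as the finger re-enters the collision shape, so in practice you may want to track whether the window is already open before calling openWindow again.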