In visionOS, virtual content is occluded by the user's hands by default. In a mixed immersive space, if an entity is positioned behind a real-world object, how can I have that object in the room occlude the virtual content, the same way the hands do?
Hi @lijiaxu
If I'm understanding correctly, you would like to have virtual objects be occluded by real-world objects. Is that correct? For example, you might want a virtual ball to be occluded by a real-world table when it rolls under the table.
If so, this forum response may answer your question: https://developer.apple.com/forums/thread/749759
In short, you can get a mesh representing the shape of real-world objects in your environment with scene reconstruction or plane detection, and then apply an OcclusionMaterial to that mesh so those objects properly occlude your virtual content.
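If you already know roughly where a real object sits, you can also demonstrate the idea without any sensing at all by hand-placing proxy geometry. This is only a minimal sketch, not from the linked thread; the function name, dimensions, and position are placeholders:

import RealityKit

// Minimal sketch: an invisible proxy box sized and positioned to roughly match
// a real table. OcclusionMaterial shows passthrough instead of the virtual
// content behind it, so anything virtual behind or under the box is hidden.
// The size and position below are placeholder values.
func makeTableOccluder() -> ModelEntity {
    let occluder = ModelEntity(
        mesh: .generateBox(width: 1.2, height: 0.75, depth: 0.6),
        materials: [OcclusionMaterial()]
    )
    occluder.position = [0, 0.375, -1.0]
    return occluder
}

You would add the returned entity to your RealityView content like any other entity. Scene reconstruction, shown below, automates this by generating the proxy meshes for you.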
Let me know if you run into any issues!
Hi @lijiaxu
Here's a modified version of the code from the scene reconstruction sample that creates a mesh from your surroundings with an occlusion material:
import SwiftUI
import RealityKit
import ARKit

struct ImmersiveView: View {
    let sceneMaterial = OcclusionMaterial()
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    @State private var meshEntities = [UUID: ModelEntity]()
    var contentEntity = Entity()

    var body: some View {
        RealityView { content in
            // Run scene reconstruction.
            try? await session.run([sceneReconstruction])
            // Add the content entity to the scene.
            content.add(contentEntity)
        }.task {
            // Asynchronously process scene reconstruction updates.
            await processReconstructionUpdates()
        }
    }

    func processReconstructionUpdates() async {
        for await update in sceneReconstruction.anchorUpdates {
            let meshAnchor = update.anchor
            // Create a mesh for the current anchor.
            guard let mesh = try? await MeshResource(from: meshAnchor) else { continue }
            switch update.event {
            case .added:
                let entity = ModelEntity()
                entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
                // Apply the mesh to the entity with the current scene material.
                entity.model = ModelComponent(mesh: mesh, materials: [sceneMaterial])
                meshEntities[meshAnchor.id] = entity
                contentEntity.addChild(entity)
            case .updated:
                guard let entity = meshEntities[meshAnchor.id] else { continue }
                entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
                // Apply the updated mesh to the entity with the current scene material.
                entity.model = ModelComponent(mesh: mesh, materials: [sceneMaterial])
            case .removed:
                meshEntities[meshAnchor.id]?.removeFromParent()
                meshEntities.removeValue(forKey: meshAnchor.id)
            }
        }
    }
}
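To check that occlusion is working, it can help to drop a visible test object into the scene and watch it disappear behind reconstructed geometry. This is just a sketch, not part of Apple's sample; the placement values are placeholders, and you could put it inside the RealityView make closure after content.add(contentEntity):

// Test object (placeholder values): a visible sphere placed in front of the
// origin so you can verify it is hidden when real geometry is between you
// and the sphere.
let ball = ModelEntity(
    mesh: .generateSphere(radius: 0.1),
    materials: [SimpleMaterial(color: .orange, isMetallic: false)]
)
ball.position = [0, 1.0, -1.5]
contentEntity.addChild(ball)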
Feel free to change the sceneMaterial to a SimpleMaterial() if you want to visualize the scene reconstruction mesh and verify that it is working correctly.
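For example (the color choice here is just an illustration):

// Debug material: renders the reconstruction mesh visibly instead of occluding.
let sceneMaterial = SimpleMaterial(color: .green, isMetallic: false)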
Also, be sure to add an entry for NSWorldSensingUsageDescription to your app's information property list to provide a usage description that explains how your app uses the world sensing information.
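For example, the raw Info.plist entry might look like this (the description string below is only a placeholder; describe your app's actual use):

<key>NSWorldSensingUsageDescription</key>
<string>Scans the shape of your surroundings so virtual objects can be hidden behind real ones.</string>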