RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

RealityView: toggle entities' activation
I'm trying to create a feature in my visionOS app where I show a RealityView and toggle different entities on a button click, i.e. show or hide them. Is this possible in visionOS, and if so, how can I do it? All I've done so far is instantiate my scene, which contains a car 3D model, red tyres, and blue tyres. On a button click I'm trying to show the blue tyres instead of the red ones. Is this possible? Thank you.
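
One way to do this (a minimal sketch; the scene name "CarScene" and the entity names "RedTyres" and "BlueTyres" are hypothetical placeholders for whatever your Reality Composer Pro scene uses) is to look the entities up with findEntity(named:) and flip their isEnabled flags from a button:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct CarView: View {
    @State private var showBlueTyres = false

    var body: some View {
        VStack {
            RealityView { content in
                // Load the scene containing the car and both tyre sets.
                if let scene = try? await Entity(named: "CarScene", in: realityKitContentBundle) {
                    content.add(scene)
                }
            } update: { content in
                // Disabled entities are neither rendered nor hit-tested.
                if let root = content.entities.first {
                    root.findEntity(named: "RedTyres")?.isEnabled = !showBlueTyres
                    root.findEntity(named: "BlueTyres")?.isEnabled = showBlueTyres
                }
            }
            Button("Swap tyres") { showBlueTyres.toggle() }
        }
    }
}
```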

Replies: 1 · Boosts: 0 · Views: 550 · Sep ’23

RealityKit box cannot be equipped with different material structures on visionOS
I have generated a box in RealityKit with the splitFaces property set to true to allow a different material on each cube side. Applying different SimpleMaterials (e.g. with different colors) works fine in the Vision Pro simulator, but combining VideoMaterial and SimpleMaterial does not. BTW: a cube with six video faces renders successfully, so the problem seems to be mixing material structures. Here's my relevant code snippet:

```swift
let mesh = MeshResource.generateBox(width: 0.3, height: 0.3, depth: 0.3, splitFaces: true)
let mat1 = VideoMaterial(avPlayer: player)
let mat2 = SimpleMaterial(color: .blue, isMetallic: true)
let mat3 = SimpleMaterial(color: .red, isMetallic: true)
let cube = ModelEntity(mesh: mesh, materials: [mat1, mat2, mat3, mat1, mat2, mat3])
```

In detail, the video textures are shown whereas the simple surfaces are invisible. Is this a problem of the Vision Pro simulator, or is it not possible to combine different material structures on one box? Any help is welcome!
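
If mixing material types on a single split-face mesh turns out to be unsupported, a possible workaround (a sketch under that assumption, not a confirmed fix) is to assemble the cube from six separate plane entities, so each face is its own ModelEntity and can carry any material type:

```swift
import RealityKit

/// Builds a 0.3 m cube out of six individual plane entities,
/// so each face can use a different material type.
func makeFaceCube(materials: [Material]) -> Entity {
    precondition(materials.count == 6)
    let half: Float = 0.15
    // generatePlane(width:height:) yields an XY plane facing +Z.
    let plane = MeshResource.generatePlane(width: 0.3, height: 0.3)

    // (position, rotation) for the +Z, -Z, +X, -X, +Y, -Y faces.
    let poses: [(SIMD3<Float>, simd_quatf)] = [
        ([0, 0,  half], simd_quatf(angle: 0,        axis: [0, 1, 0])),
        ([0, 0, -half], simd_quatf(angle: .pi,      axis: [0, 1, 0])),
        ([ half, 0, 0], simd_quatf(angle:  .pi / 2, axis: [0, 1, 0])),
        ([-half, 0, 0], simd_quatf(angle: -.pi / 2, axis: [0, 1, 0])),
        ([0,  half, 0], simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])),
        ([0, -half, 0], simd_quatf(angle:  .pi / 2, axis: [1, 0, 0])),
    ]

    let cube = Entity()
    for (index, pose) in poses.enumerated() {
        let face = ModelEntity(mesh: plane, materials: [materials[index]])
        face.position = pose.0
        face.orientation = pose.1
        cube.addChild(face)
    }
    return cube
}
```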

Replies: 1 · Boosts: 0 · Views: 451 · Sep ’23

Can't center entity on AnchorEntity(.plane)
How can entities be centered on a plane AnchorEntity? Beyond the box's basic offset from the anchor's center, the offset also varies with the user's location in the space when the app starts. This is my code:

```swift
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            let wall = AnchorEntity(.plane(.vertical, classification: .wall,
                                           minimumBounds: [2.0, 1.5]),
                                    trackingMode: .continuous)
            let mesh = MeshResource.generateBox(size: 0.3)
            let box = ModelEntity(mesh: mesh,
                                  materials: [SimpleMaterial(color: .green, isMetallic: false)])
            box.setParent(wall)
            content.add(wall)
        }
    }
}
```

With PlaneDetectionProvider being unavailable in the simulator, I currently don't see another way to set up entities at least somewhat consistently at anchors in a full space.
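
On a physical device (not the simulator), one alternative worth sketching is to read the plane transform directly from ARKit's PlaneDetectionProvider instead of relying on AnchorEntity: the anchor's originFromAnchorTransform is centered on the detected plane, so placing an entity at that transform should center it deterministically. A rough sketch:

```swift
import ARKit
import RealityKit

// Requires a running ImmersiveSpace and world-sensing permission.
@MainActor
func trackWalls(root: Entity) async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.vertical])
    try await session.run([planes])

    let box = ModelEntity(mesh: .generateBox(size: 0.3),
                          materials: [SimpleMaterial(color: .green, isMetallic: false)])
    root.addChild(box)

    for await update in planes.anchorUpdates {
        guard update.anchor.classification == .wall else { continue }
        // originFromAnchorTransform is the plane's center in world space.
        box.setTransformMatrix(update.anchor.originFromAnchorTransform, relativeTo: nil)
    }
}
```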

Replies: 1 · Boosts: 0 · Views: 613 · Sep ’23

Sources for Explosions and Other Assets in USDZ format?
Does anyone know where I can find quality assets in USDZ format? For Unity and Unreal Engine, I just use the built-in asset stores. There seem to be a number of third-party 3D model stores like Laughing Squid, but they tend not to have models in USD format. In particular, I'm looking for some nice-looking explosions for a RealityKit-based visionOS game I'm writing. Some nice boulders would also be useful. Thanks in advance!

Replies: 0 · Boosts: 0 · Views: 425 · Sep ’23

Dragging coordinates issue in visionOS
I am attempting to place images on wall anchors and move them around with drag gestures. This seems pretty straightforward if the wall anchor is facing you when you start the app. But if you place an image on a wall anchor to your left, or on the wall behind the original position, the logic stops working properly. The problem seems to be that the anchor's orientation and the drag's location3D orientation don't coincide once you are dealing with wall anchors that are not facing the user's original position (using Xcode beta 8). Question: how do I apply drag gestures to an image regardless of where the wall anchor is located relative to the user's original facing direction? I'm using the following code:

```swift
var dragGesture: some Gesture {
    DragGesture(minimumDistance: 0)
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            let convertedPos = value.convert(value.location3D, from: .local,
                                             to: entity.parent!) * 0.1
            entity.position = SIMD3<Float>(x: convertedPos.x, y: 0,
                                           z: convertedPos.y * (-1))
        }
}
```
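
One hedged observation about the snippet above: the manual axis swizzle (x, 0, -y) bakes in the assumption that the parent plane is floor-like. Since value.convert(_:from:to:) already returns the point in the parent entity's own space, passing the converted vector through unmodified should behave the same for any wall orientation. A sketch:

```swift
var dragGesture: some Gesture {
    DragGesture(minimumDistance: 0)
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            // convert(_:from:to:) already accounts for the anchor's
            // orientation, so no manual axis swapping is needed.
            entity.position = value.convert(value.location3D,
                                            from: .local,
                                            to: entity.parent!)
        }
}
```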

Replies: 5 · Boosts: 1 · Views: 989 · Sep ’23

Where to apply "grounding shadow" in Reality Composer Pro?
So there's a "grounding shadow" component we can add to any entity in Reality Composer Pro. My use case is Apple Vision Pro + RealityKit, and I'm wondering: by default, I'd want to add this to every ModelEntity in my scene... right? Should we add this component once to the root transform, or to each model entity individually? Or should we not add it at all, and RealityKit will do it for us? And does it also depend on whether we use a volume or a full space?
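
For what it's worth, grounding shadows only make sense on entities that render a model, so one approach (a sketch, not official guidance) is to walk the hierarchy once and set the component on every entity that has a ModelComponent:

```swift
import RealityKit

/// Recursively adds a grounding shadow to every entity under `root`
/// that actually renders a model.
func addGroundingShadows(to root: Entity) {
    if root.components.has(ModelComponent.self) {
        root.components.set(GroundingShadowComponent(castsShadow: true))
    }
    for child in root.children {
        addGroundingShadows(to: child)
    }
}
```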

Replies: 2 · Boosts: 0 · Views: 689 · Sep ’23

OrbitAnimation does not work.
Hi, I implemented it as shown in the link below, but it does not animate. https://developer.apple.com/videos/play/wwdc2023/10080/?time=1220 The following message was displayed: "No bind target found for played animation."

```swift
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            if let entity = try? await ModelEntity(named: "toy_biplane_idle") {
                let bounds = entity.model!.mesh.bounds.extents
                entity.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                entity.components.set(HoverEffectComponent())
                entity.components.set(InputTargetComponent())

                if let toy = try? await ModelEntity(named: "toy_drummer_idle") {
                    let orbit = OrbitAnimation(
                        name: "orbit",
                        duration: 30,
                        axis: [0, 1, 0],
                        startTransform: toy.transform,
                        bindTarget: .transform,
                        repeatMode: .repeat)
                    if let animation = try? AnimationResource.generate(with: orbit) {
                        toy.playAnimation(animation)
                    }
                    content.add(toy)
                }
                content.add(entity)
            }
        }
    }
}
```
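
"No bind target found" suggests the animation cannot resolve what it should drive at playback time. One thing worth trying (an assumption, not a confirmed fix) is to add the entity to the scene before starting the animation, so the transform bind target can resolve against a live entity:

```swift
// Hypothetical reordering: attach the entity first, then animate.
content.add(toy)
if let animation = try? AnimationResource.generate(with: orbit) {
    toy.playAnimation(animation)
}
```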

Replies: 0 · Boosts: 0 · Views: 622 · Aug ’23

How to convert `DragGesture().onEnded`'s velocity CGSize to the SIMD3<Float> required in `PhysicsMotionComponent(linearVelocity, angularVelocity)`?
So if I drag an entity in a RealityView, I have to disable its PhysicsBodyComponent to make sure nothing fights the drag. This makes sense. When I finish a drag, this closure gets executed:

```swift
.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { e in
            // ...
        }
        .onEnded { e in
            let velocity: CGSize = e.gestureValue.velocity
        }
)
```

If I now re-add a PhysicsBodyComponent with mode: .dynamic to the entity I just dragged, it loses all velocity and drops straight down through gravity. Instead, the solution is to apply mode: .kinematic and also add a PhysicsMotionComponent to the entity, which should retain the velocity after letting go of the object. However, I need to instantiate it with PhysicsMotionComponent(linearVelocity: SIMD3<Float>, angularVelocity: SIMD3<Float>). How can I calculate linearVelocity and angularVelocity when the e.gestureValue.velocity I get is just a CGSize? Is there another property of gestureValue I should be looking at?
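
Since the CGSize velocity is a 2D screen-space quantity, one workaround (a sketch) is to finite-difference your own 3D velocity: record the last converted position and timestamp in onChanged, then hand the resulting vector to PhysicsMotionComponent in onEnded:

```swift
import SwiftUI
import RealityKit

struct VelocityDragGesture: ViewModifier {
    // Track the last sample so we can finite-difference a 3D velocity.
    @State private var lastPosition: SIMD3<Float>?
    @State private var lastTime: Date?
    @State private var velocity: SIMD3<Float> = .zero

    func body(content: Content) -> some View {
        content.gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    let position = value.convert(value.location3D,
                                                 from: .local,
                                                 to: value.entity.parent!)
                    let now = Date()
                    if let last = lastPosition, let lastTime,
                       now.timeIntervalSince(lastTime) > 0 {
                        let dt = Float(now.timeIntervalSince(lastTime))
                        velocity = (position - last) / dt
                    }
                    value.entity.position = position
                    (lastPosition, lastTime) = (position, now)
                }
                .onEnded { value in
                    // Hand the finite-differenced velocity to the physics system.
                    value.entity.components.set(
                        PhysicsMotionComponent(linearVelocity: velocity,
                                               angularVelocity: .zero))
                    (lastPosition, lastTime) = (nil, nil)
                }
        )
    }
}
```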

Replies: 0 · Boosts: 0 · Views: 453 · Aug ’23

Entity disappears when changing position
I have some strange behavior in my app. When I set the position to .zero, the sphere shows up normally. But when I change it to any other value, no matter which or how small, the sphere isn't visible in the view.

The RealityView:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct TheSphereOfDoomRV: View {
    @StateObject var viewModel: SphereViewModel = SphereViewModel()
    let sphere = SphereEntity(radius: 0.25,
                              materials: [SimpleMaterial(color: .red, isMetallic: true)],
                              name: "TheSphere")

    var body: some View {
        RealityView { content, attachments in
            content.add(sphere)
        } update: { content, attachments in
            sphere.scale = SIMD3<Float>(x: viewModel.scale, y: viewModel.scale, z: viewModel.scale)
        } attachments: {
            VStack {
                Text("The Sphere of Doom is one of the most powerful Objects. You can interact with him in every way you can imagine ")
                    .multilineTextAlignment(.center)
                Button {
                } label: {
                    Text("Play Video!")
                }
            }
            .tag("description")
        }
        .modifier(GestureModifier())
        .environmentObject(viewModel)
    }
}
```

SphereEntity:

```swift
import Foundation
import RealityKit
import RealityKitContent

class SphereEntity: Entity {
    private let sphere: ModelEntity

    @MainActor required init() {
        sphere = ModelEntity()
        super.init()
    }

    init(radius: Float, materials: [Material], name: String) {
        sphere = ModelEntity(mesh: .generateSphere(radius: radius), materials: materials)
        sphere.generateCollisionShapes(recursive: false)
        sphere.components.set(InputTargetComponent())
        sphere.components.set(HoverEffectComponent())
        sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: radius)]))
        sphere.name = name
        super.init()
        self.addChild(sphere)
        self.position = .zero // .init(x: Float, y: Float, z: Float) and [Float, Float, Float] don't work ...
    }
}
```

Replies: 1 · Boosts: 1 · Views: 580 · Aug ’23

How to play video in a full immersive space on Vision Pro
I am trying to implement a feature to play video in a full immersive space, but I'm unable to achieve the desired results. When I run the app in a full immersive space, it shows the video in the center of the screen/window (see screenshot below). Can you please guide me on how to play video across the full immersive space, as in the image below? Regards, Yasir Khan
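
A common approach (a sketch, assuming an ImmersiveSpace is already open and a bundled file named "video.mp4" exists, both of which are placeholders) is to map a VideoMaterial onto a large sphere whose normals face inward, so the video surrounds the viewer instead of sitting in a window:

```swift
import SwiftUI
import RealityKit
import AVFoundation

struct ImmersiveVideoView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical bundled asset.
            guard let url = Bundle.main.url(forResource: "video", withExtension: "mp4") else { return }
            let player = AVPlayer(url: url)

            // A large sphere, flipped on X so its inside faces the viewer.
            let dome = ModelEntity(mesh: .generateSphere(radius: 50),
                                   materials: [VideoMaterial(avPlayer: player)])
            dome.scale *= SIMD3<Float>(-1, 1, 1)

            content.add(dome)
            player.play()
        }
    }
}
```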

Replies: 2 · Boosts: 0 · Views: 2.3k · Aug ’23

Observing RealityKit Components?
So I can observe RealityKit components by using the new @Observable or ObservableObject, but both of these require my component to be a class instead of a struct. I've read that making a component a class is a bad idea. Is this correct? Is there another way to observe the values of an entity's components?
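
One pattern that keeps components as plain structs (a sketch of the idea, not an official recommendation) is to let a RealityKit System read the component every frame and mirror its value into an @Observable model that SwiftUI watches; all names below are hypothetical:

```swift
import RealityKit
import Observation

struct ScoreComponent: Component {
    var value: Int = 0
}

@Observable
final class ScoreModel {
    static let shared = ScoreModel()
    var value: Int = 0
}

/// Copies ScoreComponent values into the observable model every frame.
struct ScoreMirrorSystem: System {
    private static let query = EntityQuery(where: .has(ScoreComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            if let score = entity.components[ScoreComponent.self],
               ScoreModel.shared.value != score.value {
                // Only publish actual changes to avoid needless invalidation.
                ScoreModel.shared.value = score.value
            }
        }
    }
}

// Somewhere at startup:
// ScoreComponent.registerComponent()
// ScoreMirrorSystem.registerSystem()
```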

Replies: 1 · Boosts: 0 · Views: 459 · Aug ’23

RealityView is not responding to tap gesture
Hello, I have created a view with a full 360° image, and I need to perform a task when the user taps anywhere on the screen (to leave the dome), but no matter what I try, it just does not work; it doesn't print anything at all.

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct StreetWalk: View {
    @Binding var threeSixtyImage: String
    @Binding var isExitFaded: Bool

    var body: some View {
        RealityView { content in
            // Create a material with a 360 image.
            guard let url = Bundle.main.url(forResource: threeSixtyImage, withExtension: "jpeg"),
                  let resource = try? await TextureResource(contentsOf: url) else {
                // If the asset isn't available, something is wrong with the app.
                fatalError("Unable to load starfield texture.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))

            // Attach the material to a large sphere.
            let streeDome = Entity()
            streeDome.name = "streetDome"
            streeDome.components.set(ModelComponent(
                mesh: .generatePlane(width: 1000, depth: 1000),
                materials: [material]
            ))

            // Ensure the texture image points inward at the viewer.
            streeDome.scale *= .init(x: -1, y: 1, z: 1)
            content.add(streeDome)
        } update: { updatedContent in
            // Create a material with a 360 image.
            guard let url = Bundle.main.url(forResource: threeSixtyImage, withExtension: "jpeg"),
                  let resource = try? TextureResource.load(contentsOf: url) else {
                // If the asset isn't available, something is wrong with the app.
                fatalError("Unable to load starfield texture.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))
            updatedContent.entities.first?.components.set(ModelComponent(
                mesh: .generateSphere(radius: 1000),
                materials: [material]
            ))
        }
        .gesture(tap)
    }

    var tap: some Gesture {
        SpatialTapGesture().targetedToAnyEntity().onChanged { value in
            // Access the tapped entity here.
            print(value.entity)
            print("maybe you can tap the dome")
            // isExitFaded.toggle()
        }
    }
}
```
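
One likely cause, hedged: gestures that use targetedToAnyEntity() only hit entities that carry both an InputTargetComponent and a CollisionComponent, and the dome above has neither. A sketch of the missing setup, using the post's own radius:

```swift
// Entity-targeted gestures need collision shapes plus an input target.
streeDome.components.set(InputTargetComponent())
streeDome.components.set(CollisionComponent(
    shapes: [.generateSphere(radius: 1000)],
    isStatic: true
))
```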

Replies: 1 · Boosts: 0 · Views: 1.3k · Aug ’23

RealityView attachments do not show up in Vision Pro simulator
I have added an attachments closure to a RealityView as outlined in the WWDC session "Enhance your spatial computing app with RealityKit", but it's not showing up, neither in the Xcode preview window nor in the Vision Pro simulator. I used the example code 1:1; however, I had to load the entity async with "try? await" to satisfy the compiler. Any help is appreciated, thx in advance!
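
For comparison, here is a minimal attachments setup as of the visionOS 1 betas (a sketch; the tag string, position, and model name are placeholders). Note that an attachment entity must itself be added to the content, otherwise it never appears:

```swift
import SwiftUI
import RealityKit

struct AttachmentDemo: View {
    var body: some View {
        RealityView { content, attachments in
            if let model = try? await ModelEntity(named: "toy_biplane_idle") {
                content.add(model)
            }
            // Fetch the SwiftUI attachment by its tag and add it to the scene.
            if let label = attachments.entity(for: "label") {
                label.position = [0, 0.3, 0]
                content.add(label)
            }
        } attachments: {
            Text("Hello, attachment!")
                .padding()
                .glassBackgroundEffect()
                .tag("label")
        }
    }
}
```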

Replies: 5 · Boosts: 0 · Views: 1.5k · Aug ’23

After Xcode beta update, RealityView doesn't work anymore
Hello, yesterday I updated Xcode 15.2 beta to 15.5. Now when I run my app and open my volumetric window, the USDA scene in my RealityView doesn't appear anymore, even though the RealityView closure is executed. When I use a 3D model in my volumetric window, the USDA appears without any problems. Here is my code:

```swift
//
//  Created by Patrick Schnitzer on 30.07.23.
//

import SwiftUI
import RealityKit
import RealityKitContent

struct MyWindow: View {
    var body: some View {
        RealityView { content in
            async let object = ModelEntity(named: "testScene", in: realityKitContentBundle)
            print("I exec")
            guard let object1 = try? await object else { return }
            object1.generateCollisionShapes(recursive: false)
            object1.components.set(InputTargetComponent())
            object1.position = .init(0, 0.5, 0)
            object1.scale *= 1
            content.add(object1)
        }
        .gesture(dragGesture)
    }

    var dragGesture: some Gesture {
        DragGesture()
            .targetedToAnyEntity()
            .onChanged { value in
                print(value.entity.name)
                value.entity.position = value.convert(value.location3D,
                                                      from: .local,
                                                      to: value.entity.parent!)
            }
    }
}

#Preview {
    MyWindow()
}
```

Replies: 2 · Boosts: 0 · Views: 614 · Aug ’23