RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

RealityKit Documentation

Post · Replies · Boosts · Views · Activity

[Newbie] Why does my ShaderGraphMaterial appear distorted?
Disclaimer: I am new to all things 3D, so there could be a variety of things wrong with what I'm doing that are not unique to RealityKit. Any domain info would be appreciated.

I'm following what I think are the recommended steps to import a shader-graph material from Reality Composer Pro and apply it to another ModelEntity. I do the following:

```swift
guard let entity = try? Entity.load(named: "Materials", in: RealityKitContent.realityKitContentBundle) else { return model }
guard let materialEntity = entity.findEntity(named: "materialModel") as? ModelEntity else { return model }
```

I then configure a property on it like so:

```swift
guard var material = materialEntity.model?.materials[0] as? ShaderGraphMaterial else { return model }
try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))
```

I then apply it. This is what my texture looks like in Reality Composer Pro:

I notice that my rendered object has distortions in the actual RealityView. Note the diagonal lines that appear "stretched". What could be doing this? I thought node shaders were supposed to be more resilient to distortions like this? I'm not sure if I've got a bug or if I'm using it wrong. FWIW, this is a shader based on Apple's felt material shader. My graph looks like this:

Thanks
2
0
691
Aug ’23
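For the question above, a minimal sketch of loading the shader-graph material directly by name instead of digging it out of a placeholder ModelEntity. The "Materials.usda" file name, the "/Root/FeltMaterial" prim path, and the color value are assumptions; "BaseColor" and the content bundle come from the post:

```swift
import CoreGraphics
import RealityKit
import RealityKitContent

// Hypothetical helper; the asset names below are placeholders, not from the post.
func applyFeltMaterial(to target: ModelEntity) async throws {
    var material = try await ShaderGraphMaterial(
        named: "/Root/FeltMaterial",      // prim path of the material inside the package (assumed)
        from: "Materials.usda",           // file inside the Reality Composer Pro package (assumed)
        in: realityKitContentBundle
    )
    // "BaseColor" is the parameter name used in the original post.
    try material.setParameter(name: "BaseColor",
                              value: .color(CGColor(red: 1, green: 0, blue: 0, alpha: 1)))
    target.model?.materials = [material]
}
```

If the felt graph samples texture coordinates, stretching on a different mesh usually points at that mesh's UV layout rather than at the graph or the loading code.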
PhotogrammetrySession doesn't accept HEIC files on the latest Sonoma updates
I'm currently testing photogrammetry by capturing photos with the sample project https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture and then using them on my laptop with https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app

It worked perfectly until the latest Sonoma beta updates. It started with warning logs in the console saying my samples lacked a depthMap, and now it just refuses to create samples from my HEIC files. I tried creating HEIC files with and without depth data to check whether badly formatted depth data was the problem, but it seems the HEIC format itself is simply not accepted anymore. I've also imported HEIC files captured with the standard iOS camera app and transferred via the Photos app, and they don't work either, so it's not an issue of poorly formatted files.

If I convert the files to PNG, it works again, but of course, as announced during WWDC 2023, I expect the photogrammetry pipeline to leverage the LiDAR data! I check every beta update waiting for an improvement. The photogrammetry logs are never quite the same, so I guess the Apple teams are working on it. The Object Capture model in Reality Composer Pro also doesn't accept HEIC files anymore.

If there are any workarounds, please advise!
0
0
586
Jul ’23
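Until HEIC input works again, the PNG workaround from the post can be scripted. Below is a minimal sketch of the command-line reconstruction step for macOS, assuming the images have already been converted to PNG; the paths are placeholders:

```swift
import Foundation
import RealityKit

@main
struct ReconstructModel {
    static func main() async throws {
        // Placeholder paths; point these at your own image folder and output file.
        let inputFolder = URL(fileURLWithPath: "/tmp/CaptureImagesPNG", isDirectory: true)
        let outputFile = URL(fileURLWithPath: "/tmp/model.usdz")

        let session = try PhotogrammetrySession(input: inputFolder,
                                                configuration: PhotogrammetrySession.Configuration())
        try session.process(requests: [.modelFile(url: outputFile, detail: .medium)])

        // Stream progress, completion, and errors from the session.
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(fraction)")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            case .requestError(_, let error):
                print("Request failed: \(error)")
            case .processingComplete:
                return
            default:
                break
            }
        }
    }
}
```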
How to access RealityRenderer
I have a RealityView in my visionOS app. I can't figure out how to access RealityRenderer. According to the documentation (https://developer.apple.com/documentation/realitykit/realityrenderer) it is available on visionOS, but I can't figure out how to access it for my RealityView. It is probably something obvious, but after reading through the documentation for RealityView, Entities, and Components, I can't find it.
0
2
480
Jul ’23
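A hedged note rather than a definitive answer: as far as I can tell, RealityView does not expose the renderer it uses internally. RealityRenderer is a standalone renderer you create yourself, typically for offscreen rendering, so it may not fit this use case at all. A minimal sketch, assuming its entity collection supports append:

```swift
import RealityKit

// Create a standalone renderer and give it something to draw. Whether this is
// the right tool depends on what "access RealityRenderer" is meant to achieve;
// treat this as an assumption, not a documented path from RealityView.
let renderer = try RealityRenderer()
renderer.entities.append(ModelEntity(mesh: .generateSphere(radius: 0.1),
                                     materials: [SimpleMaterial(color: .white, isMetallic: false)]))
```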
Y and Z axes of RealityView coordinate space appear to be flipped in visionOS Simulator
Is this a known bug or is there a fundamental misunderstanding on my part? In the screenshot I've attached below I would expect the blue box to be perpendicular to the floor. It is the yAxisEntity in my code, which I instantiate with a mesh of height 3. Instead it runs parallel to the floor, where I'd expect the z axis to be.

Here is my code:

```swift
struct ImmersiveContentDebugView: View {
    @Environment(ViewModel.self) private var model

    @State var wallAnchor: AnchorEntity = {
        return AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: SIMD2<Float>(0.1, 0.1)))
    }()

    @State var originEntity: Entity = {
        let originMesh = MeshResource.generateSphere(radius: 0.2)
        return ModelEntity(mesh: originMesh, materials: [SimpleMaterial(color: .orange, isMetallic: false)])
    }()

    @State var xAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 3, height: 0.1, depth: 0.1)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .red, isMetallic: false)])
    }()

    @State var yAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 0.1, height: 3, depth: 0.1)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .blue, isMetallic: false)])
    }()

    @State var zAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 0.1, height: 0.1, depth: 3)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .green, isMetallic: false)])
    }()

    var body: some View {
        RealityView { content in
            content.add(wallAnchor)
            wallAnchor.addChild(originEntity)
            wallAnchor.addChild(xAxisEntity)
            wallAnchor.addChild(yAxisEntity)
            wallAnchor.addChild(zAxisEntity)
        }
    }
}
```

And here is what the simulator renders:
2
1
913
Jul ’23
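This looks consistent with plane anchors putting their local Y axis along the plane's normal, so on a vertical wall anchor "up" is one of the other axes. A hedged sketch of one workaround: parent the debug geometry to an intermediate entity that rotates the anchor's frame back toward a floor-aligned orientation (the rotation axis and sign are assumptions and may need flipping for a given wall):

```swift
import RealityKit

let wallAnchor = AnchorEntity(.plane(.vertical,
                                     classification: .wall,
                                     minimumBounds: SIMD2<Float>(0.1, 0.1)))

// Intermediate entity that re-orients the anchor's frame: rotating 90° about X
// maps the children's +Y onto the anchor's +Z. Flip the sign if the axis ends
// up pointing the wrong way in your scene.
let alignedRoot = Entity()
alignedRoot.orientation = simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(1, 0, 0))
wallAnchor.addChild(alignedRoot)

let yAxisEntity = ModelEntity(mesh: .generateBox(width: 0.1, height: 3, depth: 0.1),
                              materials: [SimpleMaterial(color: .blue, isMetallic: false)])
alignedRoot.addChild(yAxisEntity)
```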
Image Input for ShaderGraphMaterial
In Reality Composer Pro, I've set up a Custom Material that receives an Image File as an input. When I manually select an image and upload it to Reality Composer Pro as the input value, I'm able to easily drive the surface of my object/scene with this image.

However, I am unable to drive the value of this "cover" parameter via shaderGraphMaterial.setParameter(name:value:) in Swift, since there is no way to supply an Image as a value of type MaterialParameters.Value. When I print out shaderGraphMaterial.parameterNames I see both "color" and "cover", so I know this parameter is exposed.

Is this a feature that will be supported soon, or is there a workaround? I assume that if something can be created as an input to a Custom Material (in this case an Image File), there should be an equivalent way to drive it via Swift. Thanks!
1
0
872
Jul ’23
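MaterialParameters.Value does have a .textureResource case, so one way to drive an image input from Swift is to load a TextureResource first. A minimal sketch; "cover" is the parameter name from the post, while "CoverImage" is a placeholder asset name:

```swift
import RealityKit

// Load an image from the app bundle as a texture and feed it to the shader
// graph's image input. "CoverImage" is assumed, not from the post.
func setCover(on material: inout ShaderGraphMaterial) throws {
    let texture = try TextureResource.load(named: "CoverImage")
    try material.setParameter(name: "cover", value: .textureResource(texture))
}
```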
visionOS Simulator DragGestures
Hello,

Right now I am learning some RealityKit for visionOS. I do not receive any errors in my code, so it seems okay, but I can't drag my object around. Does the simulator support gestures in general?

```swift
//
//  ImmersiveView.swift
//  NewDimensionn
//
//  Created by Patrick Schnitzer on 18.07.23.
//

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var earth: Entity = Entity()
    var moon: Entity = Entity()

    var body: some View {
        RealityView { content in
            async let earth = ModelEntity(named: "EarthScene", in: realityKitContentBundle)
            async let moon = ModelEntity(named: "MoonScene", in: realityKitContentBundle)
            if let earth = try? await earth, let moon = try? await moon {
                content.add(earth)
                content.add(moon)
            }
        }
        .gesture(DragGesture()
            .targetedToEntity(earth)
            .onChanged { value in
                earth.position = value.convert(value.location3D, from: .local, to: earth.parent!)
            }
        )
    }
}

#Preview {
    ImmersiveView()
        .previewLayout(.sizeThatFits)
}
```
4
0
2k
Jul ’23
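The simulator does deliver drag gestures, but two details in the code above matter, as far as I can tell: the entity receiving the gesture needs an InputTargetComponent plus collision shapes, and the gesture has to target the entity that was actually added to the scene (the `earth` property is a separate, empty Entity shadowed by the local `async let`). A hedged sketch:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

// Sketch of a draggable entity; "EarthScene" comes from the post.
struct DraggableEarthView: View {
    var body: some View {
        RealityView { content in
            if let earth = try? await ModelEntity(named: "EarthScene", in: realityKitContentBundle) {
                // Without these two, the entity never receives spatial input.
                earth.components.set(InputTargetComponent())
                earth.generateCollisionShapes(recursive: true)
                content.add(earth)
            }
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Move whichever entity the drag actually hit.
                    let entity = value.entity
                    entity.position = value.convert(value.location3D,
                                                    from: .local,
                                                    to: entity.parent!)
                }
        )
    }
}
```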
How to position windows in the environment in visionOS?
The code below is my entry point:

```swift
import SwiftUI

@main
struct KaApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        WindowGroup(id: "text-window") {
            ZStack {
                TextViewWindow()
                    .background(.ultraThickMaterial)
                    .edgesIgnoringSafeArea(.all)
            }
        }
        .windowStyle(.automatic)
        .defaultSize(width: 0.1, height: 0.1, depth: 1, in: .meters)

        WindowGroup(id: "model-kala") {
            ModelView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.8, height: 0.8, depth: 0.8, in: .meters)

        WindowGroup(id: "model-kala-2") {
            AllModelsView()
                .edgesIgnoringSafeArea(.all)
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 1, height: 1, depth: 1, in: .meters)
    }
}
```

I want to place the TextViewWindow exactly near a model that I have placed in the environment, but I'm unable to reposition the window to exactly where I want:

```swift
if let Armor_Cyber = try? await ModelEntity(named: "Armor_Cyber"),
   let animation = Armor_Cyber.availableAnimations.first {
    Armor_Cyber.playAnimation(animation.repeat(duration: .infinity))
    Armor_Cyber.scale = [0.008, 0.008, 0.008]
    Armor_Cyber.position = [-4, -1, 0.15]
    let rotation = simd_quatf(angle: -.pi / 6, axis: SIMD3<Float>(0, 1, 0))
        * simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))
        * simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 0, 1))
    Armor_Cyber.transform.rotation = rotation
    content.add(Armor_Cyber)
}
```

How can I place the WindowGroup exactly at the top right of the above model?
2
0
989
Jul ’23
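As far as I know, a WindowGroup's placement is chosen by the system and the user, so it can't be pinned to an entity programmatically. One alternative that gives a similar result inside the immersive scene is a RealityView attachment positioned next to the model. A sketch under that assumption; "Armor_Cyber" comes from the post, while the offset values are made up:

```swift
import SwiftUI
import RealityKit

struct ModelWithLabelView: View {
    var body: some View {
        RealityView { content, attachments in
            if let armor = try? await ModelEntity(named: "Armor_Cyber") {
                armor.scale = [0.008, 0.008, 0.008]
                armor.position = [-4, -1, 0.15]
                content.add(armor)

                // Place the SwiftUI label entity near the model's upper right
                // (the offset is an assumption; tune it to taste).
                if let label = attachments.entity(for: "info") {
                    label.position = armor.position + SIMD3<Float>(0.3, 0.4, 0)
                    content.add(label)
                }
            }
        } attachments: {
            Attachment(id: "info") {
                Text("Armor_Cyber")
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}
```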
Entity rotation animation doesn't go beyond 180 degrees?
I use a simple transform and move method to animate my entity, something like this:

```swift
let transform = Transform(scale: .one,
                          rotation: simd_quatf(angle: .pi, axis: SIMD3(x: 0, y: 0, z: 1)),
                          translation: .zero)
myEntity.move(to: transform, relativeTo: myEntity, duration: 1)
```

All is well, but when I try to rotate any more than 180 degrees, the rotation stays still. How do I animate something that needs to turn 360 degrees? Thanks
2
0
706
Jul ’23
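move(to:) interpolates the rotation along the shortest arc, so anything past 180° collapses to the shorter direction. A common workaround is to split the turn into steps below 180° and chain them. A sketch, assuming a hypothetical helper that schedules three 120° steps:

```swift
import Foundation
import RealityKit

// Spin an entity a full 360° about its local Z axis by chaining three 120°
// moves; each step stays under the 180° shortest-arc limit.
func spinFullTurn(_ entity: Entity, duration: TimeInterval = 1.5) {
    let stepAngle: Float = 2 * .pi / 3          // 120°
    let stepDuration = duration / 3
    let step = Transform(rotation: simd_quatf(angle: stepAngle, axis: SIMD3<Float>(0, 0, 1)))

    for i in 0..<3 {
        // Schedule each step to start when the previous one finishes.
        DispatchQueue.main.asyncAfter(deadline: .now() + stepDuration * Double(i)) {
            entity.move(to: step, relativeTo: entity,
                        duration: stepDuration, timingFunction: .linear)
        }
    }
}
```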
RealityView update closure not executed upon state change
I have the following piece of code:

```swift
@State var root = Entity()

var body: some View {
    RealityView { content, _ in
        do {
            let _root = try await Entity(named: "Immersive", in: realityKitContentBundle)
            content.add(_root)
            // root = _root  <-- this doesn't trigger the update closure
            Task {
                root = _root  // <-- this does
            }
        } catch {
            print("Error in RealityView's make: \(error)")
        }
    } update: { content, attachments in
        // NOTE: update not called when root is modified
        // unless the root modification is wrapped in a Task
        print(root)  // the intent is to use root for positioning attachments
    } attachments: {
        Text("Preview")
            .font(.system(size: 100))
            .background(.pink)
            .tag("initial_text")
    }
} // end body
```

If I change the root state in the make closure by simply assigning it another entity, the update closure will not be called - print(root) will print two empty entities. If instead I wrap the assignment in a Task, the update closure is called and I see the correct root entity being printed. Any idea why this is the case?

In general, I'm unsure of the order in which the make, update and attachments closures are executed. Is there more guidance on what we should expect the order to be, what we should typically do in each closure, etc.?
1
0
860
Jul ’23
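I don't have a documented explanation for why the synchronous write is dropped, but the deferred-write pattern from the post can be combined with positioning the attachment in update. A sketch under that assumption; note it uses the Attachment(id:) syntax rather than the tag-based one shown above, and the offset is made up:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveWithAttachment: View {
    @State private var root: Entity?

    var body: some View {
        RealityView { content, _ in
            if let loaded = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(loaded)
                Task { root = loaded }          // deferred state write, as in the post
            }
        } update: { content, attachments in
            // Runs again once `root` changes; position the attachment relative to it.
            if let root, let label = attachments.entity(for: "initial_text") {
                label.position = root.position + SIMD3<Float>(0, 0.3, 0)
                content.add(label)
            }
        } attachments: {
            Attachment(id: "initial_text") {
                Text("Preview").font(.system(size: 100)).background(.pink)
            }
        }
    }
}
```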
How to get grounding shadow to work in visionOS?
Hi, I'm trying to replicate the ground shadow in this video. However, I couldn't get it to work in the simulator.

My scene, rendered as an immersive space, looks like the following: the rocket object has the grounding shadow component with "cast shadow" set to true, but I couldn't see any shadow on the plane beneath it.

Things I tried:

- used code to add the grounding shadow component; didn't work
- re-used the IBL from the Hello World project to get some lighting for the objects; although the IBL worked, I still couldn't see the shadow
- tried adding a DirectionalLight, but got an error saying that directional lights are not supported in visionOS (despite the docs saying the opposite)

A related question on lighting: I can see that the simulator definitely applies some scene lighting to objects, but it doesn't seem to do it perfectly. For example, in the above screenshot I placed the objects under a transparent ceiling which is supposed to get a lot of light, yet everything is still quite dark.
6
1
2.1k
Jul ’23
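One thing worth checking, offered as a guess rather than a confirmed fix: components don't propagate to children, so the grounding shadow may need to sit on the entities that actually own the ModelComponents rather than on the loaded root. A sketch that applies it recursively:

```swift
import RealityKit

// Walk the hierarchy and enable the grounding shadow on every entity,
// including the children that carry the actual model components.
func enableGroundingShadow(on entity: Entity) {
    entity.components.set(GroundingShadowComponent(castsShadow: true))
    for child in entity.children {
        enableGroundingShadow(on: child)
    }
}
```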
RealityKit cannot load Entity
Very often when I try to load an Entity in RealityView, the system gives me an error: "The operation couldn't be completed. (Swift.CancellationError error 1.)"

```swift
RealityView { content in
    do {
        let scene = try await Entity(named: "Test_Scene", in: realityKitContentBundle)
        content.add(scene)
    } catch {
        debugPrint("error loading scene", error.localizedDescription)
    }
}
```

The scene I created contains a basic square. This is the main file:

```swift
struct first_app_by_appleApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .windowStyle(.volumetric)
    }
}
```

Inside the Info.plist file I set UIApplicationPreferredDefaultSceneSessionRole to UIWindowSceneSessionRoleVolumetricApplication.

I think this could be a bug in the system, but I'm not surprised since it's the first beta. If so, do you know any workarounds?
2
0
846
Jul ’23
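The CancellationError suggests the load (or the task the make closure runs in) was cancelled while the request was in flight, for example when the window is recreated. A hedged workaround sketch: treat cancellation separately from real failures, and retry only while the surrounding task is still alive:

```swift
import RealityKit
import RealityKitContent

// Hypothetical helper: retry the load a few times, but give up immediately if
// the enclosing task has itself been cancelled (retrying there would just
// throw CancellationError again).
func loadScene(named name: String, attempts: Int = 3) async -> Entity? {
    for attempt in 1...attempts {
        if Task.isCancelled { return nil }
        do {
            return try await Entity(named: name, in: realityKitContentBundle)
        } catch is CancellationError {
            print("Load of \(name) cancelled (attempt \(attempt))")
        } catch {
            print("Failed to load \(name): \(error)")
            return nil
        }
    }
    return nil
}
```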