RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit tag

200 Posts

PortalComponent Clipping Behavior
Hello, I'm experimenting with the PortalComponent and clipping behaviors. My understanding was that, given some arbitrary plane mesh, the entire contents of a world entity that has a PortalCrossingComponent would be clipped to the boundaries of that plane mesh. Instead, what I seem to be experiencing is that the mesh in the portal's target world actually displays outside the plane's boundaries.

I've attached a video that shows the boundaries of my world escaping the portal clipping/transition plane. It also shows how, when I navigate below a certain threshold in the scene, I can see what appears to be the "clipped" world (here, the dimensions of the clipping plane are obvious), but when I move above a certain level, the world contents "escape" the clipping behavior. https://scale-assembly-dropbox.s3.amazonaws.com/clipping.mov (I would have made the above a link, but it is not a permitted domain; you can follow it to see the behavior, though.)

It almost seems as if anything with a PortalCrossingComponent is allowed to appear in the PortalComponent's parent scene, rather than being clipped by the PortalComponent's boundary. For reference, the code I'm using is almost identical to the sample code in this document: https://developer.apple.com/documentation/realitykit/portalcomponent with the caveat that I'm using a plane that has .positiveY clipping and portal-crossing behaviors, and the clipping plane mesh is as seen in the video.

Do I misunderstand how PortalComponent is meant to be used, or is there a bug in how it currently behaves?
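
For context, a minimal sketch of the setup being described, assuming the visionOS 2 PortalComponent initializer that takes clippingMode and crossingMode (the entity names here are illustrative):

    import RealityKit

    func makePortalScene(worldContent: Entity) -> (world: Entity, portal: Entity) {
        // The target world, which should only be visible through the portal.
        let world = Entity()
        world.components.set(WorldComponent())
        worldContent.components.set(PortalCrossingComponent())
        world.addChild(worldContent)

        // A plane that clips the world on its positive-Y side and also
        // permits crossing through that same plane.
        let portal = Entity()
        portal.components.set(ModelComponent(
            mesh: .generatePlane(width: 1, depth: 1),
            materials: [PortalMaterial()]))
        portal.components.set(PortalComponent(
            target: world,
            clippingMode: .plane(.positiveY),
            crossingMode: .plane(.positiveY)))
        return (world, portal)
    }
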
Replies: 2 · Boosts: 0 · Views: 174 · Activity: 1w

Drag Gestures
Dear all, I am experiencing some problems with the drag gesture on visionOS. Typically, this gesture involves the user pinching an entity, or more commonly a window, and dragging it around. However, this is not always the case for entities (3D models) placed in the environment: the user can both pinch-and-drag and move the entity with their bare hands. In the latter case, the onChanged cycle doesn't always end if the user keeps their hands near the object, causing it to keep moving even when that is not what the user intends. This also occurs when the user is no longer hovering over the entity. Larger entities close to the user, more so than those in the "TransformingRealityKitEntitiesUsingGestures" demo, seem to become attached to the hands, causing the gesture to continue indefinitely, and entities often move to unintended positions.

I believe these two behaviors within the same gesture container are intrinsically different: one involves pinching and dragging, while the other involves enabling hand physics, and it should be easy to distinguish between the two. How can we correctly address this situation? Thank you for your assistance.
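
For reference, the usual pinch-and-drag pattern looks something like this sketch, which follows the documented targetedToAnyEntity API; the report above is essentially that the onEnded step does not always arrive while a hand stays near the entity:

    import SwiftUI
    import RealityKit

    struct DraggableEntityView: View {
        var body: some View {
            RealityView { content in
                let box = ModelEntity(
                    mesh: .generateBox(size: 0.2),
                    materials: [SimpleMaterial(color: .blue, isMetallic: false)])
                // Both of these are required for the entity to receive gestures.
                box.components.set(InputTargetComponent())
                box.generateCollisionShapes(recursive: true)
                content.add(box)
            }
            .gesture(
                DragGesture()
                    .targetedToAnyEntity()
                    .onChanged { value in
                        // Follow the 3D gesture location in the entity's parent space.
                        value.entity.position = value.convert(
                            value.location3D, from: .local, to: value.entity.parent!)
                    }
                    .onEnded { _ in
                        // A clean end event; per the post above, this is the
                        // part that sometimes never fires for large entities.
                    }
            )
        }
    }
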
Replies: 3 · Boosts: 0 · Views: 165 · Activity: 1w

EnvironmentResource.generate(fromEquirectangular:) does not compile with Xcode 16.0 beta 2
Hi! It seems that Xcode 16 beta 2 thinks that EnvironmentResource.generate(fromEquirectangular:) is unavailable even when the minimum deployment target remains set to visionOS 1.0 (it is deprecated, but Xcode reports a build error). The only way I was able to keep this in place for visionOS 1.x while compiling with Xcode 16 was the following:

    #if compiler(>=6.0)
    var environment: EnvironmentResource?
    if #available(visionOS 2.0, *) {
        environment = try? await EnvironmentResource(equirectangular: skyBoxWithSun())
    } else {
        fatalError("EnvironmentResource.generate(fromEquirectangular:) does not compile with Xcode 16.0 beta 2.")
    }
    #else
    let environment = try? await EnvironmentResource.generate(fromEquirectangular: skyBoxWithSun())
    #endif

This will build with both Xcode 15.4 and 16 beta 2, but it will obviously crash when built with Xcode 16 and run on visionOS 1.x. Do I have any better options? I would like to adopt some visionOS 2.0 features (e.g., replacing my custom skybox with the new dynamic lighting) but prefer to maintain backward compatibility for now.
Replies: 1 · Boosts: 0 · Views: 217 · Activity: 1w

Trying to traverse through a usdz file to copy materials from another usdz file to the traversed mesh
Hi all, I am using RealityKit along with ARKit and SwiftUI to develop an app where I am augmenting a USDZ model with complex geometry, like that of a car. I have some other USDZ files with simple plane geometry that have the material properties embedded within them, which I am also loading as model entities. I want to traverse my car USDZ file so that I can pick the material from the simple USDZ file and apply it to the car as car paint. To do this, I know the name of the mesh holding the car paint as well as the name of the applied material. I have tried to traverse the USDZ files using both RealityKit and SceneKit, but I have not been able to reach the lowest-level mesh and copy the material properties to it. With RealityKit, I have tried to get the instance data via the model entity as follows: sourceModel?.model?.mesh.contents.instances. But this returns only the instance ID, model name, and transform. Any help will be highly appreciated. Thank you.
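
For what it's worth, a minimal sketch of the traversal being attempted, assuming hypothetical entity names ("CarPaint" for the car's paint mesh, "PaintPlane" for the donor plane):

    import RealityKit

    // Recursively walk the hierarchy for a ModelEntity with a given name,
    // since the named node and the mesh-bearing node can differ in a USDZ.
    func firstModelEntity(named name: String, in root: Entity) -> ModelEntity? {
        if let model = root as? ModelEntity, model.name == name { return model }
        for child in root.children {
            if let found = firstModelEntity(named: name, in: child) { return found }
        }
        return nil
    }

    func applyPaint(from donor: Entity, to car: Entity) {
        guard let donorMesh = firstModelEntity(named: "PaintPlane", in: donor),
              let paintMesh = firstModelEntity(named: "CarPaint", in: car),
              let paintMaterial = donorMesh.model?.materials.first
        else { return }
        // Replace the materials on the car-paint mesh with the donor's material.
        paintMesh.model?.materials = [paintMaterial]
    }
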
Replies: 1 · Boosts: 0 · Views: 199 · Activity: 1w

RealityKit, DrawableQueue, and synchronizing scene updates
I have a visionOS app that uses DrawableQueue and CADisplayLink to update an Entity, a TextureResource tied to the drawable queue, and a Material that uses that TextureResource. The TextureResource gets updated when a video frame is ready; Material properties can get updated from the video or from other sources. The current process: when each video frame is ready, we get the next drawable, render to it, present it, and make an Entity update (e.g., a transform). However, I'm experiencing jitter in the rendered content: the updates to the entity and the drawable being presented seem to be milliseconds off from each other. Should I be using Drawable.presentOnSceneUpdate() to ensure all updates happen in the same update cycle? And if so, do you have any additional details on how to correctly use this function (the docs are unclear)?
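
For context, a sketch of the per-frame flow described above, using standard DrawableQueue calls; the Metal encoding and the frame source are placeholders, and the dimensions are arbitrary:

    import RealityKit
    import Metal

    // One-time setup: back a TextureResource with a drawable queue.
    func makeQueue(for texture: TextureResource) throws -> TextureResource.DrawableQueue {
        let descriptor = TextureResource.DrawableQueue.Descriptor(
            pixelFormat: .bgra8Unorm,
            width: 1920, height: 1080,
            usage: [.renderTarget, .shaderRead],
            mipmapsMode: .none)
        let queue = try TextureResource.DrawableQueue(descriptor)
        texture.replace(withDrawables: queue)
        return queue
    }

    // Per video frame: render, present, and update the entity in one place.
    func frameReady(queue: TextureResource.DrawableQueue, entity: Entity) {
        guard let drawable = try? queue.nextDrawable() else { return }
        // ... encode a Metal render pass into drawable.texture here ...
        drawable.present()
        // This entity update is driven by the video clock rather than the
        // scene update cycle, which is one candidate source of the jitter.
        entity.transform.translation.x += 0.001
    }
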
Replies: 0 · Boosts: 1 · Views: 155 · Activity: 1w

PortalComponent invalid
Here is my code:

    let portal = Entity()
    portal.components[ModelComponent.self] = .init(
        mesh: .generatePlane(
            width: Float(size.width),
            height: Float(size.height),
            cornerRadius: 0.02),
        materials: [PortalMaterial()])
    portal.components[PortalComponent.self] = .init(target: world)
    portal.components[PortalComponent.self]?.clippingPlane = .init(
        position: SIMD3(x: 0, y: 0, z: 0),
        normal: SIMD3(x: 0, y: 0, z: 0))
    portal.components.set(HoverEffectComponent())

I added a RealityView to multiple HStacks and implemented the portal effect. I found that on some machines the portal effect causes confusion in the rendering order, as shown in the figure.
Replies: 2 · Boosts: 0 · Views: 187 · Activity: 1w

visionOS object tracking
We have a visionOS project that started with visionOS 1.1, and now, with visionOS 2, we want to use object tracking. We add the Anchor component to our scene in Reality Composer Pro and select a reference object, but when we try to run the app on a physical device we get the following error: [xrsimulator] Exception thrown: This asset (ARReferenceObject_0.compiledarreferenceobject) is not supported by this version of RealityKit.
Replies: 1 · Boosts: 0 · Views: 119 · Activity: 1w

Editor (Reality Composer Pro): IBL component settings issue
In the editor (Reality Composer Pro), I can edit and attach the IBL component in real time, and the preview in the editor looks correct. But when I load the exported scene in Xcode, some models render black. I have tried several model formats (usda, usdc, usdz), and the final result is the same. However, I can create the IBL effect through code, and then it renders correctly. I suspect that the IBL component exported from Reality Composer Pro and parsed by RealityKit has a limit on the number of materials it can light.
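
For reference, the code-based IBL path mentioned above looks something like this sketch, assuming an equirectangular environment image named "studio" in the app bundle (the name is hypothetical):

    import RealityKit

    func applyImageBasedLight(to model: Entity) async {
        // "studio" is a placeholder resource name.
        guard let environment = try? await EnvironmentResource(named: "studio") else { return }
        // A dedicated entity carries the light source...
        let light = Entity()
        light.components.set(ImageBasedLightComponent(source: .single(environment)))
        model.addChild(light)
        // ...and the model opts in to receiving it.
        model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: light))
    }
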
Replies: 0 · Boosts: 0 · Views: 141 · Activity: 1w

App Environment SkyDome's UV values
I started a visionOS app using Apple's new "App Environment" template, and when I looked at the UV mapping for the half SkyDome, the bottom edge had a UV 'Y' value of 0.318. Naively, I had assumed the bottom edge of a half dome would have a UV 'Y' value of 0.5 (halfway up the texture map). Is this the standard UV mapping for half a SkyDome? It has caused some issues when I've applied HDRIs.
Replies: 0 · Boosts: 0 · Views: 185 · Activity: 2w

How to optimise RealityKit performance with many similar objects
I have code such as the following. Performance on the Vision Pro gets quite bad once I hit a few thousand of these models. It feels like I should be able to optimize this somehow, perhaps using instancing. Is that possible with RealityKit in visionOS 2?

    let material = UnlitMaterial(color: .white)
    let sphereModel = ModelEntity(
        mesh: .generateSphere(radius: 0.001),
        materials: [material])
    for index in 0..<5000 {
        let point = generatedPoints[index]
        let model = sphereModel.clone(recursive: false)
        model.position = [point.x, point.y, point.z]
        parent.addChild(model)
    }
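
One possible direction, sketched on the assumption that MeshResource.Contents instancing works as documented (generatedPoints and parent are the names from the snippet above): build a single mesh whose contents hold one sphere model and many instances, so the scene contains one entity instead of thousands of clones.

    import RealityKit

    func makeInstancedSpheres(points: [SIMD3<Float>]) throws -> ModelEntity {
        let sphere = MeshResource.generateSphere(radius: 0.001)
        var contents = sphere.contents
        let modelID = contents.models.first!.id

        // One instance per point, all referencing the same sphere model.
        contents.instances = MeshInstanceCollection(points.enumerated().map { index, point in
            MeshResource.Instance(
                id: "sphere-\(index)",
                model: modelID,
                at: Transform(translation: point).matrix)
        })

        let mesh = try MeshResource.generate(from: contents)
        return ModelEntity(mesh: mesh, materials: [UnlitMaterial(color: .white)])
    }

    // Usage: parent.addChild(try makeInstancedSpheres(points: generatedPoints))
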
Replies: 0 · Boosts: 0 · Views: 208 · Activity: 2w

Add a modifier to a single model in the Reality Composer Pro scene
We can add many models to a Reality Composer Pro scene, but when I use RealityView to display it in SwiftUI and add modifiers, the modifiers take effect on the entire scene, and I don't want that. I would like the modifier to apply to a single model in the Reality Composer Pro scene. How can I add modifiers to a single model in a Reality Composer Pro scene?
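
A sketch of one per-model approach, assuming a hypothetical model named "Robot" inside the scene and the usual RealityKitContent bundle: only entities that carry input components receive the gesture, so the modifier effectively applies to that single model.

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct SingleModelView: View {
        var body: some View {
            RealityView { content in
                // "Scene" and "Robot" are placeholder names.
                guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
                content.add(scene)
                // Opt in just one model; the rest of the scene ignores input.
                if let robot = scene.findEntity(named: "Robot") {
                    robot.components.set(InputTargetComponent())
                    robot.generateCollisionShapes(recursive: true)
                }
            }
            .gesture(
                TapGesture()
                    .targetedToAnyEntity()
                    .onEnded { value in
                        // Fires only for the entity that opted in above.
                        value.entity.scale *= 1.2
                    }
            )
        }
    }
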
Replies: 2 · Boosts: 0 · Views: 291 · Activity: 2w