RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit subtopic

PhotogrammetrySession on non-Pro iPhone
Hello, I'm creating an app that uses the PhotogrammetrySession class to build 3D objects from photographs (https://developer.apple.com/documentation/realitykit/creating-3d-objects-from-photographs). I'm wondering why this class works only on Pro iPhones (12 Pro, 13 Pro, 14 Pro, 15 Pro, and 16 Pro) and on no non-Pro iPhone. My app does not use LiDAR, so that's not the problem. I thought it could be power related, but the A18 SoC in the iPhone 16 is more powerful than the A14 Bionic in the iPhone 12 Pro (I could also mention the iPhone 13 Pro and the iPhone 14, which both have the A15 Bionic, yet only the first is compatible). Did I miss something that could explain these restrictions? Is there any plan to make this class usable on every iPhone powerful enough to run it? Thanks in advance for answering me.
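
Whatever the underlying reason for the restriction, it is safer to gate the feature on the runtime capability check than on a hard-coded device list. A minimal sketch:

import RealityKit

// PhotogrammetrySession.isSupported reflects the hardware/OS restriction,
// whatever its underlying cause, so the device list never needs updating.
if PhotogrammetrySession.isSupported {
    // Offer the 3D-reconstruction feature.
} else {
    // Hide the feature, or upload the photos to a server-side pipeline instead.
}
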
0 replies · 1 boost · 573 views · Nov ’24

RealityView Attachments on iOS 18 & Visually Appealing AR Labeling Alternatives
I want to use SwiftUI views as RealityKit entities to display AR labels within a RealityKit scene, and the labels could be more complicated than just text in a window: they might include images, dynamic text, animations, WebViews, etc. visionOS enables this through RealityView attachments, and RealityView is supported on iOS 18, so I tried running the visionOS RealityView attachments code samples on iOS 18. However, the code below gives errors on iOS 18:

import SwiftUI
import RealityKit

struct PassportRealityView: View {
    let qrCodeCenter: SIMD3<Float>
    let assetID: String

    var body: some View {
        RealityView { content, attachments in
            // Setup your AR content, such as markers or 3D models
            if let qrAnchor = try? await Entity(named: "QRAnchor") {
                qrAnchor.position = qrCodeCenter
                content.add(qrAnchor)
            }
        } attachments: {
            Attachment(id: "passportTextAttachment") {
                Text(assetID)
                    .font(.title3)
                    .foregroundColor(.white)
                    .background(Color.black.opacity(0.7))
                    .padding(5)
                    .cornerRadius(5)
            }
        }
        .frame(width: 300, height: 400)
    }
}

When I remove the "attachments" keyword and its block, the errors mostly go away. That does not help me, as I want to attach SwiftUI views to anchor entities in RealityKit. As I understand it, RealityView attachments are not supported on iOS 18. I wonder if there is any way of showing SwiftUI views as entities on iOS 18 at this point, or am I forced to build the UI from text meshes and 3D planes? I checked out the RealityUI plugin, but it's too simple for my use case of building complex AR labels. Any advice would be appreciated. Thanks!
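
One workaround worth noting: rasterize the SwiftUI view to an image and texture a plane with it. A minimal sketch, assuming a static label (anything animated or interactive would need periodic re-rendering); makeLabelEntity and its layout are hypothetical:

import SwiftUI
import RealityKit

@MainActor
func makeLabelEntity(text: String) -> ModelEntity? {
    // Rasterize the SwiftUI view; ImageRenderer requires iOS 16+.
    let renderer = ImageRenderer(content:
        Text(text)
            .font(.title3)
            .foregroundColor(.white)
            .padding(5)
            .background(Color.black.opacity(0.7))
            .cornerRadius(5)
    )
    renderer.scale = 3 // supersample so the texture stays legible up close
    guard let cgImage = renderer.cgImage else { return nil }

    // Upload the rasterized view as a texture on an unlit plane.
    guard let texture = try? TextureResource.generate(from: cgImage,
                                                      options: .init(semantic: .color)) else { return nil }
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    let aspect = Float(cgImage.width) / Float(cgImage.height)
    let label = ModelEntity(mesh: .generatePlane(width: 0.1 * aspect, height: 0.1),
                            materials: [material])
    label.components.set(BillboardComponent()) // iOS 18+: keep the label facing the camera
    return label
}
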
0 replies · 0 boosts · 623 views · Nov ’24

How to roll a ball with physics in RealityKit
I decided to use a club to kick a ball and let it roll on the turf in RealityKit, but so far I can only make it slide, not roll. I added collision to the turf (static), the club (kinematic), and the ball (dynamic), and set some parameters: radius and mass. From these I calculate linear damping and inertia; I also use the time between frames and the club position to calculate speed. The code looks like this:

let radius: Float = 0.025
let mass: Float = 0.04593 // mass in kg
var inertia = 2/5 * mass * pow(radius, 2)

let currentPosition = entity.position(relativeTo: nil)
let distance = distance(currentPosition, rgfc.lastPosition)
let deltaTime = Float(context.deltaTime)
let speed = distance / deltaTime
let C_d: Float = 0.47 // drag coefficient
let linearDamping = 0.5 * 1.2 * pow(speed, 2) * .pi * pow(radius, 2) * C_d // linear damping (1.2 is the air density)

entity.components[PhysicsBodyComponent.self]?.massProperties.inertia = SIMD3<Float>(inertia, inertia, inertia)
entity.components[PhysicsBodyComponent.self]?.linearDamping = linearDamping

// force
let acceleration = speed / deltaTime
let forceDirection = normalize(currentPosition - rgfc.lastPosition)
let forceMultiplier: Float = 1.0
let appliedForce = forceDirection * mass * acceleration * forceMultiplier
entityCollidedWith.addForce(appliedForce, at: rgfc.hitPosition, relativeTo: nil)

I also tried applyImpulse instead of addForce, like:

let linearImpulse = forceDirection * speed * forceMultiplier * mass

No matter how I adjust the friction (static, dynamic) and restitution, whether using addForce or applyImpulse, the ball only slides. How can I solve this problem?
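
For a ball to roll rather than slide, its angular velocity has to match its linear velocity: for rolling without slipping on a horizontal plane, the contact point must be stationary, which pins omega = (up × v) / r. Friction alone can take a while to spin the ball up. A minimal sketch that sets both velocities together via PhysicsMotionComponent:

import RealityKit
import simd

// Give the ball a linear velocity plus the matching angular velocity
// for rolling without slipping: omega = (up × v) / r.
func kick(ball: Entity, direction: SIMD3<Float>, speed: Float, radius: Float) {
    let v = normalize(direction) * speed
    let omega = cross(SIMD3<Float>(0, 1, 0), v) / radius
    ball.components.set(PhysicsMotionComponent(linearVelocity: v,
                                               angularVelocity: omega))
}

If the ball still stops spinning quickly, also check the body's angularDamping and make sure the turf's friction is nonzero.
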
0 replies · 0 boosts · 565 views · Nov ’24

Reality Composer text items (ARKit/iOS 18) not displaying
I have created a simple scene in Reality Composer (Composer, not Composer Pro). It contains just a cube and a text item. I convert this to a usdz file and load it into an ARKit Swift app. Since iOS 18/Xcode 16, the text element is not displayed at all. The cube is displayed, anchors correctly, can be moved, etc.

The output from usdchecker:

➜ Desktop usdchecker GKTUHR1.6.3.usdz -v --arkit
Opening GKTUHR1.6.3.usdz
Checking layer <GKTUHR1.6.3.usdz>.
Checking package <GKTUHR1.6.3.usdz>
Checking prim </Root>.
Checking prim </Root/Scenes>.
Checking prim </Root/Scenes/Scene>.
Checking prim </Root/Scenes/Scene/Gravity>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane/physicsMaterial>.
Checking prim </Root/Scenes/Scene/Children>.
Checking prim </Root/Scenes/Scene/Children/hello>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/hello/Children>.
Checking prim </Root/Scenes/Scene/Children/Box>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/Box/Children>.
Checking prim </Root/Scenes/Scene/Children/Box/PhysicsMaterial_Box>.
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/sceneGroundPlane>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/hello/Generated/Text>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>. (fails 'MaterialBindingAPIAppliedChecker')
Failed!
0 replies · 0 boosts · 610 views · Nov ’24

Disable IBL in iOS AR Quick Look
https://developer.apple.com/documentation/arkit/arkit_in_ios/specifying_a_lighting_environment_in_ar_quick_look

How can I disable it? Or at least use a custom texture that's just black? I don't see the point of having a real-time environment probe that captures IBL if it always adds this fake studio IBL that you can't remove...
0 replies · 0 boosts · 328 views · Dec ’24

Place a box on a wall or on the floor
Hi, I wanted to do something quite simple: put a box on a wall or on the floor.

My box:

let myBox = ModelEntity(
    mesh: .generateBox(size: SIMD3<Float>(0.1, 0.1, 0.01)),
    materials: [SimpleMaterial(color: .systemRed, isMetallic: false)],
    collisionShape: .generateBox(size: SIMD3<Float>(0.1, 0.1, 0.01)),
    mass: 0.0)

For that I used plane detection to identify the walls and floor in the room. Then with SpatialTapGesture I was able to retrieve the position where the user is looking and taps:

let position = value.convert(value.location3D, from: .local, to: .scene)

And then I positioned my box:

myBox.setPosition(position, relativeTo: nil)

When I tested it, I realized that the box was not parallel to the wall but sat at a slightly inclined angle. I also realized that if I tried to put my box on the wall to my left, the box was placed perpendicular to that wall, not flat against it. After various searches and several attempts, I ended up playing with transform.matrix to identify whether the plane is a wall or a floor and whether it is in front of me or to the side, and set up a rotation on the box to "place" it on the wall or floor:

let surfaceTransform = surface.transform.matrix
let surfaceNormal = normalize(surfaceTransform.columns.2.xyz)

let baseRotation = simd_quatf(angle: .pi, axis: SIMD3<Float>(0, 1, 0))

var finalRotation: simd_quatf
if acos(abs(dot(surfaceNormal, SIMD3<Float>(0, 1, 0)))) < 0.3 {
    logger.info("Surface: ceiling/floor")
    finalRotation = simd_quatf(angle: surfaceNormal.y > 0 ? 0 : .pi, axis: SIMD3<Float>(1, 0, 0))
} else if abs(surfaceNormal.x) > abs(surfaceNormal.z) {
    logger.info("Surface: left/right")
    finalRotation = simd_quatf(angle: surfaceNormal.x > 0 ? .pi/2 : -.pi/2, axis: SIMD3<Float>(0, 1, 0))
} else {
    logger.info("Surface: front/back")
    finalRotation = baseRotation
}

Playing with matrices is not really my thing, so I don't know if I'm doing it right. Could you tell me if my tests for the orientation of the walls are correct? During my tests I don't always correctly identify whether the wall is in front of me or to the side. Is this generally the right way to do it? Is there an easier way to do this?

Regards,
Tof
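
One simpler option to consider: instead of classifying the plane as wall or floor, rotate the box so its thin axis lines up with the surface normal and let simd build the quaternion. A minimal sketch, assuming the box's local +Z is the face you want flush against the surface, and assuming you have the plane's world-space normal (for ARKit plane anchors the normal is typically the anchor's local Y axis, not Z, so verify which column holds it):

import simd

// Rotate the box's +Z axis onto the surface normal in one step, covering
// walls, floor, and ceiling without case analysis. (Degenerate only when
// the normal is exactly opposite +Z.)
let n = normalize(surfaceNormal) // world-space plane normal; see caveat above
myBox.orientation = simd_quatf(from: SIMD3<Float>(0, 0, 1), to: n)
myBox.setPosition(position + n * 0.005, relativeTo: nil) // offset by half the box depth to avoid z-fighting
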
0 replies · 0 boosts · 403 views · Feb ’25

RealityView iOS Navigation
I have a visionOS app that I'm adding iOS support to, and I would like to keep using RealityView. I know there are the following modifiers to add some navigation:

.realityViewCameraControls(.orbit)
.realityViewCameraControls(.dolly)
.realityViewCameraControls(.pan)

But how can I add more than one? For example, I would like to orbit with one finger, pan with two fingers, and dolly by pinching. Is this possible, and if so, can someone share some sample code on how to achieve that?

Thanks,
Guillermo
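
If the modifier really does take only one mode at a time, one fallback is to skip the built-in controls and drive your own camera entity from SwiftUI gestures. A minimal sketch, assuming a virtual (non-AR) camera orbiting the origin; the gesture-to-motion mapping is illustrative only:

import SwiftUI
import RealityKit

struct ManualCameraView: View {
    @State private var camera = PerspectiveCamera()
    @State private var angle: Float = 0      // orbit angle driven by drag
    @State private var distance: Float = 2   // dolly distance driven by pinch

    var body: some View {
        RealityView { content in
            content.camera = .virtual // render with our own camera, not the AR session
            content.add(camera)
            updateCamera()
        }
        .gesture(DragGesture().onChanged { value in
            angle = Float(value.translation.width) * 0.01
            updateCamera()
        })
        .simultaneousGesture(MagnifyGesture().onChanged { value in
            distance = max(0.5, 2 / Float(value.magnification))
            updateCamera()
        })
    }

    private func updateCamera() {
        // Keep the camera on a circle around the origin, always looking at it.
        let position = SIMD3<Float>(sin(angle), 0.5, cos(angle)) * distance
        camera.look(at: .zero, from: position, relativeTo: nil)
    }
}

A two-finger pan could be layered on the same way by offsetting the look-at target.
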
0 replies · 1 boost · 410 views · Feb ’25

Portals do not occlude CollisionComponent and InputTargetComponent
Hello. If you add a ModelEntity to a world inside a portal, the drawing of the model is occluded properly at the portal bounds. However, the invisible shapes of the InputTargetComponent and CollisionComponent are not occluded. They can cross the portal, and if you have gestures on your ModelEntity, you can trigger them in areas outside the portal bounds. This happens even if the ModelEntity has no PortalCrossingComponent.
0 replies · 1 boost · 49 views · Mar ’25

Reality Composer Pro Transparent Textures
Hey everyone, I am currently developing a visionOS app and using Reality Composer Pro to create scenes to put in it. I have a humanoid model with hair strands, and each strand of hair has an opacity map. However, some reflections are still visible even though the opacity is zero. There is also some weird culling among hair strands (in the left circle) and weird reflections on the hair cards (in the right circle). Here are my settings for the materials. Since all the hair strands are interconnected with each other, it is hard to decide the drawing order in Xcode, so I am wondering if there's an easier way to handle transparent objects. Please let me know if you know anything helpful, much appreciated!
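
One technique that sidesteps draw-order problems entirely is alpha cutout (masked) transparency: fragments below a threshold are discarded and the rest render as opaque, so no sorting is needed. It loses soft edges, but for hair cards it is often acceptable. A hedged sketch that flips every PhysicallyBasedMaterial in a hierarchy to cutout mode; whether it suits your Reality Composer Pro materials depends on how they were authored:

import RealityKit

// Switch alpha-blended materials to alpha cutout, which is order-independent.
// Fragments with opacity below the threshold are discarded outright.
func applyCutout(to entity: Entity, threshold: Float = 0.5) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { material in
            guard var pbr = material as? PhysicallyBasedMaterial else { return material }
            pbr.opacityThreshold = threshold
            return pbr
        }
        entity.components.set(model)
    }
    for child in entity.children {
        applyCutout(to: child, threshold: threshold)
    }
}
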
0 replies · 0 boosts · 92 views · Apr ’25

Is Using Metal Compute Shaders for Efficient Resource Copying to RealityKit the Best Approach for Streaming Data in Real-Time Rendering?
Hi Apple, on visionOS, for real-time streaming of large 3D scenes, I plan to create Metal buffers and textures on multiple threads and then use a compute shader to copy the Metal resources into RealityKit, minimizing main-thread usage. Given that most of RealityKit's default APIs must run on the main actor (main thread), they are not ideal for streaming data. Is this approach the best way to handle streaming data and real-time rendering? Thank you very much.
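
For reference, RealityKit's LowLevelTexture (and the analogous LowLevelMesh) exists for roughly this purpose: you create the texture once and then write into it from a Metal command buffer, without regenerating a TextureResource each frame. A minimal sketch, assuming the visionOS 2 APIs; descriptor fields, sizes, and the exact initializer shape are assumptions to verify against the documentation:

import Metal
import RealityKit

// Create a RealityKit texture whose contents can be streamed from Metal.
func makeStreamingTexture() throws -> (LowLevelTexture, TextureResource) {
    let descriptor = LowLevelTexture.Descriptor(
        pixelFormat: .rgba8Unorm,
        width: 1024,
        height: 1024,
        textureUsage: [.shaderRead, .shaderWrite])
    let lowLevel = try LowLevelTexture(descriptor: descriptor)
    let resource = try TextureResource(from: lowLevel) // bind into a material once
    return (lowLevel, resource)
}

// Per frame: encode a compute pass that fills the texture. The command
// buffer can be built away from the main actor.
func update(_ lowLevel: LowLevelTexture,
            queue: MTLCommandQueue,
            pipeline: MTLComputePipelineState) {
    guard let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeComputeCommandEncoder() else { return }
    let target = lowLevel.replace(using: commandBuffer) // MTLTexture valid for this buffer
    encoder.setComputePipelineState(pipeline)
    encoder.setTexture(target, index: 0)
    encoder.dispatchThreads(MTLSize(width: 1024, height: 1024, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: 8, height: 8, depth: 1))
    encoder.endEncoding()
    commandBuffer.commit()
}
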
0 replies · 0 boosts · 76 views · Apr ’25

How to use CharacterControllerComponent.
I am trying to implement a CharacterControllerComponent using the following URL: https://developer.apple.com/documentation/realitykit/charactercontrollercomponent

I have written sample code, but PhysicsSimulationEvents.WillSimulate is not executed and nothing happens.

import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    let gravity: SIMD3<Float> = [0, -50, 0]
    let jumpSpeed: Float = 10

    enum PlayerInput {
        case none, jump
    }

    @State private var testCharacter: Entity = Entity()
    @State private var myPlayerInput = PlayerInput.none

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                testCharacter = immersiveContentEntity.findEntity(named: "Capsule")!
                testCharacter.components.set(CharacterControllerComponent())
                let _ = content.subscribe(to: PhysicsSimulationEvents.WillSimulate.self, on: testCharacter) { event in
                    print("subscribe run")
                    let deltaTime: Float = Float(event.deltaTime)
                    var velocity: SIMD3<Float> = .zero
                    var isOnGround: Bool = false

                    // RealityKit automatically adds `CharacterControllerStateComponent` after moving the character for the first time.
                    if let ccState = testCharacter.components[CharacterControllerStateComponent.self] {
                        velocity = ccState.velocity
                        isOnGround = ccState.isOnGround
                    }

                    if !isOnGround {
                        // Gravity is a force, so you need to accumulate it for each frame.
                        velocity += gravity * deltaTime
                    } else if myPlayerInput == .jump {
                        // Set the character's velocity directly to launch it in the air when the player jumps.
                        velocity.y = jumpSpeed
                    }
                    testCharacter.moveCharacter(by: velocity * deltaTime, deltaTime: deltaTime, relativeTo: nil) { event in
                        print("playerEntity collided with \(event.hitEntity.name)")
                    }
                }
            }
        }
    }
}

The scene is loaded from RCP. It is simple: just a capsule on a pedestal. Do I need separate code to get testCharacter moving from this state?
0 replies · 0 boosts · 100 views · May ’25

iOS Simulator can only render 1 RealityView
I'm using RealityView in my iOS game, mixed with SwiftUI. For the following two example usages, the Simulator will only render the first RealityView, and the second one is either super laggy or shows a black model. Running on a real device is all good; only the Simulator has this issue.

1. A TabView where each tab has a RealityView.
2. A root view and a detail view connected via a push navigation, where both root and detail have a RealityView.

In the Simulator, the second RealityView is very choppy and basically unusable, but on a real iPhone everything looks great. Is this a known Simulator issue, or did I do something wrong?
0 replies · 0 boosts · 96 views · Jun ’25

How to attach SwiftUI Views to entities on non-visionOS platforms?
What is the recommended way to attach SwiftUI views to RealityKit entities on macOS, iOS, etc.? All the APIs seem to be visionOS-only:

https://developer.apple.com/documentation/realitykit/realityviewattachments
https://developer.apple.com/documentation/realitykit/viewattachmentcomponent
https://developer.apple.com/documentation/realitykit/presentationcomponent
https://developer.apple.com/documentation/realitykit/imagepresentationcomponent

My only idea is to do it "manually" with a ZStack and RealityView somehow? I submitted this as feedback since it seemed like an oversight: FB18034856.
0 replies · 2 boosts · 84 views · Jun ’25

ProjectiveTransformCameraComponent with custom matrix
I'm looking to create an effect on iOS that tracks the user's face position with ARKit and shifts the nearer/more prominent geometry in the scene around, while more "distant" geometry stays fixed to the XY plane, making it look like the geometry on screen "sticks out".

I've managed to implement most of this successfully, but it's not perfect when using PerspectiveCameraComponent in RealityKit, because as I shift the camera (and change its field of view based on the user's distance) the back plane changes its orientation (it's always orthogonal to the camera's direction).

I've tried adopting ProjectiveTransformCameraComponent instead. The idea is that the camera shifts around the scene, mirroring the user's head position, looking at (0, 0, 0), while the back plane is adjusted to stay parallel with the XY plane (animation replicated in Blender below). However, I can't manage to set up ProjectiveTransformCameraComponent with an appropriate matrix, or to update its transform property in a RealityKit System correctly. I also tried setting many simpler projection matrices as described in a number of guides on camera projection matrices on the internet, and all I get is a blank view.

Does anyone have guidance on what the projection matrix that ProjectiveTransformCameraComponent expects is meant to look like, or how I would go about accomplishing my goal?
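
For what it's worth, the standard tool for this effect is an off-axis (asymmetric-frustum) projection: the camera translates with the head while the frustum edges stay pinned to the screen rectangle. A minimal sketch of such a matrix, column-major and right-handed with Metal's [0, 1] clip-space depth; whether ProjectiveTransformCameraComponent uses exactly these conventions is an assumption to verify:

import simd

// Off-axis perspective projection. l/r/b/t are the screen rectangle's edges
// measured at the near plane in eye space (they shift as the head moves,
// which is what keeps the back plane parallel to the screen).
func offAxisProjection(l: Float, r: Float, b: Float, t: Float,
                       near n: Float, far f: Float) -> float4x4 {
    float4x4(columns: (
        SIMD4<Float>(2 * n / (r - l), 0, 0, 0),
        SIMD4<Float>(0, 2 * n / (t - b), 0, 0),
        SIMD4<Float>((r + l) / (r - l), (t + b) / (t - b), -f / (f - n), -1),
        SIMD4<Float>(0, 0, -f * n / (f - n), 0)
    ))
}

Each frame you would recompute l, r, b, t from the head position relative to the physical screen corners (Kooima's "Generalized Perspective Projection" is the usual reference) and hand the result to the component. If the view comes up blank, the near/far values or the depth convention are the usual suspects.
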
0 replies · 0 boosts · 109 views · Jun ’25

Export Armatures from Blender to USDC for use in RealityKit
I'm an experienced SceneKit developer and I want to begin work on a new project using RealityKit, so I appreciated the timely WWDC 2025 session "Bring your SceneKit project to RealityKit". However, I am now finding that:

- Blender does not properly support exporting armatures in usdc files, and usdc is really the only file format that should be used for creating 3D assets for RealityKit.
- Exporting from Blender to fbx or some other intermediate format, and then converting that to usdc, is a challenge. Apple's Reality Converter app, which supposedly can import and convert fbx files to usdc, is no longer available from Apple's website. An older copy of it I found at the Kodeco website requires Rosetta on Apple Silicon, and that copy does not in fact import fbx or anything else; I find it doesn't work at all.
- Apple's Reality Composer Pro, at least as far as I can tell, only supports importing usdc; it is not a file-conversion tool.
- Alternatively, I am under the impression that Maya supports producing usdc files with armatures, but Maya costs over $2000 per year and I am skilled with Blender, so I believe strongly that I should be able to continue with Blender. Maya's expense and skill set simply shouldn't be a requirement for building RealityKit applications.

What are my options, then, if any, to produce assets with armatures and armature-based animations using Blender, and then bring them into RealityKit?
0 replies · 4 boosts · 102 views · Jun ’25

Regression: RealityKit spatial audio crackles and pops on iOS 26.0 beta 5 (FB19423059)
RealityKit spatial audio crackles and pops on iOS 26.0 beta 5. It works correctly on iOS 18.6 and on visionOS 26.0 beta 5. The APIs used are AudioPlaybackController, Entity.prepareAudio, and Entity.play. Videos of the expected and observed behavior are attached to feedback FB19423059. The audio should be a consistent, repeating sound, but it seems oddly abbreviated and the volume varies unexpectedly. Thank you for investigating this issue.
0 replies · 0 boosts · 226 views · Aug ’25