RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit tag

200 Posts


Export Armatures from Blender to USDC for use in RealityKit
I'm an experienced SceneKit developer and I want to begin work on a new project using RealityKit, so I appreciated the timely WWDC 2025 session "Bring your SceneKit project to RealityKit". However, I am now finding that:

Blender does not properly support exporting armatures in .usdc files, and .usdc is really the only file format that should be used for creating 3D assets for RealityKit.

Exporting from Blender to FBX or some other intermediate format, and then converting that to .usdc, is a challenge. Apple's Reality Converter app, which supposedly can import and convert FBX files to .usdc, is no longer available from Apple's website, and an older copy I found at the Kodeco website requires Rosetta on Apple Silicon. That older copy also does not in fact import FBX or anything else; I find it doesn't work at all.

Apple's Reality Composer Pro, at least as far as I can tell, only supports importing .usdc; it is not a file-conversion tool.

Alternatively, I am under the impression that Maya supports producing .usdc files with armatures, but Maya costs over $2,000 per year and I am skilled with Blender, so I believe strongly that I should be able to continue with Blender. Maya's expense and skill set simply shouldn't be a requirement for building RealityKit applications.

What are my options, then, if any, to produce assets with armatures and armature-based animations using Blender, and then bring them into RealityKit?
Replies: 0 · Boosts: 4 · Views: 248 · Activity: Jun ’25
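Not a fix for the Blender export itself, but on the RealityKit side a quick way to check whether a converted .usdc actually kept its skeleton is to load it and play whatever animation clips it exposes; if the armature survived, the clip should deform the mesh. A minimal sketch (the asset name is a placeholder):

import RealityKit

// Load the converted asset asynchronously and play its first animation clip.
// An asset whose armature was lost in conversion typically loads with an
// empty availableAnimations array, or plays without deforming the mesh.
func loadConvertedCharacter() async throws -> Entity {
    let entity = try await Entity(named: "ConvertedCharacter") // placeholder name
    if let clip = entity.availableAnimations.first {
        entity.playAnimation(clip.repeat())
    }
    return entity
}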
RealityKit Trace Metric Max/Range for visionOS app
Hi Nathaniel, I spoke with you yesterday in the WWDC lab. Thanks for chatting with me! Is it possible to get a link to a doc that lists some key metrics I'd find in a RealityKit trace, so I know whether a given metric is exceeding limits and probably causing a problem? Right now I just see numbers and have no idea if a metric is high or low :). This is specifically for a visionOS app. Thanks, Bob
Replies: 3 · Boosts: 0 · Views: 150 · Activity: Jun ’25
Unwanted file changes in Reality Composer Pro project when using Git
Hello, I'm working on a visionOS project that uses Reality Composer Pro, and we are managing our project files with Git. We've noticed that simply opening and closing the Reality Composer Pro application consistently generates changes in the following files, even when the developer has made no explicit modifications:

{ProjectName}/Packages/RealityKitContent/Package.realitycomposerpro/PluginData/*******/ShaderGraphEditorPluginID/ShaderGraphEditorPluginID
{ProjectName}/Packages/RealityKitContent/Package.realitycomposerpro/WorkspaceData/SceneMetadataList.json

Could you please clarify the purpose of these files? Why do they appear as modified when no direct changes are made on our end? More importantly, is it safe to add these files to our .gitignore to prevent them from being tracked? We are concerned that ignoring them might lead to unexpected issues or inconsistencies when other team members pull the latest changes, especially if these files contain critical project metadata or state that needs to be synchronized. Any insights or recommended best practices for managing Reality Composer Pro projects with Git would be greatly appreciated. Thank you for your time and assistance.
Replies: 1 · Boosts: 0 · Views: 228 · Activity: Jun ’25
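For what it's worth, the mechanics of ignoring these files are straightforward; whether doing so is safe is exactly the open question above. A sketch of the corresponding .gitignore entries, with patterns inferred from the paths quoted in the post (verify against your own repository layout before relying on them):

# Reality Composer Pro editor state that is rewritten on open/close
# (inferred patterns; confirm nothing team-critical lives here first)
**/Package.realitycomposerpro/PluginData/
**/Package.realitycomposerpro/WorkspaceData/SceneMetadataList.json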
ProjectiveTransformCameraComponent with custom matrix
I'm looking to create an effect on iOS that tracks the user's face position with ARKit and shifts nearer, more prominent geometry around the scene, while more "distant" geometry stays fixed to the XY plane, making it look like the on-screen geometry "sticks out" of the display.

I've managed to implement most of this successfully, but it's not perfect when using PerspectiveCameraComponent in RealityKit, because as I shift the camera (and change its field of view based on the user's distance) the backplane changes its orientation (it's always orthogonal to the camera's direction).

I've tried adopting ProjectiveTransformCameraComponent instead. The idea is that the camera shifts around the scene, mirroring the position of the user's head while looking at (0,0,0), and the back plane is adjusted to stay parallel with the XY plane (animation replicated in Blender below). However, I can't manage to set up ProjectiveTransformCameraComponent with an appropriate matrix, or to update its transform property correctly in a RealityKit System. I also tried many simpler projection matrices, as described in a number of online guides on camera projection matrices, and all I get is a blank view.

Does anyone have guidance on what the projection matrix that ProjectiveTransformCameraComponent expects is meant to look like, or how I would go about accomplishing my goal?
Replies: 0 · Boosts: 0 · Views: 141 · Activity: Jun ’25
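For reference while experimenting: the kind of matrix usually used for this head-tracked "window" effect is a conventional off-axis (asymmetric-frustum) perspective projection, which can be built with simd as below. This is a sketch of the standard glFrustum-style construction; RealityKit's exact conventions (handedness, depth range) for ProjectiveTransformCameraComponent aren't spelled out in the docs, so treat it as a starting point that may need adapting:

import simd
import RealityKit

// Off-axis perspective frustum: l/r/b/t bound the near plane, n/f are the
// near/far clip distances. simd matrices are column-major, so each SIMD4
// below is one column of the matrix.
func offAxisProjection(l: Float, r: Float, b: Float, t: Float,
                       n: Float, f: Float) -> float4x4 {
    float4x4(columns: (
        SIMD4<Float>(2 * n / (r - l), 0, 0, 0),
        SIMD4<Float>(0, 2 * n / (t - b), 0, 0),
        SIMD4<Float>((r + l) / (r - l), (t + b) / (t - b), -(f + n) / (f - n), -1),
        SIMD4<Float>(0, 0, -2 * f * n / (f - n), 0)
    ))
}

// Shifting l/r/b/t with the tracked head position (while the scene's back
// plane stays fixed) is what produces the "sticks out of the screen" illusion.
let cameraEntity = Entity()
cameraEntity.components.set(ProjectiveTransformCameraComponent(
    projectionMatrix: offAxisProjection(l: -0.1, r: 0.1, b: -0.1, t: 0.1,
                                        n: 0.1, f: 100)))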
Is migrating from ARView to RealityView recommended?
We're using RealityKit to create a science education AR app for iOS, iPadOS, and visionOS. In the WWDC25 session video "Bring your SceneKit project to RealityKit" https://developer.apple.com/videos/play/wwdc2025/288 at 8:15, it's explained that when using RealityKit, RealityView should be used in all cases, whereas in the past, SceneKit required SCNView, SceneView, or ARSCNView, depending on an app's requirements. Because the initial development of our app on iOS predates iOS 18's RealityView, our app currently uses ARView to render RealityKit AR content on iOS and iPadOS. Is it recommended that we migrate to RealityView, or can we safely continue using our existing ARView implementation? We'd prefer to avoid unnecessary development cost. If migrating from ARView to RealityView is recommended, what specific benefits should we expect from this transition? Thank you.
Replies: 1 · Boosts: 2 · Views: 275 · Activity: Jun ’25
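For anyone weighing the same move, the cross-platform RealityView entry point is compact. A minimal sketch for iOS 18+/macOS 15+, with camera passthrough on iOS via spatial tracking:

import SwiftUI
import RealityKit

struct MigratedView: View {
    var body: some View {
        RealityView { content in
            // Runs once to build the scene, much like configuring an ARView.
            #if os(iOS)
            content.camera = .spatialTracking // AR passthrough on iOS
            #endif
            let box = ModelEntity(mesh: .generateBox(size: 0.1))
            content.add(box)
        }
    }
}

This doesn't capture the ARSession-level configuration an existing ARView app may depend on, which is usually where the real migration cost lives.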
How to attach SwiftUI Views to entities on non-visionOS platforms?
What is the recommended way to attach SwiftUI views to RealityKit entities on macOS, iOS, etc.? All the APIs seem to be visionOS-only:

https://developer.apple.com/documentation/realitykit/realityviewattachments
https://developer.apple.com/documentation/realitykit/viewattachmentcomponent
https://developer.apple.com/documentation/realitykit/presentationcomponent
https://developer.apple.com/documentation/realitykit/imagepresentationcomponent

My only idea is to do it "manually" with a ZStack and RealityView somehow. I submitted this as feedback since it seemed like an oversight: FB18034856.
Replies: 0 · Boosts: 2 · Views: 138 · Activity: Jun ’25
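Pending an answer, the "manual" ZStack idea from the post can look roughly like the sketch below. The hard part it doesn't solve is keeping the SwiftUI view glued to a moving entity, which needs a world-to-view projection step each frame; the fixed offset here is just a stand-in for that:

import SwiftUI
import RealityKit

struct EntityWithLabelView: View {
    var body: some View {
        ZStack {
            RealityView { content in
                content.add(ModelEntity(mesh: .generateSphere(radius: 0.05)))
            }
            // Stand-in placement; a real version would project the entity's
            // world position into view coordinates and position the label there.
            Text("Label")
                .padding(6)
                .background(.thinMaterial)
                .offset(y: -120)
        }
    }
}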
CustomMaterial disable unlit tone mapping
Hi, since iOS 18, UnlitMaterial and ShaderGraphMaterial have an option to disable tone mapping, e.g. via https://developer.apple.com/documentation/realitykit/unlitmaterial/init(applypostprocesstonemap:). Is it possible to do the same for CustomMaterial? I tried initializing a CustomMaterial from an UnlitMaterial with tone mapping disabled, like so:

let unlitMat = UnlitMaterial(applyPostProcessToneMap: false)
let customMaterial = try CustomMaterial(
    from: unlitMat,
    surfaceShader: surfaceShader,
    geometryModifier: geometryModifier
)

but that does not seem to work. The colors of my texture still look altered compared to a plain UnlitMaterial, or to a ShaderGraphMaterial where it's disabled. Any hints? Thank you!
Replies: 1 · Boosts: 0 · Views: 128 · Activity: Jun ’25
.usdz files not loading in iOS app
Hello everyone, I'm a new developer and I'm still learning the foundations of Swift and SwiftUI while building my first app. Today I wanted to ask you how to implement AR Quick Look views inside my app. I want to be able to dynamically preview AR objects in a dedicated view; however, I don't seem to have understood where and how to locate AR objects inside my project. I tried including them in the project's Assets folder, in the Resources folder, and in the main folder of my project alongside the MyAppApp.swift file. None of these approaches worked: the objects were never located. I made sure to specify the path to the files every time, but somehow the location isn't recognized. I also tried giving no path, so the app would search for the files in their default location (which I apparently haven't grasped yet), but that attempt failed too. I don't have the code sample on me at the moment, but I will write a follow-up comment on this post to show you what I wrote, in case anyone is interested in debugging my code. Meanwhile, if anyone would be so kind as to point me to the relevant support article, or to comment below with the sample code they used in their app, I would very much appreciate it, so that I can start debugging. Thank you for reading this; I appreciate you.
Replies: 0 · Boosts: 0 · Views: 137 · Activity: Jun ’25
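For anyone hitting the same wall: a .usdz file has to be part of the app target (check its Target Membership in Xcode) and is then located through Bundle.main rather than a hard-coded path. A minimal AR Quick Look sketch, assuming a bundled model whose name ("toy") is a placeholder:

import SwiftUI
import QuickLook
import ARKit

// Wraps QLPreviewController so SwiftUI can present an AR Quick Look preview.
struct ARQuickLookView: UIViewControllerRepresentable {
    let modelName: String // resource name without the .usdz extension, e.g. "toy"

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ controller: QLPreviewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(modelName: modelName) }

    final class Coordinator: NSObject, QLPreviewControllerDataSource {
        let modelName: String
        init(modelName: String) { self.modelName = modelName }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            // The file must have Target Membership in the app, or this URL is nil.
            let url = Bundle.main.url(forResource: modelName, withExtension: "usdz")!
            return ARQuickLookPreviewItem(fileAt: url)
        }
    }
}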
Popover, Menu and Sheet not working with RealityView Attachment SwiftUI
Hi, I have a SwiftUI view that is attached to a 3D object in a RealityView. It is supposed to be a HUD for the user to select a few things. I wanted a submenu for one of the top-level buttons, but it looks like none of the reasonable choices (Menu, Sheet, or Popover) work. Is there a known limitation of RealityKit attachment views where full SwiftUI cannot be used? Or am I doing something wrong? For example:

Button {
    SLogger.info("Toggled")
    withAnimation {
        showHudPositionMenu.toggle()
    }
} label: {
    HStack {
        Image(systemName: "rectangle.3.group")
        Text("My Button")
    }
}
.popover(isPresented: $showHudPositionMenu, attachmentAnchor: attachmentAnchor) {
    HudPositionMenuItems(showHudPositionMenu: $showHudPositionMenu, currentHudPosition: $currentHudPosition)
}

This will print "Toggled" but will not display the menu-items popover. If it makes any difference, this is attached to a child of a head-tracked entity.
Replies: 1 · Boosts: 0 · Views: 119 · Activity: Jun ’25
Spatial Scene API for iOS Apps
As part of the WWDC25 keynote, a technology was announced that can present 2D images as 3D spatial scenes. The announcement is supported by a press release: "...developers can use the Spatial Scene API to make their app experience even more immersive. Zillow is taking advantage of the API for their Zillow Immersive app, allowing users to see images of homes and apartments with the rich depth and dimension that spatial scenes offer." The feature also appears in the Photos app on iOS 26 Developer Beta 1: tapping "Spatial Scene" on any photo opens a view of that photo with a parallax effect. I've searched the WWDC sessions and new documentation and have come up short, so I'm reaching out here for help. Is there any documentation for the Spatial Scene API, or any guidance on how to implement spatial scenes in iOS?
Replies: 1 · Boosts: 2 · Views: 346 · Activity: Jun ’25
Can the spatial scene feature be used outside of the Photos app?
I'm new here, so I don't know which topic this feature belongs under. Sorry about that! I watched the WWDC stream and I am really interested in this feature; I'm wondering if it could be used in my apps. I looked up the documentation, but I found that it only seems to support visionOS (I'm not sure about that, but the demo I saw was based on visionOS).
Replies: 2 · Boosts: 0 · Views: 250 · Activity: Jun ’25
Looking for a way to implement the video display effect in Apple's 'Spatial Gallery'
Hi guys, I noticed that Apple created a really engaging visual effect for browsing spatial videos in the Spatial Gallery app. The video appears embedded in a glass panel with glowing edges, and it even shows a parallax effect as you move around. When I tried to display a stereo video using RealityView, however, the video entity always floats above the panel. May I ask how visionOS implements this effect? Is there an approach to achieving it, or example code I can use in my own project? Thanks!
Replies: 3 · Boosts: 1 · Views: 232 · Activity: Jun ’25
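Not the glass-and-glow treatment from Spatial Gallery, but for keeping video coplanar with a panel (rather than floating in front of it), the basic building block is a VideoMaterial applied to a plane mesh. A minimal sketch, assuming a bundled "clip.mp4" resource (hypothetical name):

import RealityKit
import AVFoundation

// Plays video on a flat plane so it stays embedded in the panel geometry.
func makeVideoPanel() -> ModelEntity {
    let url = Bundle.main.url(forResource: "clip", withExtension: "mp4")! // hypothetical asset
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)
    let panel = ModelEntity(
        mesh: .generatePlane(width: 1.0, height: 0.5625), // 16:9 aspect
        materials: [material]
    )
    player.play()
    return panel
}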
Metal (Compositor Services) or RealityKit on visionOS
I am developing a visionOS app. I am now very interested in Metal and Compositor Services, but I have not explored them in depth. I know that Metal offers a greater degree of control. I am wondering whether using Compositor Services means giving up AR capabilities that RealityKit provides (such as scene reconstruction and understanding, hover effects, etc.).
Replies: 4 · Boosts: 0 · Views: 252 · Activity: Jun ’25