RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under RealityKit subtopic


RealityKit visionOS anchor to POV
Hi, is there a way in visionOS to anchor an entity to the POV via RealityKit? I need an entity that is always fixed to the "camera". I'm aware that this is discouraged from a design perspective because it can be visually distracting, but in my case I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene.

Edit: ARView on iOS has a lot of very useful helper properties and functions, such as cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform). How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable. An example use case: I would like to add an entity to the scene at my user's eye level, which depends on their height. I found https://developer.apple.com/documentation/realitykit/realityrenderer, which has an activeCamera property, but so far it's unclear to me in which context RealityRenderer is used and how I could access it. Appreciate any hints, thanks!
Replies: 9 · Boosts: 6 · Views: 5.1k · Dec ’24
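For the collider use case, one approach to try is a head-target anchor. The sketch below assumes a visionOS RealityView; HeadAnchoredView and the sphere collider are illustrative names, not from the original post. For reading the pose itself, e.g. for eye-level placement, ARKit's WorldTrackingProvider.queryDeviceAnchor(atTimestamp:) reports the device transform.

```swift
import SwiftUI
import RealityKit

// Sketch: keep an entity fixed relative to the wearer's head on visionOS.
struct HeadAnchoredView: View {
    var body: some View {
        RealityView { content in
            let headAnchor = AnchorEntity(.head)   // follows the device pose
            let collider = Entity()
            collider.components.set(
                CollisionComponent(shapes: [.generateSphere(radius: 0.2)])
            )
            headAnchor.addChild(collider)
            content.add(headAnchor)
        }
    }
}
```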
Scene with over 12,000 duplicate entities
I am creating a RealityKit scene that will contain over 12,000 duplicate cubes arranged in a circle. This is for some high-energy physics simulations I am doing. I accomplish this by creating a single cube and cloning it many times, so there is a single MeshResource and Material even though there are many entities. I have confirmed this by checking with Swift's === operator. Even so, the program is unworkably slow. Any suggestions or tricks that could help with this type of scene? Using a single geometry was the trick to getting SceneKit to work fast with scenes like this. I've been updating my software to RealityKit because I far prefer the structure of RealityKit over SceneKit.
Replies: 4 · Boosts: 0 · Views: 1.3k · Jan ’25
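One RealityKit-native trick to try, sketched on the assumption that MeshResource.Contents instancing (available since iOS 15) scales to this count: build a single entity whose mesh references one cube model from thousands of instances, instead of thousands of cloned entities. instancedRing(count:radius:) is an illustrative helper, not a verified fix.

```swift
import Foundation
import RealityKit

// Sketch: one entity whose mesh contains `count` instances of a single cube model.
func instancedRing(count: Int, radius: Float) -> ModelEntity? {
    let cube = MeshResource.generateBox(size: 0.05)
    var contents = cube.contents                       // reuse the box geometry
    guard let modelID = contents.models.map(\.id).first else { return nil }
    let instances = (0..<count).map { i -> MeshResource.Instance in
        let angle = (Float(i) / Float(count)) * 2 * .pi
        let position = SIMD3<Float>(cos(angle) * radius, 0, sin(angle) * radius)
        return MeshResource.Instance(id: "cube-\(i)",
                                     model: modelID,
                                     at: Transform(translation: position).matrix)
    }
    contents.instances = MeshInstanceCollection(instances)
    guard let mesh = try? MeshResource.generate(from: contents) else { return nil }
    return ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
}
```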
Sample code for WWDC25 Session
Hi! I watched the WWDC25 session "Bring your SceneKit project to RealityKit" which seemed like a great resource for those of us transitioning from the now-deprecated SceneKit framework. The session mentioned that the full sample code for the project would be available to download, but I haven't been able to find it in the Code section of the video page or in the Sample Code Library. Has the sample code been released yet? Having the project code would make it much easier to follow along with the RealityKit changes shown in the video. Thanks again for the great session.
Replies: 4 · Boosts: 4 · Views: 246 · Jun ’25
Export Armatures from Blender to USDC for use in RealityKit
I'm an experienced SceneKit developer and I want to begin work on a new project using RealityKit, so the WWDC 2025 session "Bring your SceneKit project to RealityKit" was timely. However, I am now finding that:

- Blender does not properly support exporting armatures in USDC files, and USDC is really the only file format that should be used for creating 3D assets for RealityKit.
- Exporting from Blender to FBX or some other intermediate format, and then converting that to USDC, is a challenge. Apple's Reality Converter app, which supposedly can import and convert FBX files to USDC, is no longer available from Apple's website. An older copy of it I found at the Kodeco website requires Rosetta on Apple Silicon, and it does not in fact import FBX or anything else; I find it doesn't work at all.
- Apple's Reality Composer Pro, as far as I can tell, only supports importing USDC; it is not a file conversion tool.
- Alternatively, I am under the impression that Maya supports producing USDC files with armatures, but Maya costs over $2,000 per year and I am skilled with Blender, so I believe strongly that I should be able to continue with Blender. Maya's expense and skill set simply shouldn't be a requirement for building RealityKit applications.

What are my options, if any, to produce assets with armatures and armature-based animations using Blender, and then bring them into RealityKit?
Replies: 0 · Boosts: 4 · Views: 102 · Jun ’25
Pink Screen with VideoMaterial in ARKit
Hi everyone, I'm developing an ARKit app using RealityKit and encountering an issue where a video displayed on a 3D plane shows up as a pink screen instead of the actual video content. Here's a simplified version of my setup:

```swift
func createVideoScreen(video: AVPlayerItem, canvasWidth: Float, canvasHeight: Float, aspectRatio: Float, fitsWidth: Bool = true) -> ModelEntity {
    let width = fitsWidth ? canvasWidth : canvasHeight * aspectRatio
    let height = fitsWidth ? canvasWidth * (1 / aspectRatio) : canvasHeight
    let screenPlane = MeshResource.generatePlane(width: width, depth: height)
    let videoMaterial: Material = createVideoMaterial(videoItem: video)
    let videoScreenModel = ModelEntity(mesh: screenPlane, materials: [videoMaterial])
    return videoScreenModel
}

func createVideoMaterial(videoItem: AVPlayerItem) -> VideoMaterial {
    let player = AVPlayer(playerItem: videoItem)
    let videoMaterial = VideoMaterial(avPlayer: player)
    player.play()
    return videoMaterial
}
```

Despite following the standard process, the video plane renders pink. Has anyone encountered this before, or does anyone know what might be causing it? Thanks in advance!
Replies: 16 · Boosts: 3 · Views: 1.1k · May ’25
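A possible angle to check, offered as a sketch rather than a verified fix: pink output can mean the material had no valid video frame to sample, so one assumption worth testing is that play() fires before the AVPlayerItem is ready. The play(_:with:) helper below is illustrative and simply defers playback until the item reports .readyToPlay.

```swift
import AVFoundation
import Combine

var cancellables = Set<AnyCancellable>()

// Sketch: defer playback until the item is ready, in case the material
// samples an unready item as a solid placeholder color.
func play(_ item: AVPlayerItem, with player: AVPlayer) {
    item.publisher(for: \.status)
        .filter { $0 == .readyToPlay }
        .first()
        .sink { _ in player.play() }
        .store(in: &cancellables)
}
```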
BlendShapes don’t animate while playing animation in RealityKit
Hi everyone, I’m running into an issue with RealityKit when trying to animate BlendShapes (shape keys) while a skeletal animation is playing. The model is a rigged character in .usdz format with both predefined skeletal animations and BlendShapes (exported from Blender). The problem: when I play any animation using entity.playAnimation(...), the BlendShapes stop responding. Calling setBlendShapes(...) still logs that weights are being updated, but the visual changes are not visible. The exact same blend shape animation works perfectly when no animation is playing. In SceneKit the same model works as expected: shape keys animate during animation playback. In RealityKit, as soon as an animation starts, the shape keys don’t animate anymore.

Here’s a test project on GitHub that demonstrates the issue clearly: https://github.com/IAMTHEBURT/RealityKitWitnBlendShapesSample

The goal is to play facial expressions (like blinking or talking) while a body animation (like waving) is playing. Is this a known limitation in RealityKit? Or is there a recommended way to combine skeletal animations with real-time BlendShape updates? Thanks in advance for any insights.
Replies: 3 · Boosts: 3 · Views: 245 · Jul ’25
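One workaround worth testing, as a sketch and only under the assumption that the face and body clips are separate animations inside the same USDZ: AnimationResource.group(with:) merges multiple clips into one resource, so a single playAnimation call can drive the skeleton and the shape-key weights together. playCombined(on:) is a hypothetical helper name.

```swift
import RealityKit

// Sketch: group a body clip and a face clip so one playback drives both.
func playCombined(on entity: Entity) throws {
    let clips = entity.availableAnimations
    guard clips.count >= 2 else { return }
    let grouped = try AnimationResource.group(with: [clips[0], clips[1]])
    entity.playAnimation(grouped.repeat())
}
```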
How to attach a point cloud (or depth data) to HEIC?
I'm developing a 3D scanner that runs on iPad, using AVCapturePhoto and PhotogrammetrySession. My photo capture delegate looks like this:

```swift
extension PhotoCaptureDelegate: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        let fileUrl = CameraViewModel.instance.imageDir!.appendingPathComponent("\(PhotoCaptureDelegate.name)\(id).heic")
        let img = CIImage(cvPixelBuffer: photo.pixelBuffer!, options: [
            .auxiliaryDepth: true,
            .properties: photo.metadata
        ])
        let depthData = photo.depthData!.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)
        let fileData = CIContext().heifRepresentation(of: img, format: .RGBA8, colorSpace: colorSpace!, options: [
            .avDepthData: depthData
        ])
        try? fileData!.write(to: fileUrl, options: .atomic)
    }
}
```

But the PhotogrammetrySession emits warning messages:

```
Sample 0 missing LiDAR point cloud!
Sample 1 missing LiDAR point cloud!
...
Sample 10 missing LiDAR point cloud!
```

The session creates a USDZ 3D model, but the scale is not correct. I think the point cloud could help the PhotogrammetrySession find the right scale, but I don't know how to attach the point cloud.
Replies: 3 · Boosts: 2 · Views: 1.2k · Oct ’24
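One alternative that may sidestep the HEIC embedding entirely, sketched under the assumption that samples can be fed to the session directly (sample input is supported on macOS; check availability for on-device iPad use): PhotogrammetrySession also accepts a sequence of PhotogrammetrySample values, and each sample can carry a depth map. makeSample(id:photo:) is an illustrative helper.

```swift
import AVFoundation
import RealityKit

// Sketch: attach the depth map to a PhotogrammetrySample instead of
// round-tripping it through a HEIC file.
func makeSample(id: Int, photo: AVCapturePhoto) -> PhotogrammetrySample? {
    guard let pixelBuffer = photo.pixelBuffer else { return nil }
    var sample = PhotogrammetrySample(id: id, image: pixelBuffer)
    sample.depthDataMap = photo.depthData?
        .converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
        .depthDataMap
    return sample
}
```

A session created over those samples (PhotogrammetrySession(input:configuration:)) may then get the metric-scale hints that the HEIC round trip is losing.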
USDZ files with cameras don't open correctly on iOS 18.2/iPadOS 18.2
Hi experts, when I open a USDZ file that contains perspective cameras with the Files app on iOS 18.2/iPadOS 18.2, I can't see anything, but when I open the same file on iOS 18.1/iPadOS 18.1, it works well. On the other hand, when I open a USDZ file that contains orthographic cameras on iOS 18.1 or iOS 18.2, the scene is stuck. Could you help solve these issues? Thanks.
Replies: 4 · Boosts: 2 · Views: 607 · Dec ’24
Spatial Scene API for iOS Apps
As part of the WWDC25 Keynote, a technology was announced that can present 2D images as 3D spatial scenes. The announcement is backed by a press release:

"...developers can use the Spatial Scene API to make their app experience even more immersive. Zillow is taking advantage of the API for their Zillow Immersive app, allowing users to see images of homes and apartments with the rich depth and dimension that spatial scenes offer."

The feature also appears in the Photos app on iOS 26 Developer Beta 1: tapping "Spatial Scene" on any photo opens a view of that photo with a parallax effect. I've searched the WWDC sessions and the new documentation and have come up short, so I'm reaching out here for help. Is there any documentation for the Spatial Scene API, or any guidance on how to implement spatial scenes in iOS?
Replies: 1 · Boosts: 2 · Views: 230 · Jun ’25
How to attach SwiftUI Views to entities on non-visionOS platforms?
What is the recommended way to attach SwiftUI views to RealityKit entities on macOS, iOS, etc.? All the APIs seem to be visionOS-only:

- https://developer.apple.com/documentation/realitykit/realityviewattachments
- https://developer.apple.com/documentation/realitykit/viewattachmentcomponent
- https://developer.apple.com/documentation/realitykit/presentationcomponent
- https://developer.apple.com/documentation/realitykit/imagepresentationcomponent

My only idea is to do it "manually" with a ZStack and RealityView somehow? I submitted this as feedback since it seemed like an oversight: FB18034856.
Replies: 0 · Boosts: 2 · Views: 84 · Jun ’25
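Pending an official cross-platform API, one manual approach, sketched for an ARView-based iOS setup (RealityView may need a different projection route): project the entity's world position into view coordinates each frame and place the SwiftUI view there with an overlay. overlayPoint(for:in:) is an illustrative helper.

```swift
import RealityKit
import UIKit

// Sketch: map an entity's world position to a 2D view point for overlaying
// a SwiftUI view. Returns nil when the point can't be projected
// (e.g. the entity is behind the camera).
func overlayPoint(for entity: Entity, in arView: ARView) -> CGPoint? {
    arView.project(entity.position(relativeTo: nil))
}
```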
Is migrating from ARView to RealityView recommended?
We're using RealityKit to create a science education AR app for iOS, iPadOS, and visionOS. In the WWDC25 session video "Bring your SceneKit project to RealityKit" https://developer.apple.com/videos/play/wwdc2025/288 at 8:15, it's explained that when using RealityKit, RealityView should be used in all cases, whereas in the past, SceneKit required SCNView, SceneView, or ARSCNView, depending on an app's requirements. Because the initial development of our app on iOS predates iOS 18's RealityView, our app currently uses ARView to render RealityKit AR content on iOS and iPadOS. Is it recommended that we migrate to RealityView, or can we safely continue using our existing ARView implementation? We'd prefer to avoid unnecessary development cost. If migrating from ARView to RealityView is recommended, what specific benefits should we expect from this transition? Thank you.
Replies: 1 · Boosts: 2 · Views: 149 · Jun ’25
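For reference, a minimal RealityView AR setup on iOS 18+ looks roughly like the sketch below (ARContentView and the box entity are illustrative, not from the session); one visible difference from ARView is that camera and content configuration move into RealityView's content-based API.

```swift
import SwiftUI
import RealityKit

// Sketch: the RealityView counterpart of a basic ARView AR scene on iOS 18+.
struct ARContentView: View {
    var body: some View {
        RealityView { content in
            content.camera = .worldTracking      // pass-through AR camera
            let box = ModelEntity(mesh: .generateBox(size: 0.1),
                                  materials: [SimpleMaterial()])
            box.position = [0, 0, -0.5]          // half a meter ahead of the origin
            content.add(box)
        }
    }
}
```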
Struggles with attaching a ModelEntity to the skeleton joints of another ModelEntity
In SceneKit, when creating an .scn file from a rigged model, the framework created an SCNNode for each bone/joint, so you could add and remove child nodes directly to and from joints, and, as with any other SCNNode, you could access each joint's world position and world orientation. The analog would be for joints to be accessible as child entities of a ModelEntity in RealityKit.

I am unable to proceed with migrating my project from SceneKit because of this: there does not seem to be a way to even access the true world position of a joint with the current jointNames/jointTransforms paradigm. The translation information from the given transforms is insufficient to determine the location of a joint at any given time, and other approaches, like creating a GeometricPin for the given joint name and attaching it to another entity, do not seem to work. Conveniently attaching an item to the hand of a rigged model was trivial in SceneKit and now feels impossible in RealityKit.

I am not the first person to notice this, and I am feeling demoralized about proceeding with RealityKit with such a critical piece of functionality blocked: https://stackoverflow.com/questions/76726241/how-do-i-attach-an-entity-to-a-skeletons-joint-in-realitykit

Will this be addressed in some way?
Replies: 5 · Boosts: 2 · Views: 698 · Jul ’25
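As a stopgap, a joint's model-space transform can in principle be reconstructed by composing ancestor transforms along the joint path. The sketch below assumes jointNames entries are slash-separated paths (e.g. "root/spine/arm/hand") whose order matches jointTransforms; modelSpaceTransform(ofJointNamed:in:) is an illustrative helper.

```swift
import Foundation
import RealityKit
import simd

// Sketch: compose a joint's local transform with all of its ancestors'
// local transforms to get its transform in the entity's model space.
func modelSpaceTransform(ofJointNamed path: String, in model: ModelEntity) -> float4x4? {
    guard let index = model.jointNames.firstIndex(of: path) else { return nil }
    var matrix = model.jointTransforms[index].matrix
    var parentPath = path
    // Walk up the path one "/" segment at a time, left-multiplying each parent.
    while let range = parentPath.range(of: "/", options: .backwards) {
        parentPath = String(parentPath[..<range.lowerBound])
        guard let parentIndex = model.jointNames.firstIndex(of: parentPath) else { break }
        matrix = model.jointTransforms[parentIndex].matrix * matrix
    }
    return matrix  // multiply by the entity's world transform for world space
}
```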
PhotogrammetrySession fails with internal errors 4011 / 4012 when using iOS Object Capture (Area Mode) images
Hi all, I’m running into an issue when trying to reconstruct a 3D model using PhotogrammetrySession on macOS from a set of images captured via the iOS Object Capture sample app, specifically in Area mode. When I attempt to create the model from these images (using the raw Images/ folder exported directly from the capture session), I get the following errors:

```
ERROR cv3dapi.pg: Internal error codes (2): 4011 4012
WARN cv3dapi.pg: Internal warning codes (1): 4507
Output error with code = -15
requestError: CoreOC.PhotogrammetrySession.Error.processError
```

I use the Images directory exported directly from Object Capture on my iPhone 12 Pro Max (which has LiDAR), with the app set to Area mode. Here is example metadata from one HEIC image in the sequence:

```
heif-info Images/00044.869568833.HEIC
MIME type: image/heic
main brand: heic
compatible brands: mif1, MiHE, MiPr, miaf, MiHB, heic
image: 3024x4032 (id=49), primary
  tiles: 6x8, tile size: 512x512
  colorspace: YCbCr, 4:2:0
  bit depth: 8
  thumbnail: 240x320
  color profile: nclx
  alpha channel: no
  depth channel: yes
    size: 192x256
    bits per pixel: 8
    z-near: 1.173828
    z-far: 2.552734
    d-min: undefined
    d-max: undefined
    representation: uniform Z
metadata:
  Exif: 960 bytes
  uri /tag:apple.com,2023:ObjectCapture#CameraTrackingState: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#CameraCalibrationData: 1015 bytes
  uri /tag:apple.com,2023:ObjectCapture#ObjectTransform: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#ObjectBoundingBox: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#RawFeaturePoints: 832 bytes
  uri /tag:apple.com,2023:ObjectCapture#PointCloudData: 23984 bytes
  uri /tag:apple.com,2023:ObjectCapture#BundleVersion: 5 bytes
  uri /tag:apple.com,2023:ObjectCapture#SegmentID: 4 bytes
  uri /tag:apple.com,2024:ObjectCapture#SessionUUID: 36 bytes
  uri /tag:apple.com,2024:ObjectCapture#CaptureMode: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#Feedback: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#WideToDepthCameraTransform: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#TemporalDepthPointClouds: 864026 bytes
transformations:
  angle (ccw): 270
region annotations: none
properties:
  camera intrinsic matrix:
    focal length: 2813.695557; 2813.695557
    principal point: 1522.338502; 2002.843018
    skew: 0.000000
  camera extrinsic matrix:
    rotation matrix:
      -0.695  0.344 -0.632
       0.007 -0.875 -0.483
      -0.719 -0.340  0.606
```

Questions:

- What do internal error codes 4011 and 4012 refer to?
- Is there something specific about Area mode captures that requires preprocessing before they're compatible with PhotogrammetrySession?
- Has anyone successfully reconstructed a model from an Area mode session using the stock Apple tools?

Note: I can provide the folder of images for debugging if that would help!
Replies: 1 · Boosts: 2 · Views: 871 · Jul ’25
Apple's Choice: USDZ over Other 3D File Formats like GLTF
Hello Dev Community, I've been thinking over Apple's preference for USDZ for AR and 3D content, especially given the widely used glTF. I'm keen to discuss and hear your insights on this choice. USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. In contrast, glTF is also a popular format with its own merits, like being an open standard and offering flexibility.

Here are some of my questions about the use of USDZ:

- Why did Apple choose USDZ over other 3D file formats like glTF?
- What benefits does USDZ bring to Apple's AR and 3D content ecosystem?
- Are there any limitations of USDZ compared to other file formats?
- Could factors like compatibility, security, or ease of integration have influenced Apple's decision?

I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
Replies: 4 · Boosts: 1 · Views: 6k · Oct ’24
How can I simultaneously apply the drag gesture to multiple entities?
I want to drag entityA and entityB independently. I've tried to separate the gestures by entity, but only the latest drag gesture is recognized:

```swift
RealityView { content, attachments in
    ...
}
.gesture(
    DragGesture()
        .targetedToEntity(entityA)
        .onChanged { value in ... }
)
.gesture(
    DragGesture()
        .targetedToEntity(entityB)
        .onChanged { value in ... }
)
```

I also tried simultaneously(with:), but that didn't work either; maybe I'm missing something:

```swift
.gesture(
    DragGesture()
        .targetedToEntity(entityA)
        .onChanged { value in ... }
        .simultaneously(with:
            DragGesture()
                .targetedToEntity(entityB)
                .onChanged { value in ... }
        )
)
```
Replies: 1 · Boosts: 1 · Views: 620 · Mar ’25
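One pattern that may help, sketched under the assumption that both entities live in the same RealityView: attach a single drag gesture with targetedToAnyEntity() and branch on value.entity, so each drag is handled independently of which entity it started on. DualDragView is an illustrative name.

```swift
import SwiftUI
import RealityKit

// Sketch: one gesture for all entities, dispatched by the targeted entity.
struct DualDragView: View {
    let entityA: Entity
    let entityB: Entity

    var body: some View {
        RealityView { content in
            content.add(entityA)
            content.add(entityB)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Branch on which entity this particular drag targeted.
                    if value.entity === entityA {
                        // update entityA's position
                    } else if value.entity === entityB {
                        // update entityB's position
                    }
                }
        )
    }
}
```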
Disable Foveation for ImmersiveSpace?
Does anyone know how I can disable foveation for an ImmersiveSpace? I'm aware that I could use a CompositorLayer and my own Metal rendering to control foveation, but I'm hoping that I can configure an existing/underlying LayerRenderer (or similar) to disable it for an immersive scene. Or if there's another approach I should be taking, any pointers are appreciated. Thank you!
Replies: 1 · Boosts: 1 · Views: 588 · Dec ’24
Game Center breaks RealityView world tracking
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5?

I'm in the process of upgrading my app to make use of the much-appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:

```swift
content.camera = .worldTracking
```

However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappeared when I removed the line:

```swift
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}
```

So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
Replies: 5 · Boosts: 1 · Views: 841 · Mar ’25
GCControllerDidConnect notification not received in visionOS 2.0
I am unable to get visionOS 2.0 (simulator) to receive the GCControllerDidConnect notification, and thus am unable to set up support for a gamepad. It works in visionOS 1.2, however. For visionOS 2.0 I've tried:

- adding the .handlesGameControllerEvents(matching: .gamepad) attribute to the view
- adding Supports Controller User Interaction to Info.plist
- adding Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does fire when the code is run in the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled.

Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

```swift
import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil, queue: nil) { _ in
                print("Handling GCControllerDidConnect notification")
            }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
```
Replies: 2 · Boosts: 1 · Views: 867 · Dec ’24
Are these RealityKit warnings a cause for concern?
I've created a minimal iOS RealityKit/ARKit app with only this view controller:

```swift
import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController {
    var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView = ARView(frame: view.bounds, cameraMode: .nonAR, automaticallyConfigureSession: true)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }
}
```

When I run this app in the iOS 18.1 simulator using Xcode 16.1 beta 2 (16B5014f), RealityKit logs the warnings included below to the console. I see similar warnings on the device, running iOS 18.0. Should I be concerned about any of these warnings? Please let me know if I should submit feedback reporting this issue. Thank you.

```
Could not locate file 'default-binaryarchive.metallib' in bundle.
Registering library (/Library/Developer/CoreSimulator/Volumes/iOS_22B5045f/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 18.1.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
Could not resolve material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' in bundle at '/Users/drew/Library/Developer/CoreSimulator/Devices/6BCC578D-5046-49E6-B149-390576C7241D/data/Containers/Bundle/Application/75E15C63-F1D2-487F-A057-EBA2D45582C3/ARViewFailApp.app'. Loading via asset path.
Could not resolve material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' in bundle at '/Users/drew/Library/Developer/CoreSimulator/Devices/6BCC578D-5046-49E6-B149-390576C7241D/data/Containers/Bundle/Application/75E15C63-F1D2-487F-A057-EBA2D45582C3/ARViewFailApp.app'. Loading via asset path.
Could not resolve material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' in bundle at '/Users/drew/Library/Developer/CoreSimulator/Devices/6BCC578D-5046-49E6-B149-390576C7241D/data/Containers/Bundle/Application/75E15C63-F1D2-487F-A057-EBA2D45582C3/ARViewFailApp.app'. Loading via asset path.
...
Compiler failed to build request
makeRenderPipelineState failed [reading from a rendertarget is not supported].
Pipeline for technique meshShadowCasterProgrammableBlending failed compilation!
```

(The same "Could not resolve material name ... Loading via asset path" warning also repeats, with the identical bundle path, for arInPlacePostProcessCombinedPermute0 through arInPlacePostProcessCombinedPermute12.)
Replies: 1 · Boosts: 1 · Views: 1.1k · Oct ’24