RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

RealityKit Documentation

Loading a lighting resource for a RealityKit view on macOS - image not embedded in app bundle
I'm trying to load a virtual skybox, different from the built-in default, for a simple macOS rendering of RealityKit content. Following the detail at https://developer.apple.com/documentation/realitykit/environmentresource, I created a folder called "light.skybox" with a single file in it ("prairie.hdr"), and then I try to load that and set it as the environment on the ARView when it's created:

let ar = ARView(frame: .zero)
do {
    let resource = try EnvironmentResource.load(named: "prairie")
    ar.environment.lighting.resource = resource
} catch {
    print("Unable to load resource: \(error)")
}

The loading always fails when I launch the sample app, reporting "Unable to load resource ...", and when I look in the app bundle, the resource included there as Contents/Resources/light.realityenv is an entirely different size, appearing to be the default lighting. I've tried making the folder "light.skybox" explicitly target the app bundle for inclusion, but I don't see it get embedded either with that toggled on or by default. Is there anything I need to do to get Xcode to process and include the lighting I'm providing? (This is inspired by https://stackoverflow.com/questions/77332150/realitykit-how-to-disable-default-lighting-in-nonar-arview, which shows an example for UIKit.)
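One way to narrow this down is to check what Xcode actually put into the built product. Here's a minimal diagnostic sketch; the names come from the post above, and the guess that a compiled skybox keeps either the image's or the folder's base name is mine:

import Foundation

// Hypothetical check: look for a compiled environment resource under either
// the HDR's base name ("prairie") or the .skybox folder's base name ("light").
for name in ["prairie", "light"] {
    if let url = Bundle.main.url(forResource: name, withExtension: "realityenv") {
        print("Found compiled environment: \(url.lastPathComponent)")
    } else {
        print("No '\(name).realityenv' in bundle – check the skybox folder's target membership")
    }
}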
Replies: 4 · Boosts: 0 · Views: 641 · Activity: Aug ’24
Rounding errors in last row of entity transform
I'm trying to clone an entity that's somewhere deeper in the hierarchy, and I want it together with a transform that takes its parents into account. Initially I wrote something that walks back through the parents, gets their transforms, and reduces them to a single one. Then I realized that what I'm doing is the same as .transformMatrix(relativeTo: rootEntity), but to validate that my version gives the same results I started printing both, and I noticed that the last row, instead of a stable (0, 0, 0, 1), is sometimes (0, 0, 0, 0.9999...). I know there are rounding errors, but I'd have assumed that 0 and 1 are "magical" in the floating-point world. The only way I can explain it is that .transformMatrix uses some fancy accelerated matrix multiplication that produces slightly bigger rounding errors. That would explain the slight differences in other fields between my version and the function call, but still, the 1 seems weird. Here's the function I'm using to compare:

func cloneFlattened(entity: Entity, withChildren recursive: Bool) -> Entity {
    let clone = entity.clone(recursive: recursive)
    var transforms = [entity.transform.matrix]
    var parent: Entity? = entity.parent
    var rootEntity: Entity = entity
    while parent != nil {
        rootEntity = parent!
        transforms.append(parent!.transform.matrix)
        parent = parent!.parent
    }
    if transforms.count > 1 {
        clone.transform.matrix = transforms.reversed().reduce(simd_diagonal_matrix(simd_float4(repeating: 1)), *)
        print("QWE CLONE FLATTENED: \(clone.transform.matrix)")
        print("QWE CLONE RELATIVE : \(entity.transformMatrix(relativeTo: rootEntity))")
    } else {
        print("QWE CLONE SINGLE   : \(clone.transform.matrix)")
    }
    return clone
}

Sometimes the last one is not 1:

QWE CLONE FLATTENED: [[0.00042261832, 0.0009063079, 0.0, 0.0], [-0.0009063079, 0.00042261832, 0.0, 0.0], [0.0, 0.0, 0.0010000002, 0.0], [-0.0013045187, -0.009559666, -0.04027118, 1.0]]
QWE CLONE RELATIVE : [[0.00042261826, 0.0009063076, -4.681872e-12, 0.0], [-0.0009063076, 0.00042261826, 3.580335e-12, 0.0], [3.4256328e-12, 1.8047965e-13, 0.0009999998, 0.0], [-0.0013045263, -0.009559661, -0.040271178, 0.9999997]]

Sometimes it is:

QWE CLONE FLATTENED: [[0.0009980977, -6.1623556e-05, -1.7382005e-06, 0.0], [-6.136851e-05, -0.0009958588, 6.707259e-05, 0.0], [-5.8642554e-06, -6.683835e-05, -0.0009977464, 0.0], [-1.761913e-06, -0.002, 0.0, 1.0]]
QWE CLONE RELATIVE : [[0.0009980979, -6.1623556e-05, -1.7382023e-06, 0.0], [-6.136855e-05, -0.0009958589, 6.707254e-05, 0.0], [-5.864262e-06, -6.6838256e-05, -0.0009977465, 0.0], [-1.758337e-06, -0.0019999966, -3.7252903e-09, 1.0]]

The 0s in the last row seem to be stable. This happens both for entities that are a few levels deep and for those that have only an anchor as a parent. So far I've never seen any value that would not be "technically a 1", but my hierarchies are not very deep, and it makes me wonder whether this rounding could get worse. Or is it just me doing something stupid? :)
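For what it's worth, 0.9999997 differs from 1 by only a few ulps of Float, so a tolerance-based comparison would treat both outputs as having an identity bottom row. A minimal sketch (the tolerance is an arbitrary choice of mine, not anything RealityKit prescribes):

import simd

// The mathematical bottom row of a column-major simd_float4x4 is the set of
// w components of its four columns; compare it to (0, 0, 0, 1) with a tolerance.
func hasIdentityBottomRow(_ m: simd_float4x4, tolerance: Float = 1e-5) -> Bool {
    let row = simd_float4(m.columns.0.w, m.columns.1.w, m.columns.2.w, m.columns.3.w)
    return simd_length(row - simd_float4(0, 0, 0, 1)) < tolerance
}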
Replies: 0 · Boosts: 0 · Views: 337 · Activity: Aug ’24
ModelEntity with skeletal pose is missing pins
Starting with Xcode beta 4+, any ModelEntity I load from a USDZ that contains a skeletal pose has no pins. The pins used to be accessible from a ModelEntity so you could use alignment with other pins. Per the documentation, any ModelEntity with a skeletal pose should have automatically generated pins contained on the entity.pins object itself: https://developer.apple.com/documentation/RealityKit/Entity/pins. Is this a bug in the later Xcode betas, or is the documentation wrong?
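A minimal repro sketch of what the documentation describes, assuming a USDZ with a skeletal pose named "model.usdz" in the app bundle (the file name is hypothetical):

import RealityKit

Task {
    // Per the Entity.pins documentation, a skeletal pose should produce
    // automatically generated pins on the loaded entity.
    let model = try await ModelEntity(named: "model")
    print("pin count: \(model.pins.count)") // non-zero before Xcode beta 4, 0 after, per the post
}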
Replies: 0 · Boosts: 0 · Views: 316 · Activity: Aug ’24
Taking snapshots from WKWebView on visionos
Hi, I'm trying to capture some images from a WKWebView on visionOS. I know there's a takeSnapshot() function that can get the image from the web page, but I wonder whether drawHierarchy() fails on WKWebView because of GPU-backed content. Is there any other method I can call to capture images correctly? Furthermore, as I put my web view into an immersive space, is there any way I can get the texture of this UIView attachment? Thank you
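For the 2D case, WKWebView's own snapshot API renders GPU-backed content that drawHierarchy(in:afterScreenUpdates:) can miss, so it's the first thing to try; a minimal sketch (the immersive-space texture question is separate and not covered here):

import WebKit

// Snapshot the web view's current bounds into a UIImage.
func captureWebView(_ webView: WKWebView) {
    let config = WKSnapshotConfiguration()
    config.rect = webView.bounds
    webView.takeSnapshot(with: config) { image, error in
        if let image {
            print("Captured snapshot of size \(image.size)")
        } else {
            print("Snapshot failed: \(String(describing: error))")
        }
    }
}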
Replies: 0 · Boosts: 0 · Views: 348 · Activity: Aug ’24
Elements in an attachment do not animate transitions.
UI:

Attachment(id: "tooptip") {
    if isRecording {
        TooltipView {
            HStack(spacing: 8) {
                Image(systemName: "waveform")
                    .font(.title)
                    .frame(minWidth: 100)
            }
        }
        .transition(.opacity.combined(with: .scale))
    }
}

Trigger:

Button("Toggle") {
    withAnimation {
        isRecording.toggle()
    }
}

The above code does not show the animation effect when running. When I use isRecording to drive an element in a common SwiftUI view, there is an animation effect.
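One hedged workaround to try: keep the attachment's view permanently in the hierarchy and animate its properties instead of inserting and removing it, since insertion/removal transitions may not animate inside RealityView attachments. A sketch based on the post's code:

Attachment(id: "tooptip") {
    TooltipView {
        HStack(spacing: 8) {
            Image(systemName: "waveform")
                .font(.title)
                .frame(minWidth: 100)
        }
    }
    // Animate visibility in place rather than relying on a transition.
    .opacity(isRecording ? 1 : 0)
    .scaleEffect(isRecording ? 1 : 0.8)
    .animation(.default, value: isRecording)
}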
Replies: 0 · Boosts: 0 · Views: 375 · Activity: Aug ’24
Pink Screen with VideoMaterial in ARKit
Hi everyone, I'm developing an ARKit app using RealityKit and encountering an issue where a video displayed on a 3D plane shows up as a pink screen instead of the actual video content. Here's a simplified version of my setup:

func createVideoScreen(video: AVPlayerItem, canvasWidth: Float, canvasHeight: Float, aspectRatio: Float, fitsWidth: Bool = true) -> ModelEntity {
    let width = fitsWidth ? canvasWidth : canvasHeight * aspectRatio
    let height = fitsWidth ? canvasWidth * (1 / aspectRatio) : canvasHeight
    let screenPlane = MeshResource.generatePlane(width: width, depth: height)
    let videoMaterial: Material = createVideoMaterial(videoItem: video)
    let videoScreenModel = ModelEntity(mesh: screenPlane, materials: [videoMaterial])
    return videoScreenModel
}

func createVideoMaterial(videoItem: AVPlayerItem) -> VideoMaterial {
    let player = AVPlayer(playerItem: videoItem)
    let videoMaterial = VideoMaterial(avPlayer: player)
    player.play()
    return videoMaterial
}

Despite following the standard process, the video plane renders pink. Has anyone encountered this before, or does anyone know what might be causing it? Thanks in advance!
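A pink texture often means the material has no valid frames yet. One hedged thing to rule out, as a sketch: start playback only once the AVPlayerItem reports .readyToPlay, and surface any decode error (this assumes the asset itself plays fine outside RealityKit):

import AVFoundation
import Combine

var statusCancellable: AnyCancellable?

func playWhenReady(_ player: AVPlayer) {
    // Observe the item's status via KVO and only start playback when ready.
    statusCancellable = player.currentItem?.publisher(for: \.status)
        .sink { status in
            switch status {
            case .readyToPlay:
                player.play()
            case .failed:
                print("Item failed: \(String(describing: player.currentItem?.error))")
            default:
                break
            }
        }
}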
Replies: 6 · Boosts: 1 · Views: 615 · Activity: Aug ’24
RealityKit 3D texture
In my Metal-based app, I ray-march a 3D texture. I'd like to use RealityKit instead of my own code. I see there is a LowLevelTexture (beta) where I could specify a 3D texture. However on the Metal side, there doesn't seem to be any way to access a 3D texture (realitykit::texture::textures::custom returns a texture2d). Any work-arounds? Could I even do something icky like cast the texture2d to a texture3d in MSL? (is that even possible?) Could I encode the 3d texture into an argument buffer and get that in somehow?
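For the CPU-side half, here's a sketch of requesting a 3D LowLevelTexture; the descriptor fields are as I recall them from the visionOS 2 beta SDK and may not match the final API, so treat all of this as an assumption. Whether the MSL side can see it as anything other than a texture2d is exactly the open question:

import RealityKit
import Metal

// Hypothetical 64×64×64 volume texture; field names per the beta Descriptor.
var desc = LowLevelTexture.Descriptor()
desc.textureType = .type3D
desc.pixelFormat = .r16Float
desc.width = 64
desc.height = 64
desc.depth = 64
desc.textureUsage = [.shaderRead, .shaderWrite]
let volume = try LowLevelTexture(descriptor: desc)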
Replies: 1 · Boosts: 0 · Views: 529 · Activity: Aug ’24
Can an SDF be rendered using RealityKit?
I'm trying to ray-march an SDF inside a RealityKit surface shader. For the SDF primitive to correctly render with other primitives, the depth of the fragment needs to be set according to the ray-surface intersection point. Is there a way to do that within a RealityKit surface shader? It seems the only values I can set are within surface::surface_properties. If not, can an SDF still be rendered in RealityKit using ray-marching?
Replies: 1 · Boosts: 1 · Views: 561 · Activity: Sep ’24
VisionOS crashes Loading Entities from Disk Bundles with EXC_BREAKPOINT
Summary: I’m working on a visionOS project where I need to dynamically load a .bundle file containing RealityKit content from the app’s Application Support directory. The .bundle is saved to disk after being downloaded or retrieved as an On-Demand Resource (ODR).

Sample project with the issue: Github repo. Run the target test-odr to use the local bundle and reproduce the crash.

Overall problem:

Setup: add a .bundle named RealityKitContent_RealityKitContent.bundle to the app’s resources. This bundle contains a Reality file with two USDAs: “Immersive” and “Scene”.
Save to disk: save the bundle to the Application Support directory, ensuring that the file is correctly copied and saved.
Load the bundle: load the bundle from the saved URL using Bundle(url: bundleURL) to initialize the Bundle object.
Load entity from bundle: load a specific entity (“Scene”) from the bundle. When trying to load the entity using let storedEntity = try await Entity(named: "Scene", in: bundle), the app crashes with an EXC_BREAKPOINT error.

contentsOf method issue: if I use the Entity.load(contentsOf: realityFileURL, withName: entityName) method, it always loads the first root entity found (in this case, “Immersive”) rather than “Scene”, even when specifying the entity name. This is why I want to use the Bundle to load entities by name more precisely.

Issue: the crash consistently occurs on the Entity(named: "Scene", in: bundle) line. I have verified that the bundle exists and is accessible at the specified path, and that it contains the expected .reality file with multiple entities (“Immersive” and “Scene”). The error code I get is EXC_BREAKPOINT (code=1, subcode=0x1d135d4d0).

What I’ve tried:
• Ensured the bundle is properly saved and accessible.
• Checked that the bundle is initialized correctly from the URL.
• Tested loading the entity using the contentsOf method, which works fine but always loads the “Immersive” entity, ignoring the specified name. Hence, I want to use the Bundle-based approach to load multiple USDA entities selectively.

Question: has anyone faced a similar issue, or does anyone know why loading entities using Entity(named:in:) from a disk-based bundle causes this crash? Any advice on how to debug or resolve this, especially for managing multiple root entities in a .reality file, would be greatly appreciated.
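For reference, a condensed sketch of the failing path as described; the Application Support lookup mirrors the post's description, though the exact paths in the linked repo may differ:

import Foundation
import RealityKit

func loadSceneFromDiskBundle() async throws -> Entity {
    let support = try FileManager.default.url(for: .applicationSupportDirectory,
                                              in: .userDomainMask,
                                              appropriateFor: nil,
                                              create: true)
    let bundleURL = support.appendingPathComponent("RealityKitContent_RealityKitContent.bundle")
    guard let bundle = Bundle(url: bundleURL) else {
        throw CocoaError(.fileNoSuchFile)
    }
    // Crashes with EXC_BREAKPOINT per the post, while
    // Entity.load(contentsOf:withName:) succeeds but always returns "Immersive".
    return try await Entity(named: "Scene", in: bundle)
}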
Replies: 1 · Boosts: 0 · Views: 483 · Activity: Sep ’24
Getting vector3 or vector4 per-vertex data into Shader Graph Editor
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Shader Graph Editor. From my tests, it seems that:

the "Geometric Property" node does not give access to 4-channel data,
"Geometric Property (vector3)" does not give me access to custom properties besides the ones defined in the MaterialX core,
the "Texture Coordinates" node has a vector4f mode (yay!), but according to the MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.

My assumption so far is that I must be missing some "magic" – for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g., the convention would be primvars:st and primvars:st1 and so on.) Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file I'm happy to do so!
Replies: 0 · Boosts: 0 · Views: 389 · Activity: Sep ’24
sceneUnderstanding occlusion does not work with blendMode = alpha in iOS 18
If I import a USDZ model with blendMode set to alpha, occlusion does not work on iPhone with iOS 18. How should transparent materials and occlusion be used properly in the new RealityKit? Additionally, new artifacts have appeared when transparent objects overlap each other: the transparency results do not blend; instead, parts of the model simply do not render.
Replies: 6 · Boosts: 0 · Views: 798 · Activity: Sep ’24
Exception while creating a PhotogrammetrySession object with PhotogrammetrySamples
Hi, I am trying to use a PhotogrammetrySample input sequence according to the WWDC video. I modified only the following code in the sample:

var images = try FileManager.default.contentsOfDirectory(at: captureFolderManager.imagesFolder,
                                                         includingPropertiesForKeys: nil,
                                                         options: .skipsHiddenFiles)

let inputSequence = images.lazy.compactMap { file in
    return self.loadSampleAndMask(file: file)
}

// Next line causes the exception
photogrammetrySession = try PhotogrammetrySession(
    input: inputSequence,
    configuration: configuration)

private func loadSampleAndMask(file: URL) -> PhotogrammetrySample? {
    do {
        var sample = try PhotogrammetrySample(contentsOf: file)
        return sample
    } catch {
        return nil
    }
}

I am getting the following runtime error: Thread 1: EXC_BREAKPOINT (code=1, subcode=0x240d88904)
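One hedged thing to check: contentsOfDirectory(at:) returns every file in the folder, and an ObjectCapture session folder typically holds more than just photos, so non-image URLs may be reaching PhotogrammetrySample. A sketch that filters to HEIC images first (the extension is an assumption about how the captures were saved):

// Keep only photo files before building the lazy sample sequence.
let imageURLs = try FileManager.default
    .contentsOfDirectory(at: captureFolderManager.imagesFolder,
                         includingPropertiesForKeys: nil,
                         options: .skipsHiddenFiles)
    .filter { $0.pathExtension.lowercased() == "heic" }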
Replies: 0 · Boosts: 0 · Views: 338 · Activity: Sep ’24
zNear for RealityView to prevent an entity from hiding an attachment SwiftUI button in visionOS
I have an app with an immersive space view, and it needs the user to have a button at the bottom with a fixed place in front of the user's head, like a dashboard in a game. But when the user gets too close to any 3D object in the view, the object can cover the button and make it inaccessible, which would likely prevent the app from being approved in App Store Connect. I worked before in SceneKit, where the camera has zNear and zFar values that decide when to hide a 3D model if it comes too close or gets too far, and I wonder if there is something like that in RealityView / RealityKit 4. Here is my code, with screenshots following:

import SwiftUI
import RealityKit

struct ContentView: View {
    @State var myHead: Entity = {
        let headAnchor = AnchorEntity(.head)
        headAnchor.position = [-0.02, -0.023, -0.24]
        return headAnchor
    }()

    @State var clicked = false

    var body: some View {
        RealityView { content, attachments in
            // create a 3d box
            let mainBox = ModelEntity(mesh: .generateBox(size: [0.1, 0.1, 0.1]))
            mainBox.position = [0, 1.6, -0.3]
            content.add(mainBox)
            content.add(myHead)
            guard let attachmentEntity = attachments.entity(for: "Dashboard") else { return }
            myHead.addChild(attachmentEntity)
        } attachments: {
            // SwiftUI inside immersive view
            Attachment(id: "Dashboard") {
                VStack {
                    Spacer()
                        .frame(height: 300)
                    Button(action: {
                        goClicked()
                    }) {
                        Text(clicked ? "⏸️" : "▶️")
                            .frame(maxWidth: 48, maxHeight: 48, alignment: .center)
                            .font(.extraLargeTitle)
                    }
                    .buttonStyle(.plain)
                }
            }
        }
    }

    func goClicked() {
        clicked.toggle()
    }
}
Replies: 2 · Boosts: 0 · Views: 705 · Activity: Sep ’24
USDZ in Keynote
Apple's own USDZ files from their website do not display properly in either Preview or Keynote when imported. They do display properly in Reality Converter, but with this error: "Invalid USD shader node in USD file. Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source." I tried importing other models from external sources and they work without any issue at all. Is there any potential fix or workaround for this? Thanks in advance.
Replies: 1 · Boosts: 0 · Views: 422 · Activity: Sep ’24
GCControllerDidConnect notification not received in VisionOS 2.0
I am unable to get visionOS 2.0 (simulator) to receive the GCControllerDidConnect notification, and thus am unable to set up support for a gamepad. However, it works in visionOS 1.2.

For visionOS 2.0 I've tried adding:

the .handlesGameControllerEvents(matching: .gamepad) attribute to the view
Supports Controller User Interaction to Info.plist
Supported game controller types -> Extended Gamepad to Info.plist

...but the notification still doesn't fire. It does when the code is run in the visionOS 1.2 simulator; both simulators have the Send Game Controller To Device option enabled. Here is the example code. It's based on the Xcode project template; the only files updated were ImmersiveView.swift and Info.plist, as detailed above:

import SwiftUI
import GameController
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }

            NotificationCenter.default.addObserver(
                forName: NSNotification.Name.GCControllerDidConnect,
                object: nil,
                queue: nil) { _ in
                    print("Handling GCControllerDidConnect notification")
                }
        }
        .modify {
            if #available(visionOS 2.0, *) {
                $0.handlesGameControllerEvents(matching: .gamepad)
            } else {
                $0
            }
        }
    }
}

extension View {
    func modify<T: View>(@ViewBuilder _ modifier: (Self) -> T) -> some View {
        return modifier(self)
    }
}
Replies: 2 · Boosts: 1 · Views: 507 · Activity: Sep ’24
Change zFar in RealityView
I am working on a RealityView on iOS 18 that needs to render objects farther away than 1,000 meters. My app is used outside in open areas. I am using RealityView with content.camera = .spatialTracking, and I have turned off occlusion, collisions, and plane detection with a simple scene understanding like this:

let configuration = SpatialTrackingSession.Configuration(
    tracking: [.camera],
    sceneUnderstanding: [], // We don't want occlusions, collisions, etc.
    camera: .back)
let session = SpatialTrackingSession()
if let unavailable = await session.run(configuration) {
    print("unavailable \(unavailable)")
}

Is this possible with spatialTracking with RealityView or with ARView? I have my RealityView working on visionOS inside an ImmersiveSpace. On visionOS I don't have the camera as a passthrough; it is a virtual scene with world tracking set up via the WorldTrackingProvider, and I can render objects farther away than 1,000 meters. I would like to do the same thing on iOS. I don't need to have the camera pass through, but I do need to have the world tracking. I see that PerspectiveCameraComponent lets me set the near and far clipping planes, but I don't see how I can use that camera with world tracking.
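For reference, attaching the camera component with an extended far plane is straightforward in a virtual (non-passthrough) scene; whether it can coexist with .spatialTracking is the open question. A minimal sketch:

import RealityKit

// Virtual camera with a 10 km far clipping plane, on a hypothetical entity.
let cameraEntity = Entity()
cameraEntity.components.set(
    PerspectiveCameraComponent(near: 0.1, far: 10_000, fieldOfViewInDegrees: 60)
)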
Replies: 0 · Boosts: 0 · Views: 405 · Activity: Oct ’24
0.5 zoom on RealityKit back camera
My app uses RealityKit with an ARView with world tracking and a scene-reconstruction mesh, and it's working well. As I understand it, the default camera selection in ARKit is the ultra-wide camera. However, in the Camera app, there is a 0.5 option to increase the field of view further. Is there any way to enable this 0.5x option in code? Or any control over FOV using RealityKit and a mesh? Thanks!
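One way to see what's available, as a hedged sketch: ARKit lists its camera and format options on the configuration class, so enumerating them shows whether a wider field of view is exposed at all (this only inspects formats; it doesn't add a 0.5x mode):

import ARKit

// Inspect the video formats ARKit offers for world tracking; if a wider
// format exists, assign it to configuration.videoFormat before running.
for format in ARWorldTrackingConfiguration.supportedVideoFormats {
    print(format.captureDeviceType.rawValue,
          format.imageResolution,
          "\(format.framesPerSecond) fps")
}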
Replies: 2 · Boosts: 0 · Views: 504 · Activity: Oct ’24
Are these RealityKit warnings a cause for concern?
I've created a minimal iOS RealityKit/ARKit app with only this view controller:

import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController {
    var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView = ARView(frame: view.bounds, cameraMode: .nonAR, automaticallyConfigureSession: true)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
    }
}

When I run this app in the iOS 18.1 simulator using Xcode 16.1 beta 2 (16B5014f), RealityKit logs the warnings included below to the console. I see similar warnings on the device, running iOS 18.0. Should I be concerned about any of these warnings? Please let me know if I should submit feedback reporting this issue. Thank you.

Could not locate file 'default-binaryarchive.metallib' in bundle.

Registering library (/Library/Developer/CoreSimulator/Volumes/iOS_22B5045f/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 18.1.simruntime/Contents/Resources/RuntimeRoot/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.

Could not resolve material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' in bundle at '/Users/drew/Library/Developer/CoreSimulator/Devices/6BCC578D-5046-49E6-B149-390576C7241D/data/Containers/Bundle/Application/75E15C63-F1D2-487F-A057-EBA2D45582C3/ARViewFailApp.app'. Loading via asset path.

The same "Could not resolve material name … Loading via asset path." warning is logged, with the same bundle path, for each of these materials: arKitPassthrough.rematerial, arSegmentationComposite.rematerial, and arInPlacePostProcessCombinedPermute0.rematerial through arInPlacePostProcessCombinedPermute12.rematerial.

...

Compiler failed to build request
makeRenderPipelineState failed [reading from a rendertarget is not supported].
Pipeline for technique meshShadowCasterProgrammableBlending failed compilation!
Replies: 1 · Boosts: 0 · Views: 518 · Activity: Oct ’24
RealityKit crashes randomly in the simulator but not on the device
I'm writing a RealityKit/ARKit app that runs on iOS. Starting with Xcode 16.0 beta 1, and at least through Xcode 16.1 beta 2 (16B5014f), in the iOS 18 simulator, my app randomly crashes in about 20% of app sessions the first time it attempts to present an ARView. The crashes seem to occur at multiple points within RealityKit and Metal. Below, I've included screenshots of the call stacks of the crashes, which occur as a result of both EXC_BAD_ACCESS errors and assertion failures within RealityKit. The app only crashes in the iOS 18 simulator: it does not crash in the iOS 17 simulator or earlier, and it does not crash on a device running iOS 18. Before I investigate further, I'd appreciate it if an Apple engineer could give me a sense of whether these crashes are most likely the result of known issues within RealityKit and/or the simulator, or whether, in your opinion, there are probably bugs in my app's code. I've submitted several feedback issues in the past, and I'd love to submit this issue too, but I expect I would spend many hours attempting to create a repro case in a sample app. Understandably, I'd rather not spend this time if an Apple engineer could tell me this is a known issue, for example. Thank you.
Replies: 5 · Boosts: 0 · Views: 564 · Activity: Oct ’24