Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.


How to selectively create screen-space meshes in RealityKit AR mode using the new OrthographicCameraComponent?
I'd like to create meshes in RealityKit (AR mode on iPad) in screen space, i.e. for UI. I noticed a lot of useful new functionality in RealityKit for the next OS versions, including the OrthographicCameraComponent: https://developer.apple.com/documentation/realitykit/orthographiccameracomponent?changes=_3 I think this would help, but I need AR world tracking as well as a regular perspective camera to work with the 3D elements.

Firstly, can I attach a camera selectively to a few entities, just for those entities? This could be the orthographic camera. Secondly, can I make those entities always render in front, in screen space? (They'd need to follow the camera.)

If I can't have multiple cameras, what can be done instead? Is it actually better to use a completely different view / API for layering on top of RealityKit? I would much rather keep everything in RealityKit, however, for simplicity.
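For reference, a minimal sketch of a single-camera workaround, assuming a plain ARView session: parent the UI entities to an AnchorEntity targeting the camera so they follow the view at a fixed distance (view-locked content, not true orthographic screen-space rendering):

    import RealityKit
    import UIKit

    func addViewLockedPanel(to arView: ARView) {
        // Anchor that tracks the AR camera; children follow the view.
        let cameraAnchor = AnchorEntity(.camera)
        let panel = ModelEntity(
            mesh: .generatePlane(width: 0.2, height: 0.1),
            materials: [UnlitMaterial(color: .white)]
        )
        // Half a meter straight ahead of the camera, along its -Z axis.
        panel.position = SIMD3<Float>(0, 0, -0.5)
        cameraAnchor.addChild(panel)
        arView.scene.addAnchor(cameraAnchor)
    }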
Replies: 0 · Boosts: 0 · Views: 26 · Activity: 4h
Using a scene from Reality Composer Pro in an iOS app?
I am trying to establish a workflow for using Reality Composer Pro to make scenes; at the moment I am grey-boxing a scene using primitives. I have set up a cube with a texture material and a simple spin animation. I am confused as to what I should be loading. I have created what I think is a scene asset in the package for the Reality Composer Pro project. Here is a code snippet:

    struct ContentView: View {
        var body: some View {
            RealityView { content in
                do {
                    let scene = try await ModelEntity(named: "HOF")
                    content.add(scene)
                } catch {
                    print("Error loading scene: \(error.localizedDescription)")
                }
            }
        }
    }

Here is the project layout in Reality Composer Pro:
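For comparison, Apple's Reality Composer Pro templates load the scene as a plain Entity from the package's generated bundle rather than as a ModelEntity from the main bundle. A sketch, assuming the default package name "RealityKitContent" and its generated realityKitContentBundle constant (substitute your own package and scene names):

    import SwiftUI
    import RealityKit
    import RealityKitContent  // the Swift package generated by Reality Composer Pro

    struct ContentView: View {
        var body: some View {
            RealityView { content in
                // "HOF" is assumed to be the scene name inside the package.
                if let scene = try? await Entity(named: "HOF", in: realityKitContentBundle) {
                    content.add(scene)
                }
            }
        }
    }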
Replies: 1 · Boosts: 0 · Views: 44 · Activity: 10h
SceneKit or RealityKit for Non-AR Game Development
Hi everyone, I'm choosing a framework for developing a game that doesn't involve augmented reality (AR), and I'm unsure whether to use SceneKit or RealityKit. I would like to hear from Apple engineers on this matter. Which of these frameworks is better suited for creating non-AR games? Additionally, is it possible to disable AR in RealityKit using the updated RealityView? Thanks in advance for your insights and recommendations!
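For context, RealityKit can already run without an AR session on iOS via ARView's .nonAR camera mode plus a virtual camera; a minimal sketch using existing API (this does not speak to the new RealityView):

    import RealityKit

    func makeNonARView() -> ARView {
        // .nonAR means no ARSession and no camera feed.
        let arView = ARView(frame: .zero,
                            cameraMode: .nonAR,
                            automaticallyConfigureSession: false)

        // A virtual perspective camera replaces the device camera.
        let camera = PerspectiveCamera()
        camera.position = [0, 1, 3]

        let cameraAnchor = AnchorEntity(world: .zero)
        cameraAnchor.addChild(camera)
        arView.scene.addAnchor(cameraAnchor)
        return arView
    }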
Replies: 1 · Boosts: 0 · Views: 76 · Activity: 12h
Reality Composer Pro inline documentation comments
The WWDC24 video "Build a spatial drawing app with RealityKit" (https://developer.apple.com/wwdc24/10104) includes, at 12:04, a slide showing a Reality Composer Pro shader graph that features wonderful inline documentation comment boxes. Are shader graph inline comments a new feature that Reality Composer Pro supports? This would be extraordinarily useful, as complex shader graphs can be challenging to decipher. If so, how are inline shader graph comments created in Reality Composer Pro?
Replies: 1 · Boosts: 0 · Views: 43 · Activity: 13h
copyFromBuffer offset and size working even when not multiple of 4
Hi, The copyFromBuffer documentation states that on macOS, sourceOffset, destinationOffset, and size "needs to be a multiple of 4, but can be any value in iOS and tvOS". However, I have noticed that, at least on my M2 Max, this limitation does not seem to exist: there are no warnings and the copy works correctly regardless of the offset value. I'm curious to know if this is something that should still be avoided. Is the multiple-of-4 limitation specific to non-Apple-Silicon devices, so that the note can be ignored on Apple Silicon? I ask because I am a contributor to Metal.jl, and recently noticed that our tests pass even when copying using copyWithBuffer with offsets and sizes that are not multiples of 4. If that could cause issues or correctness problems, we would need to fix it. Thank you. Christian
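Until that is clarified, a defensive sketch for staying within the documented requirement regardless of hardware (the helper below is mine, not Metal API; when rounding a size up, check the buffer lengths first):

    import Metal

    // Round a byte offset or size up to the next multiple of 4.
    func alignedUp(_ value: Int, to alignment: Int = 4) -> Int {
        (value + alignment - 1) / alignment * alignment
    }

    // Usage with MTLBlitCommandEncoder.copy(from:sourceOffset:to:destinationOffset:size:):
    // blit.copy(from: source, sourceOffset: alignedUp(sourceOffset),
    //           to: destination, destinationOffset: alignedUp(destinationOffset),
    //           size: alignedUp(byteCount))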
Replies: 0 · Boosts: 0 · Views: 39 · Activity: 15h
Unable to create PhysicsJoint using Entity's Geometric Pin
Hello, I'm trying to attach one entity to another via the new PhysicsFixedJoint. I have a USDZ that contains a skeletal pose, which exposes the joints as pins, as desired. However, when I access a pin, it returns a GeometricPin instead of the EntityGeometricPin you would expect, and I can't use the returned GeometricPin to create the joint. Am I missing something? Shouldn't accessing the entity's pins collection return EntityGeometricPins instead of GeometricPins? Here is the code sample:

    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: untitledBundle) {
                content.add(scene)

                let attack = try! Entity.load(named: "Attack01_SingleSword")
                let anchor = scene.findEntity(named: "Root")
                anchor?.addChild(attack)

                let sword = try! Entity.load(named: "OHS08_Sword")
                anchor?.addChild(sword)

                if let swordEntity = findModelComponentEntity(entity: sword) {
                    let swordPin = swordEntity.pins.set(
                        named: "test",
                        position: SIMD3<Float>.zero
                    )
                    if let attackEntity = findModelComponentEntity(entity: attack) {
                        // This returns a GeometricPin instead of the EntityGeometricPin
                        // that the "pins" collection contains.
                        let attackPin = attackEntity.pins["root/pelvis/spine_01/spine_02/spine_03/clavicle_r/upperarm_r/lowerarm_r/hand_r/weapon_r"]!
                        let joint = PhysicsFixedJoint(
                            pin0: swordPin,
                            pin1: attackPin // Compile error: not an EntityGeometricPin
                        )
                        try! joint.addToSimulation()
                    }
                }
            }
        }
    }
Replies: 0 · Boosts: 0 · Views: 69 · Activity: 1d
Metal and Swift Concurrency
Hi, Introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way for Metal-bound tasks in order to gain maximum performance? To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best solution to bridge async tasks with Metal work? Thanks!
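One sketch of that continuation bridge, resuming from the command buffer's completion handler so no cooperative-pool thread blocks in waitUntilCompleted (a minimal sketch, not an official recommendation):

    import Metal

    // Await GPU completion without blocking a cooperative-pool thread.
    func commitAndWait(_ commandBuffer: MTLCommandBuffer) async {
        await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
            // Metal invokes this handler on its own thread when the GPU finishes.
            commandBuffer.addCompletedHandler { _ in
                continuation.resume()
            }
            commandBuffer.commit()
        }
    }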
Replies: 4 · Boosts: 0 · Views: 106 · Activity: 1d
USDZ with vertex color
Hello, I have a USDC file with vertex color (WITHOUT textures), and it displays perfectly in Preview. If I package it in a zip (without compression) and rename the resulting file to .usdz, I can view it without any issues on Apple Vision Pro and Mac. However, if I send it to an iPhone, the vertex color does not display. Is there anything else I need to do besides packaging the uncompressed USDC in a ZIP? Thank you very much.
Replies: 0 · Boosts: 0 · Views: 61 · Activity: 1d
Bug? Xcode 16 macOS 15 SDK on macOS 14.5 causes Metal Shader Colors to be Wrong
I've been upgrading Xcode consistently for years and have never seen Metal shaders behave differently from one version to another until now. On macOS 14.5 with the Xcode 16 beta, several color outputs suddenly turn out completely black where there should be color. All validation is on and nothing seems to be wrong (and hasn't been since maybe Xcode version 11). I've attached two screenshots with the exact same settings: the first is the normal color scheme, the second is in Xcode 16.

Normal:

Buggy, with black + transparent colors (so it seems like colors are either overflowing or are all 0s):

Before I file a bug report or a code-level support request, may I have some thoughts on how to debug this? The only clue I have is that I'm using bindless to multiply color texture samples with color values from my vertex struct. But it still fails even if I use hard-coded values for the texture samples, meaning somehow the color values are not being sent to the shader correctly? This is the most stable part of my rendering pipeline, so I'm surprised if the issue is there. Thank you.
Replies: 1 · Boosts: 0 · Views: 171 · Activity: 3d
Video with audio in a VideoMaterial(avPlayer:) on a ModelEntity does not play properly when ARConfiguration.providesAudioData = true?
    import SwiftUI
    import RealityKit
    import ARKit
    import AVFoundation

    struct ContentView: View {
        var body: some View {
            ARViewContainer().edgesIgnoringSafeArea(.all)
        }
    }

    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            arView.session.delegate = context.coordinator

            let worldConfig = ARWorldTrackingConfiguration()
            worldConfig.planeDetection = .horizontal
            // worldConfig.providesAudioData = true // open here -----> Error:
            arView.session.run(worldConfig)

            addTestEntity(arView: arView)
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}

        func makeCoordinator() -> Coordinator { Coordinator() }

        class Coordinator: NSObject, ARSessionDelegate, ARSessionObserver {
            func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
            }
        }
    }

    func addTestEntity(arView: ARView) {
        let mesh = MeshResource.generatePlane(width: 0.5, depth: 0.35)
        guard let url = Bundle.main.url(forResource: "videoplayback", withExtension: "mp4") else {
            return
        }
        let player = AVPlayer(url: url)
        let videoMaterial = VideoMaterial(avPlayer: player)
        let model = ModelEntity(mesh: mesh, materials: [videoMaterial])
        model.transform.translation.y = 0.05
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.children.append(model)
        player.play()
        arView.scene.anchors.append(anchor)
    }

Error:

    failed to update STS state: Error Domain=com.apple.STS-N Code=1396929899 "Error: failed to signal change" UserInfo={NSLocalizedDescription=Error: failed to signal change}
    failed to update STS state: Error Domain=com.apple.STS-N Code=1396929899 "Error: failed to signal change" UserInfo={NSLocalizedDescription=Error: failed to signal change}
    ......
    ARSession <0x125d88040>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x302922dc0 {Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}

iOS 17.5.1, Xcode 15.4
Replies: 2 · Boosts: 0 · Views: 150 · Activity: 3d
Multiplayer testing with the new TabletopKit
Hey, Just watched and started investigating the new TabletopKit framework, which looks fantastic. I'm looking at how multiplayer can be tested. I found info about testing on multiple real devices here: https://developer.apple.com/documentation/tabletopkit/tabletopkitsample#Start-a-multiplayer-game-on-devices I wanted to ask about options for testing multiplayer on simulators, maybe with a simulator plus a single real device. Unfortunately, having multiple Vision Pros is unrealistic for indie development, so I hope there are ways to do it.
Replies: 1 · Boosts: 0 · Views: 216 · Activity: 3d
Issue with Hand Occlusion in a Metal CompositorLayer
I have an issue with hand occlusion in immersive mode. I have an entry view for the app and a Metal CompositorLayer (which is the immersive volume) where I have set .upperLimbVisibility(Visibility.hidden). The problem is that when I dismiss the entry view, sometimes it hides the hands and sometimes it doesn't (randomly).

    @main
    struct AVPainterApp: App {
        @State var hand: Int32 = 0

        var body: some Scene {
            WindowGroup() {
                ContentView(hand: $hand)
            }
            .windowResizability(.contentSize)

            ImmersiveSpace(id: "ImmersiveSpace") {
                CompositorLayer(configuration: MetalLayerConfiguration()) { layerRenderer in
                    SpatialSceneRun(layerRenderer, hand)
                }
            }
            .upperLimbVisibility(Visibility.hidden)
            .immersionStyle(selection: .constant(.full), in: .full)
        }
    }
Replies: 1 · Boosts: 0 · Views: 63 · Activity: 4d
Sample Project for WWDC24 10092 Metal with Passthrough?
It's great that we'll be able to use Metal custom renderers in passthrough mode on visionOS: https://developer.apple.com/wwdc24/10092 This is a lot of complicated setup, however. It's also unclear how occlusion and custom algorithms / raytracing will work in tandem with scene understanding. May we have a project template and/or sample, preferably with the C API and not just Swift? This would be much appreciated and helpful to everyone who wants this setup. I'd like to see the whole process. Thank you for introducing this feature!
Replies: 2 · Boosts: 1 · Views: 159 · Activity: 4d
State-of-the-Art 3D (no AR) on macOS using RealityKit?
What is the current recommendation for creating high-quality 3D content? The context is a hobbyist, specialised CAD app for macOS (with an iPadOS companion) that is mostly 2D but also offers a 3D visualization option (currently OpenGL). Somewhere down the line there might be an AR view, but at the moment, certainly for macOS, it's purely generated 3D visualization, all rendered content. So, starting with a rewrite of the 3D visualization in 2024 targeting macOS Sequoia/iPadOS 18, is RealityKit the suggested way forward? Cheers, Jay
Replies: 4 · Boosts: 0 · Views: 221 · Activity: 4d
Metal 3.2 device memory coherency
I am seeking clarification regarding the new device-coherent memory (buffers and textures) in Metal 3.2. Do I understand the documentation correctly that this feature allows threads from different threadgroups to update data in device memory cooperatively? The documentation mentions, "[results of operations] are visible to other threads across thread groups if you synchronize them properly." How does one do proper synchronization? From what I understand, Metal has no device-scoped barriers.
Replies: 1 · Boosts: 0 · Views: 171 · Activity: 4d
Best Practice to Add Objects at Eye Level in RealityKit
I would think it would be common practice that when adding a new entity into your RealityView scene, it appears in front of the user, who then places it in the scene. Imagine a puzzle piece appearing in front of you that you drag to your puzzle board; if you move around your puzzle board, you'd expect that wherever you are, the new piece appears in front of you. That seems applicable to a lot of applications. I can add a new entity using the head anchor, but as we all know that transform is the identity, so reparenting the entity to something (e.g. the puzzle board) won't work. I've been trying to use world positioning and query pose, which helps, but I'm stumped as to how to get the new entity to appear in front of me no matter which way I turn. Looking for suggestions and guidance on this.
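A minimal sketch of the query-pose approach, assuming a running ARKitSession with a WorldTrackingProvider (visionOS): take the device transform, step 0.5 m along its forward (-Z) axis, and position the entity in world space so it can then be reparented wherever needed:

    import ARKit
    import QuartzCore
    import RealityKit

    func placeInFrontOfUser(_ entity: Entity, worldTracking: WorldTrackingProvider) {
        guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
        let m = device.originFromAnchorTransform
        let headPosition = SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
        let forward = -SIMD3<Float>(m.columns.2.x, m.columns.2.y, m.columns.2.z) // -Z is "ahead"
        // World-space placement, independent of the entity's current parent.
        entity.setPosition(headPosition + 0.5 * forward, relativeTo: nil)
    }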
Replies: 2 · Boosts: 0 · Views: 96 · Activity: 4d
OpenGL ES support on Apple Silicon Simulators
Hey folks, I have a legacy game that runs OpenGL ES, and it no longer works on simulators running on Apple Silicon, i.e. iPhone 15 Pro or the 13" iPads. And yes, I'm also running on Apple Silicon (M1 Max). The apps work fine on the actual devices, but the simulator crashes on any glDrawElements call with a stack that looks like the following:

I have not yet seen an announcement about this not working, but I've seen mention of other apps dropping GL support (https://github.com/maplibre/maplibre-native/issues/2351). Can anyone shed some light? I'm obviously going to try to fix it, or find a recent sample app from which to start to see what might be up. Or move to Metal, but I hadn't bargained for that level of effort at the moment ;) Any suggestions appreciated!
Replies: 4 · Boosts: 0 · Views: 168 · Activity: 5d