Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics

Post · Replies · Boosts · Views · Activity

Vision OS Torus Collision Shape
Hi, I have a .usdz asset of a torus / hoop shape, and I would like to pass another RealityKit entity (a cube-like object) through it without touching the torus in visionOS, similar to how a basketball goes through a hoop. Whenever I pass the cube through, I get a collision notification, even if the objects are not actually colliding. I want to be able to detect when the objects are actually colliding versus when the cube passes cleanly through the opening in the torus. I am using entity.generateCollisionShapes(recursive: true) to generate the collision shapes. I believe the issue is that the collision shape of the torus is a rectangular box, not the actual shape of the torus. I know the collision shape is a rectangular box because I can see it in the visionOS simulator by enabling "Collision Shapes". Does anyone know how to programmatically create a torus collision shape in SwiftUI / RealityKit for visionOS? Follow-up: can I create a torus in RealityKit, so I don't even have to use a .usdz asset?
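A common workaround, since generateCollisionShapes(recursive:) produces a convex approximation that fills the hole, is to build the hoop's collision out of several small shapes arranged in a ring. A minimal sketch, assuming the torus lies in the XZ plane; the helper name and its parameters are illustrative, not RealityKit API:

```swift
import RealityKit
import simd

// Approximate the hoop's collision with a ring of small boxes so the
// opening stays hollow. `makeTorusCollision`, `majorRadius`, `minorRadius`,
// and `segmentCount` are made-up names for this sketch.
func makeTorusCollision(majorRadius: Float = 0.3,
                        minorRadius: Float = 0.05,
                        segmentCount: Int = 16) -> CollisionComponent {
    var shapes: [ShapeResource] = []
    let segmentLength = 2 * Float.pi * majorRadius / Float(segmentCount)
    for i in 0..<segmentCount {
        let angle = 2 * Float.pi * Float(i) / Float(segmentCount)
        // A box per arc segment, rotated to follow the ring and pushed
        // out to the major radius.
        let box = ShapeResource.generateBox(width: segmentLength,
                                            height: minorRadius * 2,
                                            depth: minorRadius * 2)
        let rotation = simd_quatf(angle: angle + .pi / 2, axis: [0, 1, 0])
        let offset = SIMD3<Float>(cos(angle) * majorRadius, 0, sin(angle) * majorRadius)
        shapes.append(box.offsetBy(rotation: rotation, translation: offset))
    }
    return CollisionComponent(shapes: shapes)
}
```

Setting this with hoopEntity.components.set(makeTorusCollision()) in place of generateCollisionShapes should let the cube pass through the opening without reporting a contact.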
3
1
1.1k
Dec ’23
Retrieve AnchorEntity Location relative to Scene?
I want to place a ModelEntity at an AnchorEntity's location, but not as a child of the AnchorEntity (I want to be able to raycast to it and have collisions work). I've placed an AnchorEntity in my scene like so:

```swift
AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: [2.0, 2.0]), trackingMode: .continuous)
```

In my RealityView update closure, I print out this entity's position relative to nil like so:

```swift
wallAnchor.position(relativeTo: nil)
```

Unfortunately, this position doesn't make sense. It's very close to zero, even though the anchor appears several meters away. I believe this is because AnchorEntities have their own self-contained coordinate spaces that are independent from the scene's coordinate space, and it is reporting its position relative to its own coordinate space. How can I bridge the gap between these two? WorldAnchor has an originFromAnchorTransform property that helps with this, but I'm not seeing something similar for AnchorEntity. Thank you.
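One possible bridge on visionOS is to skip AnchorEntity and use ARKit's plane detection directly, since PlaneAnchor (like WorldAnchor) exposes originFromAnchorTransform in scene space. A sketch, assuming world-sensing authorization has been granted; error handling omitted:

```swift
import ARKit
import RealityKit

func placeMarkersOnWalls(under root: Entity) async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.vertical])
    try await session.run([planes])

    for await update in planes.anchorUpdates where update.anchor.classification == .wall {
        // originFromAnchorTransform is relative to the scene origin, so a
        // plain (non-anchored) entity placed with it can be raycast against
        // and participates in collisions like any other scene entity.
        let marker = ModelEntity(mesh: .generateBox(size: 0.1))
        marker.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        root.addChild(marker)
    }
}
```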
0
0
586
Dec ’23
How do I change the background color of a visionOS immersive space?
In visionOS, once an immersive space is opened, the background color is solid black. How do I change this? I just need to set a static color without any shading, but I can't find any documentation or examples on how to do this, and the template that comes with Xcode 15.1 beta 3 doesn't change the background color. I've searched around, but everything I can find points back to MTKView.clearColor, which I can't use when drawing into an immersive space, since immersive spaces on visionOS use Compositor Services rather than MTKView or CAMetalLayer for drawing 3D content.
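A sketch of one way to do this with Compositor Services: the background is whatever you clear the frame's color texture to, so set a clear color on the render pass descriptor you build from each drawable. Assumes a full immersion style; the rest of the render loop is elided:

```swift
import CompositorServices
import Metal

func makeRenderPass(for drawable: LayerRenderer.Drawable) -> MTLRenderPassDescriptor {
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawable.colorTextures[0]
    pass.colorAttachments[0].loadAction = .clear
    // A static dark blue instead of the default black.
    pass.colorAttachments[0].clearColor = MTLClearColor(red: 0.05, green: 0.10,
                                                        blue: 0.30, alpha: 1.0)
    pass.colorAttachments[0].storeAction = .store
    return pass
}
```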
2
0
1.1k
Dec ’23
iOS 16 & 17 touch input stutter on ProMotion devices. Workaround?
The touch input stutter issue that has existed since iOS 16 on devices with ProMotion displays has not been fixed yet. I filed a bug report in July, but there hasn't been any progress in months. I see the problem in all the games I've tried. My game is fast-paced, so the stutters are quite obvious, and I receive a lot of complaining emails. My game ran smoothly on ProMotion devices with iOS 15. Is there a known workaround? I see other developers having the same issue, but I can't find any solutions. Other threads about this issue: "iPhone 14 Pro stuttering in most games when using touch controls" and "FPS drops when tapping the screen on iPhone 13 Pro Max".
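Not a confirmed fix, but a mitigation some developers report for ProMotion stutter: drive rendering from a CADisplayLink that explicitly requests the full 120 Hz range, so the display doesn't ramp down and re-sync around touches. This also requires the CADisableMinimumFrameDurationOnPhone key set to YES in Info.plist:

```swift
import UIKit

func makeDisplayLink(target: Any, selector: Selector) -> CADisplayLink {
    let link = CADisplayLink(target: target, selector: selector)
    // Pin the requested range to 120 Hz on ProMotion hardware.
    link.preferredFrameRateRange = CAFrameRateRange(minimum: 80,
                                                    maximum: 120,
                                                    preferred: 120)
    link.add(to: .main, forMode: .common)
    return link
}
```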
0
0
832
Dec ’23
Camera Index Switch (RealityKit) Shader Graph node in Reality Composer Pro doesn't work
In my project, I want to use the new ShaderGraphMaterial to do stereoscopic rendering, and I noticed there is a node called Camera Index Switch that can do this. But when I tried it, I found that:
1. It can only output an Integer-type value. When I change it to a Float value, it changes back again; I don't know if this is a bug.
2. When I test this node with an IF node, its output is weird. Where zero should be output, it is black; but when I switch to the IF node, it is grey, which is neither 0 nor 1 (my IF node outputs 1 for TRUE and 0 for FALSE).
I want to ask if this is a bug, and whether this is the correct way to do stereoscopic rendering.
3
0
1.5k
Dec ’23
MTLVertexDescriptor's offset -- everybody's doing it wrong
In pretty much every Metal tutorial out there, people use MTLVertexDescriptor like this: they create a struct like

```swift
struct Vertex {
    var position: float3
    var color: float3
}
```

then a vertex array and buffer:

```swift
let vertices: [Vertex] = ...
guard let vertexBuffer = device.makeBuffer(bytes: vertices,
                                           length: MemoryLayout<Vertex>.stride * vertices.count,
                                           options: []) else { ... }
```

This is all good; we have a buffer with interleaved position and color data. The problem is, when creating a vertex descriptor, they use MemoryLayout<float3>.stride as the offset for the second attribute:

```swift
let vertexDescriptor = MTLVertexDescriptor()
vertexDescriptor.attributes[0].format = .float3
vertexDescriptor.attributes[0].offset = 0
vertexDescriptor.attributes[0].bufferIndex = 0
vertexDescriptor.attributes[1].format = .float3
vertexDescriptor.attributes[1].offset = MemoryLayout<float3>.stride // <-- here!
vertexDescriptor.attributes[1].bufferIndex = 0
vertexDescriptor.layouts[0].stride = MemoryLayout<Vertex>.stride
```

This does not look correct to me. The code happens to work only because the stride of SIMD3<Float> (a.k.a. float3) matches the alignment of the fields in this particular struct. But if we have something like this:

```swift
struct Vertex {
    var attr0: Float
    var attr1: Float
    var attr2: SIMD3<Float>
}
```

then the naive approach of using stride won't work. Because of padding, attr2 does not start right after the two floats, at offset 2 * MemoryLayout<Float>.stride, but at offset 16. So it seems to me that the only correct and robust way to set the vertex descriptor's offset is to use offset(of:), like this:

```swift
vertexDescriptor.attributes[2].offset = MemoryLayout<Vertex>.offset(of: \.attr2)!
```

Yet, I'm not able to find a single code example that does this. Am I missing something, or is everybody else just being careless with their offsets?
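To make the padding concrete, here is a sketch you can run in a playground (64-bit platform assumed):

```swift
import simd

struct Vertex {
    var attr0: Float
    var attr1: Float
    var attr2: SIMD3<Float>
}

print(MemoryLayout<Float>.stride * 2)            // 8  -- the naive offset
print(MemoryLayout<Vertex>.offset(of: \.attr2)!) // 16 -- the actual offset
print(MemoryLayout<Vertex>.stride)               // 32 -- includes tail padding
```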
0
0
354
Dec ’23
Seeking Clarification on UsdPrimvarReader Node Functionality
Hello fellow developers, I am currently exploring the functionality of the UsdPrimvarReader node in the Shader Graph Editor and would appreciate some clarification on its operational principles. Despite my efforts to understand it, I find myself in need of some guidance. Specifically, I would appreciate insights into how the UsdPrimvarReader node should ideally operate, what data should be specified in the Varname field, and which primvars can be extracted from a USD file. Additionally, I am curious about the correct code representation of a primvar in a USD file, to ensure it can be referenced successfully. If anyone could share their expertise or point me to relevant documentation, I would be immensely grateful. Thank you in advance for your time and consideration. I look forward to any insights or recommendations you may have.
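For reference, here is what a per-vertex primvar looks like in .usda text form (a minimal, hypothetical prim; a real mesh would also need points and face data). As far as I understand, the Varname field of UsdPrimvarReader should match the name after the primvars: prefix, i.e. "displayColor" here, and the node's output type should match the primvar's type (color3f):

```
#usda 1.0

def Mesh "Plane"
{
    color3f[] primvars:displayColor = [(1, 0, 0), (0, 1, 0), (0, 0, 1)] (
        interpolation = "vertex"
    )
}
```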
1
0
757
Dec ’23
VisionOS IBL (ImageBasedLighting) BW only or Coloured? File formats? Shadows?
I have an immersive environment with a skybox that uses a PNG image inside a sphere. I added an IBL, but I am not sure what the best format / prep method is for the IBL image. I have tried several different images for my IBL, and all give very different vibes from what I have in Blender. My question is: how can I create an IBL that's closest to Blender's Cycles rendering engine? However, that's a rather difficult question to answer, so I want to ask some smaller questions first.

1. Does the IBL need to be black-and-white, or will colour work? From my tests: colour works just as well. But why does Apple only show use of BW ones? Should we use BW?
2. What is the best file format for IBL? Any pros/cons? Or should we just test each format and check visually? From my tests: PNG, OpenEXR (.exr), and Radiance HDR (.hdr) all work. But which format is recommended?
3. Will IBL on visionOS create shadows for us? In Blender an HDRI gives shadows. From my tests: no, IBL does not provide shadows on your loaded environment/meshes. Is "shadow baking" the only solution for the time being?
4. Looking at a scene in Blender which uses an HDRI as global lighting, how can we best "prep" the IBL image to give light closest to Blender's Cycles rendering engine? From my tests I tried (as shown below):
A) Make a render of just the Blender HDRI (without meshes) via a 360-degree panoramic camera. → Used as IBL, it makes everything too bright.
B) Make a render of the entire Blender scene via a 360-degree panoramic camera. → Used as IBL, it makes everything washed out and yellowish.
C) Use the Sunlight.png from the sample project. → With this IBL the scene is too dark.
D) Use the SystemIBL.exr from another sample project. → With this IBL the scene looks very flat and not realistic at all.

[Screenshots of each IBL A–D, the corresponding simulator results, and the target Blender Cycles render appeared here.]

Can anyone help me with questions 1–4 above? It would give me some insight into how to create immersive environments with realistic lighting & shadows. : ) Much appreciated! — Luca
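On the setup side, here is a minimal sketch of wiring up an IBL in RealityKit on visionOS, assuming an image-based lighting resource named "SkyIBL" in the app bundle; matching the Blender look then comes down to iterating on the image and the intensityExponent. Per the tests above, colour images work here just as well as black-and-white ones:

```swift
import RealityKit

func applyIBL(to environmentRoot: Entity) async throws {
    // "SkyIBL" is an assumed name; any equirectangular PNG/EXR/HDR that
    // Xcode converts to an EnvironmentResource should work.
    let resource = try await EnvironmentResource(named: "SkyIBL")

    let lightSource = Entity()
    lightSource.components.set(ImageBasedLightComponent(source: .single(resource),
                                                        intensityExponent: 1.0))
    environmentRoot.addChild(lightSource)

    // Entities only receive the IBL if they opt in with a receiver component.
    environmentRoot.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: lightSource))
}
```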
3
1
1.6k
Dec ’23
RealityView Attachments
I have a RealityView and I want to add an Entity with an Attachment. Assume I have a viewModel managing my entities, and addEntityGesture() will add a new Entity under the rootEntity.

```swift
RealityView { content, attachments in
    // Load initial content
    content.add(viewModel.rootEntity)
} update: { updateContent, updateAttachments in
    //
} attachments: {
    //
}
.gesture(addEntityGesture())
```

I know that we can create attachments in the attachments closure and add those attachments as entities in the make closure. But what if I want to add an entity with an attachment on the fly?
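One pattern that works for this: declare the attachment up front with a stable identifier, then fetch it with attachments.entity(for:) in the update closure and parent it to the entity created on the fly. A sketch, reusing the post's names (ViewModel, addEntityGesture(), and a newEntity property are assumptions):

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
    @State var viewModel = ViewModel()   // assumed observable model

    var body: some View {
        RealityView { content, attachments in
            content.add(viewModel.rootEntity)
        } update: { content, attachments in
            // Runs when observed state changes, e.g. after the gesture sets
            // viewModel.newEntity. Fetch the pre-declared attachment entity
            // and parent it to the entity that was added on the fly.
            if let newEntity = viewModel.newEntity,
               let panel = attachments.entity(for: "panel") {
                panel.position = [0, 0.15, 0]   // float above the entity
                newEntity.addChild(panel)
            }
        } attachments: {
            Attachment(id: "panel") {
                Text("Hello")
            }
        }
        .gesture(addEntityGesture())
    }
}
```

The limitation is that attachments still have to be declared in the attachments builder; what you control dynamically is when and where their entities are parented.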
0
0
695
Dec ’23
SceneKit crash on iOS 17
I've found that SceneKit crashes very frequently on iOS 17, on all devices. Here is the crash trace:

```
Crashed: com.apple.scenekit.renderingQueue.SCNView0x15878c630
0  SceneKit  0x3eee4   C3DMatrix4x4GetAffineTransforms + 344
1  SceneKit  0x30208   C3DAdjustZRangeOfProjectionInfos + 140
2  SceneKit  0x2c0a90  C3DCullingContextSetupPointOfViewMatrices + 700
```

The attachment (Crash Log) has the whole log. Does anybody know how to fix it?
1
0
611
Dec ’23
Strange error when using memory barrier in MTLRenderCommandEncoder with imageblocks
I'm attempting to put two mesh draws into a MTLRenderCommandEncoder with a memory barrier between them. I'm also using image blocks in the fragment functions of the two pipelines. Something like this:

```objc
[encoder setRenderPipelineState:pipeline1];
[encoder drawMeshThreadgroups:threadgroupsPerGrid
  threadsPerObjectThreadgroup:threadsPerObjectThreadgroup
    threadsPerMeshThreadgroup:threadsPerMeshThreadgroup];
[encoder memoryBarrierWithScope:MTLBarrierScopeBuffers
                    afterStages:MTLRenderStageMesh
                   beforeStages:MTLRenderStageObject];
[encoder setRenderPipelineState:pipeline2];
[encoder drawMeshThreadgroups:threadgroupsPerGrid
  threadsPerObjectThreadgroup:threadsPerObjectThreadgroup
    threadsPerMeshThreadgroup:threadsPerMeshThreadgroup];
```

I get a strange error:

```
Execution of the command buffer was aborted due to an error during execution.
Too many unique viewports, scissor rectangles or depth-bias values to support
memoryless render pass attachments.
(0000000c:kIOGPUCommandBufferCallbackErrorExceededHardwareLimit)
```

I'm not using multiple viewports or scissor rectangles, and I'm not using depth bias. I don't have memoryless attachments, though as mentioned, I am using imageblocks. Without the memory barrier I don't get the error. Using memoryBarrierWithResources rather than memoryBarrierWithScope doesn't help either. This is on an M2 Max running macOS 14.3 Beta (23D5033f). I can't tell if I've encountered a real limitation or a Metal driver bug.
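For reference, the resource-scoped variant mentioned above looks like this in Swift; a sketch only, with countBuffer standing in for whatever the first draw actually writes:

```swift
import Metal

func insertMeshToObjectBarrier(encoder: MTLRenderCommandEncoder,
                               countBuffer: MTLBuffer) {
    // Narrower than MTLBarrierScopeBuffers: only orders access to the
    // listed resources between the mesh stage of the first draw and the
    // object stage of the second.
    encoder.memoryBarrier(resources: [countBuffer],
                          after: .mesh,
                          before: .object)
}
```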
1
1
384
Dec ’23
Multiple root-level objects error [USDZ/Reality Composer Pro]
Hey everyone, I'm running into an issue where my USDZ model does not show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter. See the attached image. It's strange, because the USDZ model appears fine in Previews. But once it is brought into RCP, I receive a "multiple root level objects" pop-up and the model does not appear. I'm not sure how to resolve this multiple-root-level issue. If anyone can point me in the right direction, any feedback is much appreciated. Thank you!
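For reference, RCP expects a single top-level prim. The usual fix is to parent every mesh under one root (in Blender, an Empty; or by editing the USD) and make it the default prim. A hypothetical .usda skeleton of the fixed structure (prim names made up):

```
#usda 1.0
(
    defaultPrim = "Root"
)

def Xform "Root"
{
    def Mesh "Body" {}
    def Mesh "Wheels" {}
}
```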
1
0
1.1k
Dec ’23
Apple Core Package for Unity
I have installed the Apple GameKit plug-in, and when it is time to build the packages I go to the Unity Package Manager, find the .tgz file, and open it, but the Apple.Core package does not appear in the list inside Package Manager. Using Unity 2022.2.2.

```
[08:53:18] [Package Manager Window] Error adding package: file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz. Unable to add package [file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]
[08:56:07] [Package Manager Window] Error adding package: file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz. Unable to add package [file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]
[08:59:31] [Package Manager Window] Error adding package: file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz. Unable to add package [file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]
[09:03:11] [Package Manager Window] Error adding package: file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz. Unable to add package [file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]
[Package Manager Window] Error adding package: file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz. Unable to add package [file:../Assets/External Packages/unityplugins-main/Build/fbx20133_converter_mac.pkg.tgz]: The file [/var/folders/39/drmm3vg15tsggs7nt7z360lh0000gn/T/.tmp-9365-Q2GPWMjILnkO/package.json] cannot be found
```
1
0
749
Dec ’23
Question about per-line shader profile statistics in Xcode
Hi, I am using Xcode frame capture to profile my app's shaders, and I have some questions about the per-line profile statistics. Please see the two screenshots of my compute shader first (Begin and End screenshots appeared here). The first image is the head of the shader; the profile shows that the shader entry function takes 72.44% of the time. And at the end of the shader, the profile shows that the closing brace '}' takes 60.45%. Here are my questions:
1. How should the profile data properly be understood?
2. What is the real performance picture of this shader?
3. Why doesn't the shader entry function take 100% of the time?
Can someone help answer these? Thanks! Boson
0
1
613
Dec ’23
Is CAMetalDisplayLink expected to support Metal Performance HUD?
I've been attempting to use the new CAMetalDisplayLink to simplify the code needed to sync my rendering with the display across Apple platforms. One thing I noticed since moving to CAMetalDisplayLink is that the Metal Performance HUD, which I had previously been using to analyze the total memory used by my app (among other things), suddenly no longer appears. The issue can be reproduced with the Frame Pacing sample from WWDC23. Does anyone from Apple know if this is expected behavior, or have an idea of how to get this to work properly? I've filed FB13495684 for official review.
3
0
768
Dec ’23
When is a `simdgroup_barrier()` required?
Metal offers both threadgroup_barrier() and simdgroup_barrier(). I understand the need for threadgroup barriers: it would not be possible to rely on cooperation between threads in a threadgroup without them, as different threads can execute on different SIMD partitions at different times. But I don't really get simdgroup_barrier(). It was my impression that all threads in a simdgroup execute in lockstep, and thus if one thread in a simdgroup makes progress, all other active threads in the simdgroup are also guaranteed to make progress. If this were not the case, we'd need to insert simdgroup barriers pretty much any time we read or write any storage or perform SIMD-scoped operations. It doesn't seem like Apple uses simdgroup_barrier() in any of their sample code. In fact, it seems like it's a no-op on current Apple Silicon hardware. Is there a situation where I need to use simdgroup barriers, or is this a superfluous operation? P.S. It seems that Apple engineers are as confused by this as I am, see https://github.com/ml-explore/mlx/blame/1f6ab6a556045961c639735efceebbee7cce814d/mlx/backend/metal/kernels/scan.metal#L355
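As far as I can tell from the spec, simdgroup_barrier() is about memory ordering rather than execution sync: when lanes of one SIMD group exchange data through threadgroup or device memory, it guarantees the writes are visible before the reads, whether or not the hardware happens to run the lanes in lockstep. An illustrative MSL kernel (not Apple sample code; a SIMD width of 32 is assumed):

```metal
#include <metal_stdlib>
using namespace metal;

kernel void exchange(device float *out          [[buffer(0)]],
                     threadgroup float *scratch [[threadgroup(0)]],
                     uint lane                  [[thread_index_in_simdgroup]],
                     uint tid                   [[thread_position_in_grid]])
{
    // Publish a value, then read a neighbouring lane's value through
    // threadgroup memory. The barrier orders the write before the read
    // within the SIMD group; without a lockstep guarantee, the read on
    // the last line could otherwise observe stale data.
    scratch[lane] = float(tid);
    simdgroup_barrier(mem_flags::mem_threadgroup);
    out[tid] = scratch[(lane + 1) % 32];
}
```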
1
0
849
Dec ’23
App renders as a black screen
Hello, I am creating a cross-platform application with Flutter. The problem is that when I launch my application on my MacBook, only a black page is displayed. This is a recurring problem with all Flutter applications on this Mac. When I debug my application, this is what appears in the console:

```
Error submitting command buffer.
2023-12-27 15:58:12.468 tranzic[2333:21044] Error Domain=MTLCommandBufferErrorDomain Code=4 "Ignored (for causing prior/excessive GPU errors) (00000004:kIOAccelCommandBufferCallbackErrorSubmissionsIgnored)" UserInfo={NSLocalizedDescription=Ignored (for causing prior/excessive GPU errors) (00000004:kIOAccelCommandBufferCallbackErrorSubmissionsIgnored), MTLCommandBufferEncoderInfoErrorKey=(
    "<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>",
    "<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>",
    "<errorState: MTLCommandEncoderErrorStatePending, label: (null), debugSignposts: (null)>"
)}
Error submitting command buffer.
2023-12-27 15:58:18.455 tranzic[2333:21044] (same MTLCommandBufferErrorDomain Code=4 error repeated)
```

I have a MacBook Pro (mid-2012) running macOS Monterey, and here's an issue I opened on the Flutter repo for more details: https://github.com/flutter/flutter/issues/137859
2
2
636
Dec ’23