Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

ARMeshAnchor Data with RealityView
I want to use SwiftUI and RealityView to get AR scene-understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using an ARSession (unless there is another way). However, in previous iOS 18 builds there was this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:) , which let a SpatialTrackingSession and a custom ARSession work together. In the latest iOS and Xcode this function has been removed from the RealityKit framework, but it is still in the documentation.

I also want to get ARFaceAnchor data, which I still cannot get without an ARSession. The closest I can get is by using:

```swift
let target = AnchoringComponent.Target.face
let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted)
entity = Entity()
entity!.components.set(anchoringComponent)
```

But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view.

Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session callbacks (didUpdate and didAdd) only run for a few frames before getting interrupted. And if I completely remove SpatialTrackingConfiguration and just run the ARSession, there is still a valid tracked entity for the AnchoringComponent.Target.face component, as long as the ARSession configuration is an ARWorldTrackingConfiguration with face tracking enabled, and I still get updated facial data each frame. But the ARSession didUpdate and didAdd functions stop being called past the first few frames.

Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data but no camera feed (as expected), with or without SpatialTrackingConfiguration.

My overarching question is: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system, and track them live, while also using SwiftUI? A GitHub repo with a sample project can be found here: https://github.com/bpate75/RealityViewTesting
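For comparison, here is a minimal standalone sketch (my own, not from the sample repo) of the plain-ARSession route the post describes: an ARWorldTrackingConfiguration with scene reconstruction enabled, receiving ARMeshAnchor updates through ARSessionDelegate. It deliberately leaves out the RealityView/SpatialTrackingSession integration that is the actual open question.

```swift
import ARKit

// Plain ARSession + delegate, independent of RealityView/SpatialTrackingSession.
final class MeshSessionDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        print("Added \(meshAnchors.count) mesh anchors")
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        print("Updated \(meshAnchors.count) mesh anchors")
    }
}

let session = ARSession()
let meshDelegate = MeshSessionDelegate()
session.delegate = meshDelegate

let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh   // requires a LiDAR-equipped device
}
session.run(configuration)
```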
Replies: 1 · Boosts: 1 · Views: 442 · Activity: Feb ’25
Xcode Metal geometry inspector uses wrong NDC space?
When inspecting geometry in Xcode's Metal debugger, I noticed that the "frustum box" it shows didn't make sense. Since Metal uses a [0, 1] depth range in NDC space, I would expect a vertex that is projected to z = 0 to lie on the front clipping plane of the frustum shown in the geometry inspector. That is not the case, however: a vertex with NDC z = 0 is shown halfway inside the frustum. Vertices with NDC z less than 0 are correctly culled during rendering, while the geometry inspector's frustum shows the vertex as still inside it. The image shows vertices that are visually in the middle of the frustum on the z axis, yet their output positions show they are projected to z = 0. How is this possible, unless there's a bug in the geometry inspector?
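To make the expectation concrete, here is a small self-contained check (my own sketch, not from the post) using a standard right-handed perspective matrix built for Metal's [0, 1] depth range: a point on the near plane projects to NDC z ≈ 0 and a point on the far plane to NDC z ≈ 1, which is why a vertex at z = 0 should sit on the front clipping plane.

```swift
import simd

// Right-handed perspective projection targeting Metal's [0, 1] NDC depth range.
func perspectiveRH_ZO(fovyRadians fovy: Float, aspect: Float, nearZ: Float, farZ: Float) -> simd_float4x4 {
    let y = 1 / tan(fovy * 0.5)
    let x = y / aspect
    let z = farZ / (nearZ - farZ)
    return simd_float4x4(columns: (
        SIMD4<Float>(x, 0, 0, 0),
        SIMD4<Float>(0, y, 0, 0),
        SIMD4<Float>(0, 0, z, -1),
        SIMD4<Float>(0, 0, z * nearZ, 0)
    ))
}

let proj = perspectiveRH_ZO(fovyRadians: .pi / 3, aspect: 16 / 9, nearZ: 0.1, farZ: 100)

let onNearPlane = proj * SIMD4<Float>(0, 0, -0.1, 1)   // camera looks down -Z
print(onNearPlane.z / onNearPlane.w)                   // ≈ 0 → front clipping plane

let onFarPlane = proj * SIMD4<Float>(0, 0, -100, 1)
print(onFarPlane.z / onFarPlane.w)                     // ≈ 1 → back clipping plane
```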
Replies: 1 · Boosts: 0 · Views: 527 · Activity: Feb ’25
alternative for CustomShader in visionOS
Following the documentation at https://developer.apple.com/documentation/realitykit/custommaterial , it's simple to use a shader for materials and get uniforms and per-vertex parameters. However, CustomMaterial is not available on visionOS. Is there any alternative in this case? I want to write the shader that fills the material myself. (I have shader experience from the web and am familiar with fragment shaders.)
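The usual suggestion for visionOS is ShaderGraphMaterial: author the material as a shader graph in Reality Composer Pro and drive its inputs from code. Below is a minimal sketch, assuming a material at /Root/MyMaterial inside a Scene.usda in the RealityKitContent bundle and a graph input named Intensity (all of these names are placeholders, not confirmed by the post).

```swift
import RealityKit
import RealityKitContent

// Load a Shader Graph material authored in Reality Composer Pro and apply it to an entity.
func applyCustomShader(to entity: ModelEntity) async throws {
    var material = try await ShaderGraphMaterial(
        named: "/Root/MyMaterial",          // hypothetical material path in the .usda scene
        from: "Scene.usda",                 // hypothetical scene file in the content bundle
        in: realityKitContentBundle
    )
    try material.setParameter(name: "Intensity", value: .float(0.8))  // hypothetical graph input
    entity.model?.materials = [material]
}
```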
Replies: 1 · Boosts: 0 · Views: 446 · Activity: Feb ’25
USDZ File Crashes in QuickLook on iPad 9th Gen (iPadOS 18.3) – Urgent Help Needed
Hi everyone, I’m experiencing a critical issue with USDZ files created in Reality Composer on an iPad 9th Generation (iPadOS 18.3). The files work perfectly on iPads from the 10th Generation onwards and on iPad Pros. However, on older devices like the iPad 9th Generation and older iPhones, QuickLook (file preview) crashes when opening them.

This is a major issue because these USDZ files are part of an exhibition where artworks are extended with AR elements via a web page. If some visitors cannot view the 3D content, it significantly impacts the experience.

What’s puzzling is that two years ago, we exported USDZ files from Reality Composer, made them available via a website, and they worked flawlessly on all devices, including older iPads and iPhones. Now, with the latest iPadOS, they consistently crash on older devices.

Has anyone encountered a similar issue? Are there known limitations with QuickLook on older devices, or is there a way to optimize the USDZ files to prevent crashes? Could this be related to changes in iPadOS or RealityKit? Any advice or workaround would be greatly appreciated! Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 583 · Activity: Feb ’25
Will metal-cpp-extensions headers for AppKit and MetalKit be maintained?
Hi,

The metal-cpp distribution appears to only contain headers for Foundation and QuartzCore. The LearnMetalCPP download [1] provides a ZIP with a metal-cpp-extensions directory containing AppKit.hpp and MetalKit.hpp headers.

First question: are these headers distributed anywhere else more publicly? Without these headers only the renderer can be fully written in C++ as far as I can tell, i.e. no complete C++ NSApplication.

Second question: will these headers, if needed, be maintained (e.g. updated and/or extended) by Apple alongside metal-cpp?

[1] https://developer.apple.com/metal/cpp/

Thank you and regards.
Replies: 2 · Boosts: 0 · Views: 2k · Activity: Feb ’25
CGEvent Not Working
I am trying to simulate a paste command and it seems to not want to paste. It worked at one point with the same code and now is causing issues. My code looks like this:

```swift
func simulatePaste() {
    guard let source = CGEventSource(stateID: .hidSystemState) else {
        print("Failed to create event source")
        return
    }
    let keyDown = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: true)
    let keyUp = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: false)
    keyDown?.flags = .maskCommand
    keyUp?.flags = .maskCommand
    keyDown?.post(tap: .cgAnnotatedSessionEventTap)
    keyUp?.post(tap: .cgAnnotatedSessionEventTap)
    print("Simulated Cmd + V")
}
```

I know that there are some issues around permissions, so in my Info.plist I have this:

```xml
<string>NSApplication</string>
<key>NSAppleEventsUsageDescription</key>
<string>This app requires permission to send keyboard input for pasting from the clipboard.</string>
```

I have also disabled the sandbox. It does ask me if I want to give the app permissions, but after approving it, it still doesn't paste.
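One thing that may be worth ruling out (a hedged guess, not a confirmed fix): posting keyboard CGEvents to other apps is gated by Accessibility permission rather than by NSAppleEventsUsageDescription, so checking whether the process is trusted before posting can narrow it down. The helper below is a sketch that wraps the standard AXIsProcessTrustedWithOptions check and then calls the post's simulatePaste().

```swift
import ApplicationServices

// Check (and optionally prompt for) Accessibility trust, which is what gates
// synthetic keyboard events delivered to other applications.
func hasAccessibilityPermission(promptIfNeeded: Bool = true) -> Bool {
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: promptIfNeeded] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}

if hasAccessibilityPermission() {
    simulatePaste()
} else {
    print("Grant Accessibility access in System Settings > Privacy & Security > Accessibility")
}
```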
Replies: 1 · Boosts: 0 · Views: 430 · Activity: Feb ’25
SCNCamera SSAO on visionOS
Hi, Looking at the documentation for screenSpaceAmbientOcclusionIntensity, I noticed that it says it is supported on visionOS 1.0+: https://developer.apple.com/documentation/scenekit/scncamera/screenspaceambientocclusionintensity Could someone enlighten me as to how that would work? As far as I know, we don't use an SCNCamera on visionOS. So what's the idea here? Can we activate SSAO on visionOS?
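For reference, this is how the property is normally used where an SCNCamera does exist (iOS/macOS); whether and how the same applies on visionOS is exactly the open question here. A minimal sketch:

```swift
import SceneKit

let scene = SCNScene()
let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.camera?.screenSpaceAmbientOcclusionIntensity = 1.0  // > 0 enables SSAO
cameraNode.camera?.screenSpaceAmbientOcclusionRadius = 0.5     // sampling radius in scene units
scene.rootNode.addChildNode(cameraNode)
```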
Replies: 1 · Boosts: 0 · Views: 406 · Activity: Feb ’25
Transferring Apps with iCloud KVS
Hi All! I'm being asked to migrate an app which utilizes iCloud KVS (Key Value Storage). This ability is a new-ish feature, and the documentation about it is sparse [1]. Honestly, the entire documentation about the new iCloud transfer functionality seems to be missing, and the same goes for Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail. Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the entitlements file has the team ID in the key-value store identifier; is that fine?

```xml
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>
```

Can someone please share their experience? Thank you!

[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
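Not an answer to the transfer question itself, but here is a small sketch of the KVS calls one could use to sanity-check that the container backed by that entitlement still resolves after the transfer (the key name is just an example):

```swift
import Foundation

// Write a value and listen for external changes to the iCloud key-value store.
let store = NSUbiquitousKeyValueStore.default
store.set(true, forKey: "hasCompletedOnboarding")
store.synchronize()

NotificationCenter.default.addObserver(
    forName: NSUbiquitousKeyValueStore.didChangeExternallyNotification,
    object: store,
    queue: .main
) { notification in
    let reason = notification.userInfo?[NSUbiquitousKeyValueStoreChangeReasonKey] as? Int
    print("KVS changed externally, reason: \(String(describing: reason))")
}
```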
Replies: 4 · Boosts: 0 · Views: 2.2k · Activity: Feb ’25
GameKit Matchmaking
Hi all, I'm having a variety of issues with GameKit matchmaking. In the simulator, the matchmaking UI pops up and I can tap Quick Match, but then it immediately says "Failed to find Players"; this happens with both a real Apple ID and a sandbox account. If I use real devices, the app at least discovers a match, but then none of the delegate methods for the match ever get called and the logs are filled with "socket not connected" and various other errors. My questions are:

1. Should matchmaking via Quick Match work in the simulator? I have seen tutorial videos of this working, but I can't seem to get it to work.
2. How do people debug issues with Game Center / GameKit to find out why it can't connect?

Many thanks in advance
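For reference, a minimal quick-match sketch under the usual assumptions (local player already authenticated, Game Center capability enabled), showing where the delegate callbacks the post mentions would land:

```swift
import GameKit
import UIKit

final class MatchController: NSObject, GKMatchmakerViewControllerDelegate, GKMatchDelegate {
    func startQuickMatch(from presenter: UIViewController) {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2
        guard let vc = GKMatchmakerViewController(matchRequest: request) else { return }
        vc.matchmakerDelegate = self
        presenter.present(vc, animated: true)
    }

    // MARK: GKMatchmakerViewControllerDelegate
    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        viewController.dismiss(animated: true)
        match.delegate = self   // must be set before connection/data callbacks arrive
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        print("Matchmaking failed: \(error)")
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        viewController.dismiss(animated: true)
    }

    // MARK: GKMatchDelegate
    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("Player \(player.displayName) state: \(state.rawValue)")
    }

    func match(_ match: GKMatch, didReceive data: Data, fromRemotePlayer player: GKPlayer) {
        print("Received \(data.count) bytes from \(player.displayName)")
    }
}
```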
Replies: 1 · Boosts: 0 · Views: 444 · Activity: Feb ’25
Performance regression: 3D rotation in volume rendering is not as smooth on iOS 18.2 as on iOS 18.1
We are a team of engineers working on an app intended to visualize medical images. The situations where the app is used involve time-critical decision making for acute clinical conditions, so stability and performance are of utmost importance and can directly help timely treatment. The app uses multiple libraries and tools such as VTK, WebGL, OpenGL, WebKit, gl-matrix, etc.

The problem can be described as follows: when a 3D volume is rendered in the app and we try to rotate it, the rotation is slow, unresponsive and laggy. Specifically, we have noticed that on iOS 18.1 the volume rotation is much smoother than on the latest iOS 18.2. Earlier we faced a somewhat similar issue with iOS 17, but it improved in iOS 18.1. This performance regression is affecting the user experience in our healthcare application. We have taken reference from the cornerstone.js code, and you can reproduce the issue using the following example: https://www.cornerstonejs.org/live-examples/volumeviewport3d

Steps to reproduce:
1. Load the above-mentioned test example on an iPhone running 18.2 using Safari.
2. Perform volume rendering using the provided dataset.
3. Measure the time taken by the volume for each rotate or drag action.
4. Repeat the same steps on an iPhone running 18.1 for comparison.

Additional information:
Device models tested: iPhone 12, iPhone 13, iPhone 14
iOS versions with issue: 18.2, 18.3 (beta)

I would appreciate any insights or suggestions on how to address this performance regression. If additional information is needed, please let me know. Thank you.
Replies: 4 · Boosts: 6 · Views: 978 · Activity: Feb ’25
Subdivision shows in RealityComposerPro but not when loaded in Simulator
Hello, I am trying to use the subdivision mesh rendering option. I can see it working in Reality Composer Pro, but not when loading the asset and displaying it in the Simulator. I am using this code:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct AirspaceView: View {
    // MARK: - VIEW BODY
    var body: some View {
        RealityView { content in
            if let a = try? await Entity(named: "Models/Test/Test.usdc", in: realityKitContentBundle) {
                content.add(a)
            }
        }
    }
}
```

Any ideas why?
Replies: 2 · Boosts: 1 · Views: 546 · Activity: Feb ’25
Metal calls hanging/stuck if app is started quickly after login
Our app uses Metal for image processing. We have found that if our app (and its possibly intensive image processing) is started quickly after the user logs in, calls to Metal may hang for a good while. For example, it can take 1-2 minutes for something that usually takes 3-5 seconds! The Metal threads are just hanging in a memmove... In Activity Monitor we see that a lot is happening right after log-in, but why the Metal calls block for so long is unknown to us. The workaround is to wait a minute before starting our app and its intensive Metal-based image processing, but that is hard to explain to end users. It doesn't happen on all computers, but it's fairly easy to reproduce on some. We are using macOS 15.3.1 on M1/M3 Max. Any good ideas for how to proceed with this problem and possibly reach out to Apple engineers? Thanks! :)
Replies: 2 · Boosts: 0 · Views: 436 · Activity: Feb ’25
Issue with FBX import -- models are ultra small
When importing FBX animations (generated by Cinema 4D or Blender), the models come in very far away and cannot be resized or zoomed in on. I have tried every setting in both programs to no avail. Is there a secret to choosing the right export options? When importing without animations and rigging, the model imports fine and at the correct size, but once motion is included, something is awry. I also tried changing the base units in Converter, but that did not work. I have attached my model hierarchy in C4D as well as the imported result. It appears the animation is imported, as I can see it move, but I can barely see it :)
Replies: 3 · Boosts: 0 · Views: 550 · Activity: Feb ’25
Black Screen in GPTK – DX 12.1 / Shader Model 6.5 Issue?
Hey everyone, I’m trying to run Kingdom Come: Deliverance 2 using the Game Porting Toolkit, but I’m encountering a black screen when launching the game. From what I know about the game’s requirements, it might be using Shader Model 6.5, which supports advanced features like DirectX Raytracing (DXR) Tier 1.1. This leads me to suspect that the issue could be related to missing support for DirectX 12.1 Features or Shader Model 6.5 in GPTK. Does anyone know if these features are currently supported by GPTK? If not, are there any plans to implement them in future updates? Alternatively, is there any workaround for games that rely on Shader Model 6.5 and ray tracing? Thanks a lot for your help!
Replies: 2 · Boosts: 0 · Views: 647 · Activity: Feb ’25
Can't link metal-cpp to Modern Rendering With Metal sample
There is a sample project from Apple here. It has a scene of a city at night and you can move around in it. It basically has two parts:

- application code written in what looks like Objective-C (I am more familiar with C++), which inherits from things like NSObject, MTKView, NSViewController and so on; it processes input and all app-related and window-related stuff.
- rendering code that also looks like Objective-C.

Btw, both parts are mostly in .mm files (Obj-C++ AFAIK). The application part directly uses only one class from the rendering part: AAPLRenderer. I want to move the rendering part to C++ using metal-cpp. For that I need to link metal-cpp to the project. I did it successfully with blank projects several times before using this tutorial. But with this sample project, Xcode can't find Foundation/Foundation.hpp (and the other metal-cpp headers). The error says this:

```
Did not find header 'Foundation.hpp' in framework 'Foundation' (loaded from '/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX15.0.sdk/System/Library/Frameworks')
```

Please help.
Replies: 2 · Boosts: 0 · Views: 839 · Activity: Feb ’25
RealityView iOS Navigation
I have a visionOS app that I'm adding iOS support to, and I would like to keep using RealityView. I know there are the following modifiers to add some navigation:

```swift
.realityViewCameraControls(.orbit)
.realityViewCameraControls(.dolly)
.realityViewCameraControls(.pan)
```

But how can I add more than one? For example, I would like to orbit with one finger, pan with two fingers, and dolly by pinching. Is this possible, and if so, can someone share some sample code on how to achieve it? Thanks, Guillermo
Replies: 0 · Boosts: 1 · Views: 432 · Activity: Feb ’25
Metal runtime shader library compilation and linking issue
In my project I need to do the following:

1. At runtime, create a Metal dynamic library from source.
2. At runtime, create a Metal executable library from source and link it with the previously created dynamic library.
3. Create a compute pipeline using the two libraries created above.

But I get the following error at the third step:

```
Error Domain=AGXMetalG15X_M1 Code=2 "Undefined symbols:
 _Z5noisev, referenced from: OnTheFlyKernel
" UserInfo={NSLocalizedDescription=Undefined symbols:
 _Z5noisev, referenced from: OnTheFlyKernel
}
```

```swift
import Foundation
import Metal

class MetalShaderCompiler {
    let device = MTLCreateSystemDefaultDevice()!
    var pipeline: MTLComputePipelineState!

    func compileDylib() -> MTLDynamicLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        half3 noise() {
            return half3(1, 0, 1);
        }
        """

        let option = MTLCompileOptions()
        option.libraryType = .dynamic
        option.installName = "@executable_path/libFoundation.metallib"

        let library = try! device.makeLibrary(source: source, options: option)
        let dylib = try! device.makeDynamicLibrary(library: library)
        return dylib
    }

    func compileExlib(dylib: MTLDynamicLibrary) -> MTLLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        extern half3 noise();

        kernel void OnTheFlyKernel(texture2d<half, access::read> src [[texture(0)]],
                                   texture2d<half, access::write> dst [[texture(1)]],
                                   ushort2 gid [[thread_position_in_grid]]) {
            half4 rgba = src.read(gid);
            rgba.rgb += noise();
            dst.write(rgba, gid);
        }
        """

        let option = MTLCompileOptions()
        option.libraryType = .executable
        option.libraries = [dylib]

        let library = try! self.device.makeLibrary(source: source, options: option)
        return library
    }

    func runtime() {
        let dylib = self.compileDylib()
        let exlib = self.compileExlib(dylib: dylib)

        let pipelineDescriptor = MTLComputePipelineDescriptor()
        pipelineDescriptor.computeFunction = exlib.makeFunction(name: "OnTheFlyKernel")
        pipelineDescriptor.preloadedLibraries = [dylib]

        pipeline = try! device.makeComputePipelineState(descriptor: pipelineDescriptor,
                                                        options: .bindingInfo,
                                                        reflection: nil)
    }
}
```
Replies: 4 · Boosts: 0 · Views: 879 · Activity: Feb ’25