Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics

Posts under the Graphics & Games topic. Each entry shows Replies, Boosts, Views, and latest Activity.

Showing an MTLTexture on an Entity in RealityKit
Is there any standard way of efficiently showing an MTLTexture on a RealityKit Entity? I can't find anything proper on how to, for example, generate a LowLevelTexture out of an MTLTexture. The closest match was this two-year-old thread. In the old SceneKit app, we would just do:

```swift
guard let material = someNode.geometry?.materials.first else { return }
material.diffuse.contents = mtlTexture
```

Our flow is as follows (for visualizing the currently detected object): camera stream -> CoreML segmentation -> send the relevant part of the MLShapedArray tensor to a Metal compute shader that returns an MTLTexture -> show the resulting texture on a 3D object to the user.
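One possible bridge, sketched below under the assumption that the LowLevelTexture API (visionOS 2 / iOS 18) is available: allocate a LowLevelTexture with a matching descriptor, wrap it in a TextureResource, and blit the compute output into it each frame. The names `computeOutput`, `commandQueue`, and `entity` are placeholders, and the exact Descriptor parameter labels are an assumption; this belongs inside a throwing setup function.

```swift
import RealityKit
import Metal

// Sketch only: assumes LowLevelTexture (visionOS 2 / iOS 18) and a bgra8Unorm
// compute output; `computeOutput`, `commandQueue`, and `entity` are placeholders.
let descriptor = LowLevelTexture.Descriptor(
    pixelFormat: .bgra8Unorm,
    width: computeOutput.width,
    height: computeOutput.height,
    textureUsage: [.shaderRead, .shaderWrite]
)
let lowLevel = try LowLevelTexture(descriptor: descriptor)
let resource = try TextureResource(from: lowLevel)

var material = UnlitMaterial()
material.color = .init(texture: .init(resource))
entity.components.set(ModelComponent(mesh: .generatePlane(width: 1, height: 1),
                                     materials: [material]))

// Per frame: copy the compute result into the RealityKit-owned texture.
let commandBuffer = commandQueue.makeCommandBuffer()!
let destination = lowLevel.replace(using: commandBuffer)
let blit = commandBuffer.makeBlitCommandEncoder()!
blit.copy(from: computeOutput, to: destination)
blit.endEncoding()
commandBuffer.commit()
```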
Replies: 5 · Boosts: 0 · Views: 963 · Activity: 2w
PortalComponent – allow world content to peek out
Hello, I've been tinkering with PortalComponent on visionOS a bit but noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied. Now I'm wondering if there is any way or trick to allow contents of the portal to peek out – similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?). I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property). But so far I haven't been able to achieve a perceptible visual difference with it. If possible I would like to avoid hacky tricks using duplicate meshes or similar to achieve this. Thanks for any hints!
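For reference, visionOS 2 introduced a portal-crossing mechanism that sounds like what is being asked for here. The sketch below assumes that API (a crossingMode on PortalComponent plus PortalCrossingComponent); `world`, `portalPlane`, and `dinosaur` are placeholder entities.

```swift
import RealityKit

// Sketch, assuming the visionOS 2 portal-crossing API.
var portal = PortalComponent(target: world)
portal.clippingMode = .plane(.positiveZ)   // clip world content at the portal plane
portal.crossingMode = .plane(.positiveZ)   // allow content to cross that plane
portalPlane.components.set(portal)

// Only entities that opt in may peek out of the portal.
dinosaur.components.set(PortalCrossingComponent())
```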
Replies: 5 · Boosts: 0 · Views: 1.5k · Activity: Dec ’24
SKTexture used for a SceneKit object is rendered too bright
I would like to preload and use some images for both SpriteKit and SceneKit models (my game uses SceneKit with a SpriteKit overlay), and as far as I can see the only efficient way would be to create and preload SKTexture objects, which can be supplied to SKSpriteNode(texture:) and SCNMaterial.diffuse.contents. The problem is that SKTextures are rendered too bright in SceneKit, for some unknown reason. Here is a comparison between rendering an image (from a URL) and an SKTexture (comparison screenshot omitted), and the code that produces it:

```swift
let url = Bundle.main.url(forResource: "art.scnassets/texture.png", withExtension: nil)!

let plane1 = SCNPlane(width: 10, height: 10)
plane1.firstMaterial!.diffuse.contents = url.path
let node1 = SCNNode(geometry: plane1)
node1.position.x = -5
scene.rootNode.addChildNode(node1)

let plane2 = SCNPlane(width: 10, height: 10)
plane2.firstMaterial!.diffuse.contents = SKTexture(image: NSImage(byReferencing: url))
let node2 = SCNNode(geometry: plane2)
node2.position.x = 5
scene.rootNode.addChildNode(node2)
```

This issue was already mentioned in this other post, but since I wasn't notified of the reply from Quinn asking about the feedback number I created at the time, it didn't make any progress.
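One workaround worth trying, under the assumption that the brightness difference comes from SceneKit interpreting the SKTexture's color data in a different color space: keep the preloaded SKTexture for SpriteKit, but hand SceneKit the underlying CGImage so SceneKit applies its own sRGB handling. This reuses `url` and `plane2` from the snippet above.

```swift
import SceneKit
import SpriteKit

// Workaround sketch: preload the SKTexture once, but give SceneKit the CGImage.
let texture = SKTexture(image: NSImage(byReferencing: url))
plane2.firstMaterial!.diffuse.contents = texture.cgImage()
```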
Replies: 5 · Boosts: 0 · Views: 907 · Activity: Nov ’24
How to use MTKTextureLoader to load PNG data
I am trying to load some PNG data with MTKTextureLoader newTextureWithData, but the result shows wrong in the alpha areas. Here is the code. I have an image URL; after it downloads successfully, I try to use the data or UIImagePNGRepresentation(image), and they all show wrong.

```objc
UIImage *tempImg = [UIImage imageWithData:data];
CGImageRef cgRef = tempImg.CGImage;
MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];

id<MTLTexture> temp1 = [loader newTextureWithData:data
                                          options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                    MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                    MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                            error:nil];

NSData *tempData = UIImagePNGRepresentation(tempImg);
id<MTLTexture> temp2 = [loader newTextureWithData:tempData
                                          options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                    MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                    MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                            error:nil];

id<MTLTexture> temp3 = [loader newTextureWithCGImage:cgRef
                                             options:@{MTKTextureLoaderOptionSRGB: @(NO),
                                                       MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                                                       MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)}
                                               error:nil];
}] resume];   // tail of the download-task completion handler (context truncated in the original post)
```
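Wrong-looking alpha with MTKTextureLoader is often a premultiplication mismatch. As a diagnostic sketch (an assumption about the cause, not a confirmed fix), redrawing the image into a bitmap context with an explicit alpha layout before loading makes the alpha handling deterministic. The Swift helper below is hypothetical.

```swift
import MetalKit
import UIKit

// Hypothetical helper: flatten the PNG into a known premultiplied-RGBA bitmap
// before creating the texture, so the loader can't guess the alpha layout.
func makeTexture(from image: UIImage, device: MTLDevice) throws -> MTLTexture {
    let width = Int(image.size.width * image.scale)
    let height = Int(image.size.height * image.scale)
    let context = CGContext(data: nil, width: width, height: height,
                            bitsPerComponent: 8, bytesPerRow: width * 4,
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    context.draw(image.cgImage!, in: CGRect(x: 0, y: 0, width: width, height: height))
    let flattened = context.makeImage()!
    let loader = MTKTextureLoader(device: device)
    return try loader.newTexture(cgImage: flattened, options: [.SRGB: false])
}
```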
Replies: 5 · Boosts: 0 · Views: 591 · Activity: May ’25
Tile shader performance when writing to a tile texture vs. the resolve texture
I am working on a custom resolve tile shader for a client. I see a big difference in performance depending on where we write to:

1. the resolve texture of the color attachment
2. a read-write tile shader texture set via [renderEncoder setTileTexture: myResolvedTexture]

Option 2 is more than twice as slow as option 1. Our compute shader writes to 4 UAVs, so using only the resolve-texture entry is not possible. Why is there such a difference when no more data is being written? Can option 2 be as fast as option 1? I can demonstrate the issue in a modified version of the Multisample code sample.
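To make the comparison concrete for other readers, here is a sketch of the two bindings (placeholder names throughout; the client project's exact pass configuration is not shown in the post):

```swift
import Metal

// Option 1: the tile function writes into the pass's resolve texture.
let passDescriptor = MTLRenderPassDescriptor()
passDescriptor.colorAttachments[0].texture = msaaColorTexture        // placeholder
passDescriptor.colorAttachments[0].resolveTexture = resolvedTexture  // placeholder

// Option 2: an extra read-write texture bound to the tile stage.
let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor)!
encoder.setTileTexture(myResolvedTexture, index: 0)                  // placeholder
```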
Replies: 5 · Boosts: 0 · Views: 536 · Activity: Feb ’25
PSVR2 controllers don't report anything in snapshot
I typically read an extended gamepad capture() and get all state, but PSVR2 controllers seem to report nothing, so the stick and other buttons don't do anything in a built app. They register as left/right controllers. This is on visionOS 26, Xcode 26, etc. They work correctly in the main icon view, although they don't honor inverted vertical and horizontal scrolling. Both of the default scrolls just feel wrong: when I move left, I want to scroll left, not right; same for up/down.
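For context, the snapshot-polling pattern being described is roughly the following sketch (GameController framework); the guard makes the reported symptom visible if the profile is simply missing:

```swift
import GameController

// Sketch of the snapshot-polling pattern described above.
func pollState(of controller: GCController) {
    let snapshot = controller.capture()   // frozen copy of the current state
    guard let pad = snapshot.extendedGamepad else {
        // PSVR2 controllers register, but may not expose an extendedGamepad
        // profile, which would explain an empty/all-zero snapshot.
        print("no extended gamepad profile for \(controller.vendorName ?? "controller")")
        return
    }
    print("left stick:", pad.leftThumbstick.xAxis.value, pad.leftThumbstick.yAxis.value)
    print("A pressed:", pad.buttonA.isPressed)
}
```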
Replies: 5 · Boosts: 1 · Views: 592 · Activity: 1w
Metal and Swift Concurrency
Hi, introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to run Metal-bound tasks for maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best solution to bridge async tasks with Metal work? Thanks!
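The continuation bridge described above might look like this sketch: the command buffer's completion handler resumes the task, so waitUntilCompleted is avoided and no cooperative-pool thread ever blocks on the GPU.

```swift
import Metal

// Sketch: suspend the calling task until the GPU finishes, without blocking
// any thread of the cooperative pool.
func execute(_ commandBuffer: MTLCommandBuffer) async {
    await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
        commandBuffer.addCompletedHandler { _ in
            continuation.resume()
        }
        commandBuffer.commit()
    }
}
```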
Replies: 5 · Boosts: 0 · Views: 1k · Activity: Apr ’25
GameKit not working as expected in iOS 26.
I just upgraded my macOS, Xcode, and Simulator all to the newest beta version 26. Then I found two issues when building my app with Xcode 26 and running it on the 26 simulator.

1. The Game Center access point no longer shows up in the app. This is how it has been configured in the past, and it still works on the 18.4 simulator:

```swift
func authenticatePlayer() {
    GKAccessPoint.shared.location = .topTrailing
    self.localPlayer.authenticateHandler = { viewController, error in
        if let viewController = viewController {
            // can present Game Center login screen
        } else if self.localPlayer.isAuthenticated {
            // game can be started
        } else {
            // user didn't log in, continue the game without Game Center
        }
    }
}
```

2. After the game ends, the leaderboard won't load. This is how it's been implemented in the past, and it still works on the 18.4 simulator:

```swift
struct GameCenterView: UIViewControllerRepresentable {
    @Environment(\.presentationMode) var presentationMode
    ...

    func makeUIViewController(context: Context) -> GKGameCenterViewController {
        let viewController = GKGameCenterViewController(
            leaderboardID: getLeaderBoardID(with: leaderBoardGameMode),
            playerScope: .global,
            timeScope: .allTime
        )
        viewController.gameCenterDelegate = context.coordinator
        return viewController
    }

    func updateUIViewController(_ uiViewController: GKGameCenterViewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    class Coordinator: NSObject, GKGameCenterControllerDelegate {
        let parent: GameCenterView

        init(_ parent: GameCenterView) {
            self.parent = parent
        }

        func gameCenterViewControllerDidFinish(_ gameCenterViewController: GKGameCenterViewController) {
            parent.presentationMode.wrappedValue.dismiss()
        }
    }
}
```
Replies: 5 · Boosts: 2 · Views: 304 · Activity: 2w
GKGameCenterViewController Is Blank On iOS 18
I'm displaying a GKGameCenterViewController after successfully authenticating, and on iOS 18.0 and 18.1 I get a black screen. As a sanity check, GKLocalPlayer.local.isAuthenticated is also returning true. The same code works just fine on iOS 17. Is there something that needs to be done on iOS 18 and above?
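For comparison, a minimal presentation sketch (assuming a UIKit view controller that conforms to GKGameCenterControllerDelegate); if even this shows a black screen on iOS 18, the issue is likely environmental rather than in the presenting code:

```swift
import GameKit
import UIKit

// Minimal sketch: present the dashboard only after authentication succeeds.
func showGameCenter(from presenter: UIViewController & GKGameCenterControllerDelegate) {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    let gameCenterVC = GKGameCenterViewController(state: .leaderboards)
    gameCenterVC.gameCenterDelegate = presenter
    presenter.present(gameCenterVC, animated: true)
}
```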
Replies: 4 · Boosts: 1 · Views: 811 · Activity: Nov ’24
Metal 4 & Acceleration Structures
I have really enjoyed looking through the code and videos related to Metal 4. Currently, my interest is to update a ReSTIR project and take advantage of more robust ways to refit acceleration structures and more powerful ways to access resources. I am working in Swift and have encountered a couple of puzzles:

- What is the 'accepted' way to create a MTL4BufferRange to store indices and vertices?
- How do I properly rewrite Swift code to build and compact an acceleration structure?

I do realize that this is all in beta and will happily look through code samples this fall. If other guidance is available earlier, that would be fabulous! Thank you
Replies: 4 · Boosts: 0 · Views: 547 · Activity: 4w
USDZ files with cameras can't be opened correctly on iOS 18.2/iPadOS 18.2
Hi experts, when I open a USDZ file which contains perspective cameras with the Files app on iOS 18.2/iPadOS 18.2, I can't see anything. When I open the same USDZ file on iOS 18.1/iPadOS 18.1, it works well. On the other hand, when I open a USDZ file which contains orthographic cameras on iOS 18.1 or iOS 18.2, the scene is stuck. Could you help to solve these issues please? Thanks.
Replies: 4 · Boosts: 2 · Views: 613 · Activity: Dec ’24
SCNNode from MDLMesh not rendered
I am writing an app to create 3D objects with curved surfaces, such as a metal cabinet knob, using SceneKit and Model I/O. I want the surfaces to be smooth so that edges between adjacent polygon faces are not visible. According to the documentation for MDLMesh.addNormals(withAttributeNamed:creaseThreshold:), a positive creaseThreshold value lower than 1.0 will interpolate sharper angles between faces into smooth surfaces. I have not been able to get this to work, and I need help with it. The lines of code where the problem occurs are shown here:

```swift
let mesh = MDLMesh(scnGeometry: surfaceGeometry)
// mesh.addNormals(withAttributeNamed: "MDLVertexAttributeNormal", creaseThreshold: 0.9)
surfaceGeometry = SCNGeometry(mdlMesh: mesh)
```

When the code is executed with the middle line commented out, the knob object is rendered as shown in the screenshot. When that line is not commented out, the mesh is altered and the SCNNode for the knob is created with no errors, but the node is not rendered. My questions are: (1) What changes do I need to make to the code so that the node will be rendered with a smooth surface? (2) What is the recommended way of smoothing a curved surface so that edges between faces are not visible? The full code for the function is below. (Attachment: screenshot of the cabinet knob with a faceted surface.)

```swift
func cabinetKnob() -> SCNNode {
    let controlPoints: [(x: Float, y: Float)] = [
        (0.728, -0.237), (0.176, -0.06), (0.202, 0.475),
        (0.989, 0.842), (-0.066, 1.093), (-0.726, 0.787)
    ]
    let pairs = bsplinePath(controlPoints)
    var knobProfile = [SCNVector3]()
    for (x, y) in pairs {
        knobProfile += [SCNVector3(x: CGFloat(x), y: CGFloat(y), z: 0)]
    }

    let nProfiles = 64
    // create knob by rotating knobProfile about the y-axis
    let aIncrement: CGFloat = 2 * CGFloat.pi / CGFloat(nProfiles)   // ~6 degrees
    var angle: CGFloat = 0
    var knobVertices = knobProfile.map( { $0 } )
    angle = 0
    for _ in 1...nProfiles {
        angle += aIncrement
        // rotate knobProfile about the y-axis
        knobVertices += knobProfile.map( { $0.rotate(about: .y, by: angle) } )
    }

    let source = SCNGeometrySource(vertices: knobVertices)
    var indices = [[UInt16]]()
    var i: UInt16 = 0
    var j: UInt16 = UInt16(knobProfile.count)   // 1st vertex of next profile
    for k in 0...nProfiles {
        var stripIndices = [UInt16]()
        if k == nProfiles { j = 0 }
        for _ in 0...knobProfile.count - 1 {
            stripIndices += [i, j]
            i += 1; j += 1
        }
        indices += [stripIndices]
    }
    let elements: [SCNGeometryElement] = indices.map( { SCNGeometryElement(indices: $0, primitiveType: .triangleStrip) } )

    var surfaceGeometry = SCNGeometry(sources: [source], elements: elements)
    let mesh = MDLMesh(scnGeometry: surfaceGeometry)
    // mesh.addNormals(withAttributeNamed: "MDLVertexAttributeNormal", creaseThreshold: 0.9)
    surfaceGeometry = SCNGeometry(mdlMesh: mesh)

    let aluminum = SCNMaterial()
    aluminum.lightingModel = SCNMaterial.LightingModel.physicallyBased
    aluminum.diffuse.contents = NSColor(srgbRed: 0.95, green: 0.95, blue: 0.95, alpha: 1.0)
    aluminum.roughness.contents = 0.2
    aluminum.metalness.contents = 0.9
    aluminum.isDoubleSided = true
    surfaceGeometry.materials = [aluminum]

    let node = SCNNode(geometry: surfaceGeometry)
    return node
}
```
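One detail worth double-checking in the snippet above (an observation, not a confirmed fix): Model I/O's attribute names are string constants, and the constant MDLVertexAttributeNormal has the value "normal", so passing the literal string "MDLVertexAttributeNormal" would name an attribute that doesn't exist.

```swift
import ModelIO

// Pass the constant, not its spelled-out name (MDLVertexAttributeNormal == "normal").
mesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.9)
```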
Replies: 4 · Boosts: 0 · Views: 749 · Activity: Nov ’24
Photogrammetry requiring LiDAR-capable phones, curious why
Hello! I'm currently building an app where I feed images into a Photogrammetry session to create a USDZ. Pretty straightforward, and it works great. We've recently started some testing on older devices and have discovered that photogrammetry requires devices that have LiDAR (we've seen some console logs referencing LiDAR if we stumble through a photogrammetry process without checking isSupported first). Judging from @swredcam's posting about ReefScan from November '24 (https://developer.apple.com/forums/thread/769221), it looks like photogrammetry did work on those non-LiDAR devices. In my own testing on an iPhone 12 mini with iOS 17, PhotogrammetrySession says it's not supported. Since we're only feeding in a sequence of photos that have never had depth data, and they process fine on Pro/Max devices, we're curious why this would require a LiDAR sensor to work when it seems like it worked without LiDAR in the past. Or is there some other limitation of non-Pro devices that is causing photogrammetry to not be supported (especially on today's really powerful hardware)? Thanks! ++md
Replies: 4 · Boosts: 0 · Views: 547 · Activity: Mar ’25
NSScreen frame location with multiple monitors
I have a Mac Studio 2023 M2 Max running Sonoma 14.6.1, developing in Xcode 16.1. It seems that the NSScreen frame settings may be incorrect. The frame settings received from NSScreen.screens don't seem to match up with the desktop arrangement in Settings. Apologies in advance for this long post!

```swift
// enumerated() supplies the index used in the log line below
for (i, screen) in NSScreen.screens.enumerated() {
    let name = screen.localizedName
    Globals.logger.debug("Globals initializeScreens - screen \(i) '\(name, privacy: .public)'")
    Globals.logger.debug("Globals initializeScreens - '\(screen.debugDescription, privacy: .public)'")
}
```

This is what I receive in the log:

```
Globals initializeScreens - '<NSScreen: 0x600000ef4240; name="PHL 346E2C"; backingScaleFactor=1.000000; frame={{0, 0}, {3440, 1440}}; visibleFrame={{0, 0}, {3440, 1415}}>'
Globals initializeScreens - screen 2 'Blackmagic (1)'
Globals initializeScreens - '<NSScreen: 0x600000ef42a0; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 3 'Blackmagic (4)'
Globals initializeScreens - '<NSScreen: 0x600000ef4360; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 4 'Blackmagic (2)'
Globals initializeScreens - '<NSScreen: 0x600000ef43c0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 5 'Blackmagic (3)'
Globals initializeScreens - '<NSScreen: 0x600000ef4420; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'
```

It looks like the frame settings for Blackmagic (2) and Blackmagic (4) are switched. The setup has five monitors. Four are using the USB-C Digital AV Multiport Adapters; the output for these is streamed into a rack of A/V equipment using Blackmagic Design mini converters and monitors. My Swift application allows users to open four movies, one for each of the AV adapters. The movies can then be played back in sync for later processing by the A/V equipment. Screen captures of my display settings were attached (screenshots omitted). Blackmagic (1) and Blackmagic (2) are to the left of the main screen; Blackmagic (3) and Blackmagic (4) are to the right of the main screen. The desktop is hard to see but is correct. The wallpaper settings are all correct, and the wallpaper is correctly ordered when displayed on the monitors. After opening the movies and using the NSScreen frame settings, the displays are incorrectly ordered: Test B and Test D are switched, which is what I would expect given the NSScreen frame values. Any ideas? I've tried re-arranging the desktops, rebooting, etc., but no luck. The code that changes the screen location is similar to this post on Stack Overflow:

```swift
public func setDisplay(screen: NSScreen) {
    Globals.logger.log("MovieWindowController - setDisplay = \(screen.localizedName, privacy: .public)")
    Globals.logger.debug("MovieWindowController - setDisplay - '\(screen.debugDescription, privacy: .public)'")

    let dx = CGFloat(Constants.midX)
    let dy = CGFloat(Constants.midY)
    var pos = NSPoint()
    pos.x = screen.visibleFrame.midX - dx
    pos.y = screen.visibleFrame.midY - dy
    Globals.logger.debug("MovieWindowController - setDisplay - x = '\(pos.x, privacy: .public)', y = '\(pos.y, privacy: .public)'")
    window?.setFrameOrigin(pos)
}
```

The log shows just what I would expect given the incorrect frame values:

```
MovieWindowController - setDisplay = Blackmagic (1)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018e8420; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-3840.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (2)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018a10e0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '5360.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (3)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018cc8a0; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '3440.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (4)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018c9ce0; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-1920.000000', y = '-12.500000'
```

Am I correct? I think this is driving me crazy! Thanks in advance!

Edit: The mouse behavior is correct in moving across the displays!
Replies: 4 · Boosts: 0 · Views: 614 · Activity: Jan ’25
Float8 and Float16 "Reserved_Name__Do_not_use"
I am developing a macOS terminal app, running on an M4 Pro, and using Metal. I am not able to use float8 or float16; both report: Variable has incomplete type 'float16' (aka '__Reserved_Name__Do_not_use_float16'). Based on the system, I should be able to use these. Either it is because it is also compiling for Intel, where they are not allowed, or something else; either way, I have not been able to figure out how to get past this. Is there a compiler setting I need to set to make this work? If so, which one, and what value does it need? I only want to run this on M-series processors on the latest version of the OS, so I am not interested in an Intel version or backward compatibility.
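If the host-side simd headers are the source of that error (an assumption: the unprefixed vector names like float16 are only defined when the compiler supports the required vector extensions, and otherwise fall back to the reserved spelling), the prefixed simd names are always available. A Swift illustration:

```swift
import simd

// The prefixed simd names avoid the reserved unprefixed spellings entirely.
let a = simd_float16(repeating: 0)    // 16-element float vector
let b = SIMD16<Float>(repeating: 1)   // the idiomatic Swift equivalent
let c = a + b
```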
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 0 · Views: 152 · Activity: Aug ’25
macOS Tahoe Beta 4 disabled __asm keyword for Metal
Hi developers, I maintain a shipped app that uses string concatenation to construct Metal shaders and compile them on-device. Beta 4 seems to have disabled the __asm keyword, resulting in compilation failures. The error is:

```
v2/GEMMKernel.cpp:229: error: program_source:23:9: error: illegal string literal in 'asm'
__asm("air.simdgroup_async_copy_1d.p3i8.p1i8");
```

The relevant code is available at https://github.com/liuliu/ccv/blob/unstable/lib/nnc/mfa/v2/GEMMHeaders.cpp#L30, although any __asm will trip this. Please give us guidance on whether this is a regression or something that will be enforced in the 26 release. Personally, I would consider this a bug, given that it won't impact anything in already-compiled shaders. Thanks for your patience reading this!
Topic: Graphics & Games · SubTopic: Metal
Replies: 4 · Boosts: 6 · Views: 826 · Activity: Jul ’25
Xcode Vulkan project is opening two windows instead of one
I'm a newbie at Vulkan and Xcode. I have my project on GitHub at https://github.com/flocela/OrangeSpider/. Whenever I run, two windows open instead of only one. I added testing, which means I have an OrangeSpider.xctestplan in the OrangeSpider/TestsOrangeSpider/ folder. This is my first time adding testing to an Xcode project, so I think this may be where the problem is. I also get this error message:

```
ViewBridge to RemoteViewService Terminated: Error Domain=com.apple.ViewBridge Code=18 "(null)"
UserInfo={com.apple.ViewBridge.error.hint=this process disconnected remote view controller -- benign unless unexpected,
com.apple.ViewBridge.error.description=NSViewBridgeErrorCanceled}
```
Replies: 4 · Boosts: 0 · Views: 129 · Activity: Jul ’25