Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic


SpriteKit scene used as SCNView.overlaySKScene crashes due to SKShapeNode
I recently published my first game on the App Store. It uses SceneKit with a SpriteKit overlay. All the crashes Xcode has downloaded for it so far are related to SpriteKit/SceneKit internals. The most common crash is caused by SKCShapeNode::_NEW_copyRenderPathData. What could cause such a crash? (Crash log attached: crash.crash.) While developing this game (and the BoardGameKit framework that appears in the crash log) over the years, I experienced many crashes presumably caused by the SpriteKit overlay (I opened a post, "SceneKit app randomly crashes with EXC_BAD_ACCESS in jet_context::set_fragment_texture", about such a crash in September 2024), and other people on the internet also report crashes when using SpriteKit as a SceneKit overlay. Should I use a separate SKView laid on top of the SCNView rather than setting SCNView.overlaySKScene? That seemed to solve the crashes for one Stack Overflow user, but is it also the approach Apple encourages? I know SceneKit is deprecated, but according to Apple, critical bugs will still be fixed. Could this be considered a critical bug?
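As a point of reference while this awaits an authoritative answer: the separate-SKView workaround mentioned above is straightforward to sketch. A minimal version (UIKit; the scene contents are placeholders):

import UIKit
import SceneKit
import SpriteKit

final class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // SceneKit view renders the 3D scene as before, but without
        // an overlaySKScene attached.
        let scnView = SCNView(frame: view.bounds)
        scnView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        scnView.scene = SCNScene() // your 3D scene here
        view.addSubview(scnView)

        // A separate, transparent SKView layered on top hosts the HUD,
        // replacing scnView.overlaySKScene.
        let skView = SKView(frame: view.bounds)
        skView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        skView.allowsTransparency = true
        skView.isUserInteractionEnabled = false // let touches reach the SCNView

        let overlayScene = SKScene(size: view.bounds.size) // your HUD scene here
        overlayScene.backgroundColor = .clear
        skView.presentScene(overlayScene)
        view.addSubview(skView)
    }
}

Whether Apple endorses this over overlaySKScene is exactly the open question, so treat it as a workaround sketch rather than guidance.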
Replies: 4 · Boosts: 0 · Views: 588 · Activity: 1w
PhotogrammetrySession fails with internal errors 4011 / 4012 when using iOS Object Capture (Area Mode) images
Hi all, I’m running into an issue when trying to reconstruct a 3D model using PhotogrammetrySession on macOS from a set of images captured via the iOS Object Capture sample app, specifically in Area mode. When I attempt to create the model from these images (using the raw Images/ folder exported directly from the capture session), I get the following errors:

ERROR cv3dapi.pg: Internal error codes (2): 4011 4012
WARN cv3dapi.pg: Internal warning codes (1): 4507
Output error with code = -15
requestError: CoreOC.PhotogrammetrySession.Error.processError

I use the Images directory exported directly from Object Capture on my iPhone 12 Pro Max (which has LiDAR), set to Area mode in the Object Capture app. Here is example metadata from one HEIC image in the sequence:

heif-info Images/00044.869568833.HEIC
MIME type: image/heic
main brand: heic
compatible brands: mif1, MiHE, MiPr, miaf, MiHB, heic
image: 3024x4032 (id=49), primary
  tiles: 6x8, tile size: 512x512
  colorspace: YCbCr, 4:2:0
  bit depth: 8
thumbnail: 240x320
color profile: nclx
alpha channel: no
depth channel: yes
  size: 192x256
  bits per pixel: 8
  z-near: 1.173828
  z-far: 2.552734
  d-min: undefined
  d-max: undefined
  representation: uniform Z
metadata:
  Exif: 960 bytes
  uri /tag:apple.com,2023:ObjectCapture#CameraTrackingState: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#CameraCalibrationData: 1015 bytes
  uri /tag:apple.com,2023:ObjectCapture#ObjectTransform: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#ObjectBoundingBox: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#RawFeaturePoints: 832 bytes
  uri /tag:apple.com,2023:ObjectCapture#PointCloudData: 23984 bytes
  uri /tag:apple.com,2023:ObjectCapture#BundleVersion: 5 bytes
  uri /tag:apple.com,2023:ObjectCapture#SegmentID: 4 bytes
  uri /tag:apple.com,2024:ObjectCapture#SessionUUID: 36 bytes
  uri /tag:apple.com,2024:ObjectCapture#CaptureMode: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#Feedback: 4 bytes
  uri /tag:apple.com,2023:ObjectCapture#WideToDepthCameraTransform: 48 bytes
  uri /tag:apple.com,2023:ObjectCapture#TemporalDepthPointClouds: 864026 bytes
transformations:
  angle (ccw): 270
region annotations: none
properties:
  camera intrinsic matrix:
    focal length: 2813.695557; 2813.695557
    principal point: 1522.338502; 2002.843018
    skew: 0.000000
  camera extrinsic matrix:
    rotation matrix:
      -0.695  0.344 -0.632
       0.007 -0.875 -0.483
      -0.719 -0.340  0.606

Questions:
• What do internal error codes 4011 and 4012 refer to?
• Is there something specific about Area mode captures that requires preprocessing before they’re compatible with PhotogrammetrySession?
• Has anyone successfully reconstructed a model from an Area mode session using the stock Apple tools?

NOTE: I can provide the folder of images for debugging if that would help!
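For anyone comparing setups, a minimal macOS reconstruction sketch along the lines of Apple's sample flow (paths are placeholders; this shows the standard API usage, not a fix for the 4011/4012 errors):

import Foundation
import RealityKit

let inputFolder = URL(fileURLWithPath: "/path/to/Images", isDirectory: true) // placeholder
let outputModel = URL(fileURLWithPath: "/path/to/model.usdz")                // placeholder

var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOrdering = .sequential // captures from one session are ordered
configuration.featureSensitivity = .normal

let session = try PhotogrammetrySession(input: inputFolder, configuration: configuration)

// Drain the output stream so errors like processError surface with context.
Task {
    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            print("Reconstruction finished")
        case .requestError(let request, let error):
            print("Request \(request) failed: \(error)")
        case .requestComplete(let request, let result):
            print("Request \(request) completed: \(result)")
        default:
            break
        }
    }
}

try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])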
Replies: 1 · Boosts: 2 · Views: 972 · Activity: Jul ’25
Can't remove annotations from PDFView
Hi everyone, I'm facing an issue: on iOS 26, the removeAnnotation method doesn't remove annotations. This code worked on previous versions (iOS 17 and 18) but suddenly stopped working on iOS 26. Has anyone faced this issue?

guard let document = await pdfView.document else { return }

for pageIndex in 0..<document.pageCount {
    guard let page = document.page(at: pageIndex) else { continue }
    let annotations = page.annotations
    for annotation in annotations {
        page.removeAnnotation(annotation)
    }
}
Replies: 1 · Boosts: 1 · Views: 263 · Activity: Oct ’25
Looking for some clarification
I was wondering if anyone from Apple could provide some clarification. The gaming studio Epic Games is wondering whether it could distribute the award-winning game Fortnite on macOS again without any retaliation. I know Fortnite being back on macOS would benefit thousands of macOS devs. Hoping to get clarification so Epic could start bringing Fortnite back.
Replies: 0 · Boosts: 1 · Views: 633 · Activity: Dec ’25
Request: Allow Game Mode to be enabled locally for non-game App Store categories
Hi Apple team, Game Mode was introduced in iOS 18. To activate Game Mode, an app must include specific key-value pairs in its Info.plist and be categorized as a "Game" on the App Store. My app (https://apps.apple.com/us/app/voidlink/id6747717070) works primarily as a self-hosted game-streaming client (PC -> iPhone/iPad). Game Mode provides clear benefits in latency and frame-rate stability, but it can currently only be activated when running via Xcode or TestFlight. I am an individual iOS developer based in China, where an additional government license is required for apps to be listed under the "Games" category on the App Store. Obtaining such a license is very difficult for independent developers, so my app is categorized under "Utilities" instead. (If I moved the app to the Games category, it would disappear from the Chinese App Store immediately.)

Expectation / Suggestion: Please consider making Game Mode available as a local, user-controllable option on iOS 18/26+, for example through a system "App Pool" where users can choose which apps to enable Game Mode for, regardless of App Store category. This would greatly benefit use cases like streaming clients, benchmarking tools, and remote-play utilities, without requiring developers to reclassify their apps as "Games" on the App Store.
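For context, the local half of Game Mode adoption is a single Info.plist entry; the App Store category is the part that cannot be satisfied here. Assuming the documented GCSupportsGameMode key (a Boolean), the entry looks like this:

<!-- Info.plist: opts the app into Game Mode (iOS 18+). Today it only takes
     effect when the app's App Store category is Games, which is exactly the
     limitation this request is about. -->
<key>GCSupportsGameMode</key>
<true/>

A user-controllable "App Pool", as suggested above, would let this key take effect without the category change.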
Replies: 3 · Boosts: 1 · Views: 748 · Activity: Nov ’25
RealityKit and USDZ: Winding Order Issue with Negatively Scaled Meshes
Hi all, I've encountered a potential issue with how the winding order of geometry is handled when transformations involve negative scaling. I created a simple test asset, a single triangle, to demonstrate this. The triangle's vertices are defined in a counter-clockwise ("right-handed") winding order, and its transform has a negative scale on the X-axis. According to the OpenUSD specification, this negative determinant in the transformation matrix should effectively reverse the winding order of the geometry:

"However, any given gprim's local-to-world transformation can flip its effective orientation, when it contains an odd number of negative scales. This condition can be reliably detected using the (Jacobian) determinant of the local-to-world transform: if the determinant is less than zero, then the gprim's orientation has been flipped, and therefore one must apply the opposite handedness rule when computing its surface normals (or just flip the computed normals) for the purposes of hidden surface detection and lighting calculations."

When I view the asset in tools like Blender or Preview on macOS, it behaves as expected: the triangle's effective orientation is flipped to CW. However, when the same asset is viewed in Reality Composer Pro or with QuickLook on iOS, its effective orientation remains CCW. In other words, the triangle faces the opposite direction.

My questions for the community and Apple are:

Is this behavior in RealityKit a known issue?
If this is a known issue, is there official guidance for DCC tools on how to export USDZ assets to ensure they appear correctly in the Apple ecosystem?

Any insights or recommendations would be greatly appreciated.
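To make the quoted rule concrete, here is a small sketch (my own illustration, not Apple or Pixar code) of the determinant test using simd:

import simd

// The USD rule quoted above, in code: if the 3x3 linear part of the
// local-to-world transform has a negative determinant, the gprim's
// effective orientation is flipped and the opposite winding rule applies.
func effectiveOrientationIsFlipped(localToWorld m: float4x4) -> Bool {
    let linear = float3x3(
        SIMD3(m.columns.0.x, m.columns.0.y, m.columns.0.z),
        SIMD3(m.columns.1.x, m.columns.1.y, m.columns.1.z),
        SIMD3(m.columns.2.x, m.columns.2.y, m.columns.2.z)
    )
    return linear.determinant < 0
}

// Example: a negative X scale flips a CCW triangle to effectively CW.
let flipX = float4x4(diagonal: SIMD4<Float>(-1, 1, 1, 1))
print(effectiveOrientationIsFlipped(localToWorld: flipX)) // true

Blender and Preview apparently apply this rule; the report above is that Reality Composer Pro and QuickLook do not.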
Replies: 5 · Boosts: 0 · Views: 974 · Activity: Nov ’25
Portals do not occlude CollisionComponent and InputTargetComponent
Hello. If you add a ModelEntity to a world inside a portal, the drawing of the model is properly occluded at the portal bounds. However, the invisible shapes of the InputTargetComponent and CollisionComponent are not occluded: they can cross the portal, and if you have gestures on your ModelEntity you can trigger them in areas outside the portal bounds. This happens even if the ModelEntity has no PortalCrossingComponent.
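A trimmed sketch of the setup being described, for anyone trying to reproduce (visionOS RealityKit; sizes and names are illustrative):

import RealityKit

func makePortalDemo() -> Entity {
    let root = Entity()

    // World that should only be visible/interactive through the portal.
    let world = Entity()
    world.components.set(WorldComponent())

    let model = ModelEntity(mesh: .generateBox(size: 0.3),
                            materials: [SimpleMaterial()])
    // Drawing of this model is clipped to the portal, but per the report
    // these two components are not, so gestures fire outside the bounds.
    model.components.set(CollisionComponent(shapes: [.generateBox(size: [0.3, 0.3, 0.3])]))
    model.components.set(InputTargetComponent())
    world.addChild(model)

    // The portal surface looking into the world. Note: no
    // PortalCrossingComponent anywhere.
    let portal = Entity()
    portal.components.set(ModelComponent(mesh: .generatePlane(width: 0.5, height: 0.5),
                                         materials: [PortalMaterial()]))
    portal.components.set(PortalComponent(target: world))

    root.addChild(world)
    root.addChild(portal)
    return root
}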
Replies: 0 · Boosts: 1 · Views: 86 · Activity: Mar ’25
High 4G TCP latency on iPhone 17 on China's mobile networks
Reproduction: same SIM card with 4G, same testing location, connected to the same server; debugging the game application in Xcode and viewing retransmissions and average round trip in the network profile.

iPhone 17, 4G off, Wi-Fi on: all the above indicators are acceptable.
iPhone 17, 4G on, Wi-Fi off: many retransmissions and a very high average round trip.
iPhone 14-16, 4G on, Wi-Fi off: all the above indicators are acceptable.

App: Unity3D project, .NET Framework 4.0, C# Socket.

Other: many developers on Chinese forums have reported this issue.
Replies: 2 · Boosts: 0 · Views: 773 · Activity: Oct ’25
Value of type 'SCRecordingOutput' has no member 'delegate'
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit, along with the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-move events only) need to be perfectly synced in order to add effects in post editing. I started off by calling await stream?.startCapture() and then starting my mouse-tracking function:

try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)

But every time I tested, there was a clear inconsistency in sync between the recorded video and the recorded mouse positions. What I really want is to know when exactly the recording "actually" started, so that I can start the mouse capture at that same time. I tried using the delegates, but I haven't been able to set them up properly.

import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?

    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // error: Value of type 'SCRecordingOutput' has no member 'delegate'
            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate
extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
        logger.error("Recording output failed with error: \(error.localizedDescription)")
    }
}

private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
    private let logger = Logger()

    override init() {
        super.init()
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
        guard sampleBuffer.isValid else { return }
        switch outputType {
        case .screen: break
        case .audio: break
        case .microphone: break
        @unknown default:
            logger.error("Encountered unknown stream output type")
        }
    }

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        logger.error("Stream stopped with error: \(error.localizedDescription)")
    }
}

I am getting the error "Value of type 'SCRecordingOutput' has no member 'delegate'", even though I am targeting macOS 15+ (macOS 26 actually) and macOS only. What is the best way to achieve the desired result? Is there another / better way to do it?
Replies: 1 · Boosts: 0 · Views: 256 · Activity: Oct ’25
Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a well-built package from the Apple Unity Plug-in project. When building my game project in Unity, the following error is printed to the console:

[Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS. Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS.

As far as I can tell the build did compile cleanly, but I might be missing something. If anyone can see what I'm doing wrong or has any insight, it would be greatly appreciated. The setup is the following:

macOS Tahoe 26 Beta
Xcode-beta Version 26.0 beta 3 (17A5276g)
Unity Plug-in branch: 2025-beta1
Unity game project version: 2022.3.60f
M1 MacBook Pro

The built packages were imported into the game project through the Unity Package Manager using the tarball option, pointing to the packages built from the Unity Plug-in project. The Unity Plug-in project was built with build.py as follows:

python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all

The output is available in the attached file. build-output.txt Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
Replies: 6 · Boosts: 1 · Views: 1.2k · Activity: Oct ’25
[Bug] iPadOS 26: 4-Finger Fast Tap/Swipe Gesture Not Detected (Multitouch Issue)
Hi everyone, I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap or swipe), the system often fails to recognize the input. This especially affects apps that rely on fast multi-touch input, such as rhythm games on difficult levels.

Steps to reproduce:
Place four fingers on the screen.
Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
Observe whether the gesture is ignored or detected inconsistently.

Expected behavior: 4-finger multi-touch gestures should be recognized regardless of gesture speed, as in previous iPadOS versions.

Actual behavior: gestures fail to be detected when executed quickly; the same gestures performed more slowly still work. In rhythm games this leads to missed notes.

You can check out my posts on Twitter/X: https://x.com/kokona_fwa/status/1978131164104728949?s=61 and Facebook: https://m.facebook.com/groups/idipad/permalink/24438964899058806/?
Replies: 1 · Boosts: 1 · Views: 1k · Activity: Oct ’25
How to use MetalPerformancePrimitives
I am trying to learn the new MetalPerformancePrimitives APIs. I have added the MetalPerformancePrimitives framework and included the header in my shader code as per the documentation:

#include <MetalPerformancePrimitives/MetalPerformancePrimitives.h>

Unfortunately, Xcode complains that the header cannot be found. How do I include it properly? I am using Xcode 26 on Tahoe. The MetalPerformancePrimitives framework is present on my machine and I can inspect the headers in the filesystem.
Replies: 3 · Boosts: 1 · Views: 746 · Activity: Oct ’25
Race conditions when changing CAMetalLayer.drawableSize?
Is the pseudocode below thread-safe? Imagine that the main thread sets the CAMetalLayer's drawableSize to a new size while the rendering thread is in the middle of rendering into an existing MTLDrawable that still has the old size. Is the change of metalLayer.drawableSize thread-safe in the sense that I can present an old MTLDrawable that has a different resolution than the current value of metalLayer.drawableSize? I assume that setting the drawableSize property informs Metal that the next MTLDrawable offered by the CAMetalLayer should have the new size, right? Is it valid to assume that metalLayer.drawableSize = newSize and metalLayer.nextDrawable() are internally synchronized, so it cannot happen that metalLayer.nextDrawable() would produce e.g. an MTLDrawable with the old width but the new height (or a completely invalid resolution due to a race)?

func onWindowResized(newSize: CGSize) {
    // Called on the Main thread
    metalLayer.drawableSize = newSize
}

func onVsync(drawable: MTLDrawable) {
    // Called on a background rendering thread
    renderer.renderInto(drawable: drawable)
}
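Not an authoritative answer, but one common pattern sidesteps the question entirely: record the new size on the main thread and apply it on the render thread immediately before nextDrawable(), so the mutation and the drawable request can never interleave. A sketch:

import QuartzCore

final class RenderLoop {
    private let metalLayer: CAMetalLayer
    private let pendingSizeLock = NSLock()
    private var pendingSize: CGSize?

    init(metalLayer: CAMetalLayer) {
        self.metalLayer = metalLayer
    }

    // Main thread: just record the new size instead of touching the layer.
    func onWindowResized(newSize: CGSize) {
        pendingSizeLock.lock()
        pendingSize = newSize
        pendingSizeLock.unlock()
    }

    // Render thread: apply any pending size right before requesting a drawable.
    func renderFrame(render: (CAMetalDrawable) -> Void) {
        pendingSizeLock.lock()
        if let size = pendingSize {
            metalLayer.drawableSize = size
            pendingSize = nil
        }
        pendingSizeLock.unlock()

        guard let drawable = metalLayer.nextDrawable() else { return }
        render(drawable)
        drawable.present()
    }
}

With this shape, a stale-sized drawable can still be in flight, but drawableSize is only ever mutated from the thread that calls nextDrawable().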
Replies: 1 · Boosts: 1 · Views: 489 · Activity: Dec ’25
Request to restore full ICC profile support (LUT-based display profiles) in macOS ColorSync
Dear Apple Color Management Team, I’m a professional visual creator working on color-critical photo and graphic projects using macOS (currently 26.1 Tahoe). In recent macOS releases, LUT-based ICC display profiles (such as XYZ LUT + Matrix types generated by DisplayCAL or professional spectrophotometers) can no longer be installed or activated via ColorSync. This limitation significantly affects professional workflows in photography, graphic design, prepress, and video color grading — fields that rely on precise display profiling. The current workaround (converting LUT profiles to simple shaper/matrix ICC v2) results in less accurate tone response and color reproduction, particularly in the dark range and wide-gamut displays. I kindly request Apple to restore or re-enable the ability to install and use ICC v2/v4 LUT-based display profiles under ColorSync, as was possible on macOS Monterey and Ventura. This would allow professionals to continue using trusted calibration tools such as DisplayCAL, X-Rite i1Profiler, and Calibrite Profiler to achieve accurate color management. macOS is widely used in professional creative industries, and restoring this feature would be a huge help for countless photographers, designers, and colorists. Thank you for your attention and commitment to professional users. Best regards, Richárd Deutsch Professional Photographer https://riccio.hu/ MacBook Pro (M4 Pro, macOS 26.1)
Replies: 7 · Boosts: 0 · Views: 1.6k · Activity: Dec ’25
Slow compilation
Hi, I am working on a large project. We compile each material to its own .metallib; they all include many common files full of inline functions. Finally, we link it all together at the end with a single big pathtrace kernel. Everything works as expected, but the compile times have gotten completely out of hand: compiling to native code at runtime takes multiple minutes. I gather that I can do this offline by using metal-tt; still, I am wondering whether there is a way to reduce the compile times in such a scenario, and how to investigate the root cause of the problem. I suspect it could have to do with the fact that every material's metallib contains duplicates of all the inline functions. Any ideas on how to profile and debug this? Thanks, Rasmus
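Not a full answer, but one commonly suggested mitigation for runtime pipeline-compilation cost is MTLBinaryArchive, which caches the GPU-native binaries across launches so only the first run pays full price. A sketch assuming a compute pipeline (the function name and cache path are placeholders):

import Metal

func makePipeline(device: MTLDevice, library: MTLLibrary) throws -> MTLComputePipelineState {
    let archiveURL = URL(fileURLWithPath: "/tmp/pipelines.bin") // placeholder cache path

    // Reuse a previously serialized archive when one exists.
    let archiveDescriptor = MTLBinaryArchiveDescriptor()
    if FileManager.default.fileExists(atPath: archiveURL.path) {
        archiveDescriptor.url = archiveURL
    }
    let archive = try device.makeBinaryArchive(descriptor: archiveDescriptor)

    let pipelineDescriptor = MTLComputePipelineDescriptor()
    pipelineDescriptor.computeFunction = library.makeFunction(name: "pathtraceKernel") // hypothetical name
    pipelineDescriptor.binaryArchives = [archive] // hit the cache when possible

    let pipeline = try device.makeComputePipelineState(descriptor: pipelineDescriptor,
                                                       options: [],
                                                       reflection: nil)

    // Record the compiled function and persist the archive for the next launch.
    try archive.addComputePipelineFunctions(descriptor: pipelineDescriptor)
    try archive.serialize(to: archiveURL)
    return pipeline
}

For finding where the time actually goes, wrapping the makeComputePipelineState calls with os_signpost and inspecting the shader-compilation activity in Instruments is a reasonable starting point.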
Replies: 0 · Boosts: 1 · Views: 129 · Activity: Mar ’25