Porting a heavyweight, industry-standard application like this can be daunting with the Game Porting Toolkit's default command-line tools and workflows, but a little greasing of the wheels never hurts:
installaware.com/blog/?p=828
Enjoy the terrific screenshots of UL's 3DMark running on bare metal (pun intended!) on macOS.
The amazing work done by Gokhan Avkarogullari in building Apple Silicon GPUs shines throughout the article.
The future of gaming on macOS is bright!
I have a Core Image filter in my app that uses Metal. I cannot compile it because the build complains that the executable tool metal is not available, even though I have installed it in Xcode.
If I go to the "Components" section of Xcode Settings, it shows it as downloaded. And if I run the suggested command, it also reports it as installed. Any advice?
Xcode Version
Version 26.0 beta (17A5241e)
Build Output
Showing All Errors Only
Build target Lessons of project StudyJapanese with configuration Light
RuleScriptExecution /Users/chris/Library/Developer/Xcode/DerivedData/StudyJapanese-glbneyedpsgxhscqueifpekwaofk/Build/Intermediates.noindex/StudyJapanese.build/Light-iphonesimulator/Lessons.build/DerivedSources/OtsuThresholdKernel.ci.air /Users/chris/Code/SerpentiSei/Shared/iOS/CoreImage/OtsuThresholdKernel.ci.metal normal undefined_arch (in target 'Lessons' from project 'StudyJapanese')
cd /Users/chris/Code/SerpentiSei/StudyJapanese
/bin/sh -c xcrun\ metal\ -w\ -c\ -fcikernel\ \"\$\{INPUT_FILE_PATH\}\"\ -o\ \"\$\{SCRIPT_OUTPUT_FILE_0\}\"'
'
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
/Users/chris/Code/SerpentiSei/StudyJapanese/error:1:1: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Build failed 6/9/25, 8:31 PM 27.1 seconds
Result of xcodebuild -downloadComponent MetalToolchain (after switching to Xcode-beta.app with xcode-select)
xcodebuild -downloadComponent MetalToolchain
Beginning asset download...
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/4d77809b60771042e514cfcf39662c6d1c195f7d.asset/AssetData/Restore/022-19457-035.dmg
Done downloading: Metal Toolchain (17A5241c).
Screenshots from Xcode
Result of "Copy Information"
Metal Toolchain 26.0 [com.apple.MobileAsset.MetalToolchain: 17.0 (17A5241c)] (Installed)
Hello,
I have noticed a performance drop in SpriteKit-based projects running on iOS 26.0 (23A341).
Below is a SpriteKit scene used to test framerate on different devices:
import SpriteKit
import SwiftUI

class BareboneScene: SKScene {
    override func didMove(to view: SKView) {
        // Minimal setup: match the view size and center the origin.
        size = view.bounds.size
        anchorPoint = CGPoint(x: 0.5, y: 0.5)
        backgroundColor = .darkGray

        // A single shape node running a repeating rotation.
        let roundedSquare = SKShapeNode(rectOf: CGSize(width: 150, height: 75), cornerRadius: 12)
        roundedSquare.fillColor = .systemRed
        roundedSquare.strokeColor = .black
        roundedSquare.lineWidth = 3
        addChild(roundedSquare)

        let action = SKAction.rotate(byAngle: .pi, duration: 1)
        roundedSquare.run(.repeatForever(action))
    }
}

struct BareboneSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}

#Preview {
    BareboneSceneView()
}
The scene is very simple, yet the framerate drops to ~40 fps, as shown by the Metal HUD. Tested on:
iPhone 13, iOS 26.0: framerate drops to ~40 fps. Sometimes it runs at nearly 60 fps, but if the screen is touched repeatedly, the framerate drops back to 40-50 fps.
iPhone 11 Pro, iOS 26.0: ~40 fps.
iPad 9th Gen, iOS 18.6.2: 60 fps, no issues.
See screenshots attached. These numbers were observed by me and by members of our beloved SpriteKit Discord server.
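One variation worth testing (a minimal sketch using only the public SpriteView API; preferredFramesPerSecond defaults to 60, so this just makes the request explicit and rules out the system negotiating a lower rate):

struct PinnedRateSceneView: View {
    var body: some View {
        SpriteView(
            scene: BareboneScene(),
            preferredFramesPerSecond: 60,  // explicit 60 fps request
            debugOptions: [.showsFPS]
        )
        .ignoresSafeArea()
    }
}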
Thank you for your attention.
Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins
Confirmed: PolySpatial Doubles MeshFilter Count – Hard Limit at ~8k Active Objects (15.9k Total)
Project Context & Research Goals
I’m developing an industrial digital twin application for Apple Vision Pro using Unity’s PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with:
Production line equipment (many fragmented grid objects that need to be merged)
Dynamic product racks (state-switchable assets)
Animated worker avatars
To optimize performance, I’m systematically testing visionOS’s rendering capacity limits. Through controlled stress tests, I’ve identified a critical threshold:
Key Finding
When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests:
PolySpatial’s mirroring mechanism effectively doubles GameObject overhead
An apparent hard limit exists around ~8k active mesh objects in practice
Objectives for This Discussion
Verify if others have encountered similar limits with PolySpatial/RealityKit
Understand whether this is a:
Memory constraint (per-app allocation)
Render pipeline limit (Metal draw calls)
Unity-specific PolySpatial behavior
Explore optimization strategies beyond brute-force object reduction
Why This Matters
Industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team:
Design safer content guidelines
Prioritize GPU instancing/LOD investments
Potentially contribute back to PolySpatial’s optimization
I’d appreciate insights from engineers who’ve:
Pushed similar large-scale scenes in visionOS
Worked around PolySpatial’s cloning overhead
Discovered alternative capacity limits (vertices/draw calls)
Hi everyone,
This project uses PyTorch on an Apple Silicon Mac (M1/M2/etc.), and the goal is to use the MPS backend for GPU acceleration. However, the workflow depends on Float64 (double-precision) floating-point numbers for certain computations.
The error "Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead" has been encountered. It seems that the MPS backend doesn't currently support Float64 for direct GPU computation.
Questions for the community:
Are there any known workarounds or best practices for handling Float64-dependent operations when using the MPS backend with PyTorch?
For those working with high-precision tasks on Apple Silicon, what strategies are being used to balance performance with the need for Float64?
Offloading to the CPU is an option; it would be good to know if there are any specific techniques or libraries within the Apple ecosystem that can streamline this process while aiming for optimal performance.
Any insights, tips, or experiences would be appreciated.
Thanks in advance,
Jonaid
MacBook Pro M3 Max
Topic:
Graphics & Games
SubTopic:
Metal
Hi,
I have a Unity game. I need multiple app icons so the game can be recognized in different countries.
In other words, is it possible to have an iOS app in which the App Icon changes based on device locale/language?
On Android this is possible using the Unity Localization package ("com.unity.localization").
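As far as I know, iOS has no install-time, locale-based icon selection like Android's; the closest native mechanism is the alternate-icon API, which a Unity plug-in could call after launch. A minimal sketch of the native side (the icon name AppIcon_ja is hypothetical and must be declared under CFBundleIcons > CFBundleAlternateIcons in Info.plist):

import UIKit

// Sketch: switch to a locale-specific alternate icon at runtime.
// "AppIcon_ja" is a hypothetical name registered in Info.plist.
func applyLocalizedIconIfNeeded() {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    // Locale.current.language requires iOS 16+.
    let target: String? = Locale.current.language.languageCode?.identifier == "ja"
        ? "AppIcon_ja" : nil  // nil restores the primary icon
    guard UIApplication.shared.alternateIconName != target else { return }
    UIApplication.shared.setAlternateIconName(target) { error in
        if let error { print("Icon switch failed: \(error)") }
    }
}

Note that alternate icons can only change after first launch and the system shows a confirmation alert, so this is an approximation rather than a true locale-based install icon.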
Topic:
Graphics & Games
SubTopic:
General
Is it possible to start screen recording (through Control Center) without a user prompt?
I mean, can I ask the user for permission the first time only, and after that start and stop recording programmatically?
I need to record the screen only for specific events.
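For context, the only programmatic route I'm aware of is ReplayKit's in-app recorder rather than the Control Center recorder. A minimal sketch (to my knowledge the system may still show a consent prompt per recording session, so this may not fully avoid prompts):

import ReplayKit

// Sketch: in-app recording via ReplayKit. This records the app's own
// screen, not the system-wide Control Center recording.
func recordEvent(stopAfter seconds: TimeInterval) {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable, !recorder.isRecording else { return }
    recorder.startRecording { error in
        if let error { print("Start failed: \(error)"); return }
        DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
            recorder.stopRecording { previewController, error in
                // Present the preview or save the recording here.
            }
        }
    }
}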
InstallAware's Application Porting Toolkit embraces the ideas first introduced by the Apple Game Porting Toolkit, extending support to GNU/Linux and line-of-business applications.
Bolstered by native-code, 100% GUI installations, the porting happens through a fully graphical, guided process that doesn't require any programming or command-line skills - making it far easier to learn and use than the Apple Game Porting Toolkit.
Download your free step-by-step guide at: installaware.com/application-porting-toolkit.pdf
Star the sources and build your own at: github.com/installaware/iamp
On Macs, the Application Porting Toolkit does fully support 3D gaming by atomically installing the Apple Game Porting Toolkit as a standard application runtime, together with the actual game (or Windows application) being ported. Thanks to the inclusion of pre-built GPTk binaries, the installation is instantaneous - without lengthy downloads or local recompiles. You get to customize the instance name of this Windows compatibility layer, too!
Hi,
I can't see RealityKit statistics on Xcode Canvas using:
arView.debugOptions = [.showStatistics]
The statistics only show on a physical device, not in the Xcode live canvas with #Preview. Testing in Xcode 26.0.1 (17A400) on Tahoe 26.0.1 (25A362).
Use case: I'm using RealityKit as a non-AR 3D engine. Xcode Canvas is useful for live iterations.
Is this expected behavior? How can I see FPS on the Xcode canvas? SKView, for example, shows all debug options on both the Xcode Canvas and physical devices.
Topic:
Graphics & Games
SubTopic:
RealityKit
Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a correctly built package from the Apple Unity Plug-in project.
When building my game project in Unity, the following error is printed to the console:
Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS.
Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS.
As far as I can tell the build did compile cleanly, but I might be missing something.
If anyone can see what I'm doing wrong or has any insight it would be greatly appreciated.
Setup is the following:
macOS Tahoe 26 Beta
Xcode-beta Version 26.0 beta 3 (17A5276g)
Unity Plug-in branch: 2025-beta1
Unity game project version: 2022.3.60f
M1 MacBook Pro
The built packages have been imported into the game project through the Unity Package Manager using the tarball option pointing to the built packages from the Unity Plug-in project.
The Unity Plug-in project has been built using the build.py file with the following:
python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all
The output is available in the attached file.
build-output.txt
Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
Reproduce
Same SIM card with 4G, same testing location, connected to the same server; debugging the game application in Xcode, using the Network profile to view retransmissions and avg. round-trip time:
iPhone 17, 4G off, Wi-Fi on: all of the above indicators are acceptable.
iPhone 17, 4G on, Wi-Fi off: heavy retransmission and a very high avg. round-trip time.
iPhone 14-16, 4G on, Wi-Fi off: all of the above indicators are acceptable.
App
Unity3D project
.NET Framework 4.0
C# Socket
Other
Many developers on Chinese forums have reported this issue.
Hello,
On macOS 26 (Tahoe), when building an OSX app that includes GameKit code, calling GKLocalPlayer.local.authenticateHandler shows the "Sign In to Game Center" alert (e.g. didShowFullscreenSignIn), even if the app does not have the Game Center capability enabled or any related entitlement (com.apple.developer.game-center).
This alert only appears when the user is not signed in to Game Center in system settings.
However, when testing the same code path in an iOS app built with macOS 26 (Tahoe), the alert does not appear unless the proper capability and entitlement are included.
This behavior is different from macOS 15 (Sequoia) + Xcode 15.x. Prior to the update, Game Center features did not work at all in an OSX app lacking the capability and entitlements.
Steps to Reproduce
Create a new OSX app target (App Sandbox enabled, no Game Center capability).
Add minimal GameKit code:
GKLocalPlayer.local.authenticateHandler = { _, _ in }
Build OSX app and run on macOS 26 (Tahoe).
Ensure Game Center is signed out in System Settings.
Observe: “Sign In to Game Center” alert appears automatically.
Expected Behavior
When Game Center capability and entitlement are not present, authenticateHandler should fail silently, and no signIn alert should appear.
Actual Behavior
On OSX app, the Game Center signIn UI appears even without any Game Center capability or entitlement.
On iOS app, this alert does not appear.
Build configuration: both apps were built under the same conditions (macOS 26 + Xcode 26).
Question
Could you please confirm whether this behavior is an intentional change in macOS 26, or a bug affecting only OSX apps in the GameKit authentication flow?
Thank you.
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit, along with the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-movement events only) need to be perfectly synced in order to add effects in post editing.
I started off by calling await stream?.startCapture() and starting my mouse-tracking function right after:
try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)
But every time I tested, there was a clear inconsistency in sync between the recorded video and the recorded mouse positions.
All I want is to know exactly when the recording "actually" started, so that I can start the mouse capture at that same moment. I tried using the delegates for this, but I haven't been able to set them up properly.
import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?
    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)
            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // this line produces the compile error quoted below
            try stream?.addRecordingOutput(recordingOutput)
            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate
extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
        logger.error("Recording output failed with error: \(error.localizedDescription)")
    }
}

private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
    private let logger = Logger()

    override init() {
        super.init()
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
        guard sampleBuffer.isValid else { return }
        switch outputType {
        case .screen:
            break
        case .audio:
            break
        case .microphone:
            break
        @unknown default:
            logger.error("Encountered unknown stream output type:")
        }
    }

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        logger.error("Stream stopped with error: \(error.localizedDescription)")
    }
}
I am getting the error:
Value of type 'SCRecordingOutput' has no member 'delegate'
even though I am targeting macOS 15+ (macOS 26, actually) and macOS only.
What is the best way to achieve the desired result? Is there another or better way to do it?
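From the SDK headers, it looks like SCRecordingOutput takes its delegate in the initializer rather than exposing a settable property, which would explain the compile error. Something like this might be the fix (a sketch only, assuming the macOS 15+ SCRecordingOutput(configuration:delegate:) initializer; recordingConfig stands in for whatever SCRecordingOutputConfiguration the app already builds):

// Sketch: supply the delegate at init time instead of assigning it later.
let recordingConfig = SCRecordingOutputConfiguration()
recordingConfig.outputURL = URL(fileURLWithPath: "/tmp/output.mp4")  // example path
let recordingOutput = SCRecordingOutput(configuration: recordingConfig, delegate: self)
try stream?.addRecordingOutput(recordingOutput)
// recordingOutputDidStartRecording(_:) then marks the moment the file
// actually starts recording, which is the clock to align mouse tracking to.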
Hi — we’re testing our app on iOS 26 and ran into strange behavior with GKLocalPlayer.local.authenticateHandler.
GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
// additional code
}
What happens:
When we assign authenticateHandler on iOS 26 and the user is not signed in to Game Center, the system shows a full-screen Game Center overlay asking the user to sign in.
If the user taps Cancel, nothing further happens — the closure is not invoked again, so we don’t receive an error or any callback. The app never learns whether the auth was cancelled or failed.
In previous iOS versions the closure was called (with viewController / error as appropriate) and the flow worked as expected.
What we tried:
Verified authenticateHandler is being set.
Checked GKLocalPlayer.local.isAuthenticated after the overlay dismisses — it’s unchanged.
Observed system logs: a com.apple.GameOverlayUI scene is created and later removed (so the auth overlay is shown by the system).
Confirmed the same code works on earlier iOS versions.
Question:
Has anyone seen authenticateHandler not being invoked on iOS 26 when the Game Center auth overlay is presented? Could this be a behavioral change in iOS 26 (overlay runs in a separate system process), or a bug? Any suggested workarounds to reliably detect that the user cancelled the sign-in (for example: listening for willResignActive / didBecomeActive, watching for a system overlay, or saving/presenting the viewController manually)?
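One diagnostic we're considering, sketching the didBecomeActive idea above (a heuristic only; inferring cancellation this way is an assumption, not an official signal):

import UIKit
import GameKit

// Sketch: infer a cancelled sign-in by noticing the app becoming
// active again while the local player is still unauthenticated.
final class GameCenterAuthWatcher: NSObject {
    private var awaitingAuth = false

    func startAuthentication() {
        awaitingAuth = true
        GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
            self?.awaitingAuth = false
            // handle viewController / error as before
        }
        NotificationCenter.default.addObserver(
            self, selector: #selector(appDidBecomeActive),
            name: UIApplication.didBecomeActiveNotification, object: nil
        )
    }

    @objc private func appDidBecomeActive() {
        if awaitingAuth && !GKLocalPlayer.local.isAuthenticated {
            awaitingAuth = false
            // Treat as "user likely cancelled the system sign-in overlay".
        }
    }
}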
Thanks in advance for any advice; we'd appreciate pointers or suggested diagnostics.
I'm experiencing an issue with PDFKit where page.removeAnnotation(annotation) successfully removes the annotation from the page's data structure, but the PDFView no longer updates automatically to reflect the change visually.
Issue Details:
The annotation is removed (verified by checking page.annotations.count)
The PDFView display doesn't refresh to show the removal
This code was working correctly before and suddenly stopped working
No code changes were made on my end
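A workaround worth testing (a sketch; annotationsChanged(on:) is PDFView's explicit change-notification API, and the pdfView reference is assumed to be in scope):

import PDFKit

// Sketch: remove the annotation, then explicitly tell the view about it.
func removeAndRefresh(_ annotation: PDFAnnotation, from page: PDFPage, in pdfView: PDFView) {
    page.removeAnnotation(annotation)
    pdfView.annotationsChanged(on: page)      // notify the view of the change
    pdfView.setNeedsDisplay(pdfView.bounds)   // belt-and-braces redraw
}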
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles.
I'm not doing anything unusual here – typical code is:
sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES];
Then at the appropriate time:
[self runAction:sndGameOver];
Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe?
I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't.
Suggestions welcomed!
Thanks
Hi everyone,
I've hit an issue where, on iOS 26, the removeAnnotation method doesn't remove annotations. This code worked on previous versions (iOS 17, 18) but suddenly stopped working on iOS 26.
Has anyone faced this issue?
guard let document = await pdfView.document else { return }
for pageIndex in 0..<document.pageCount {
    guard let page = document.page(at: pageIndex) else { continue }
    let annotations = page.annotations
    for annotation in annotations {
        page.removeAnnotation(annotation)
    }
}
Hello,
I've been trying to leverage instanced rendering in RealityKit on visionOS but have not had success.
RealityKit states this is supported:
https://developer.apple.com/documentation/realitykit/validating-usd-files
https://developer.apple.com/videos/play/wwdc2021/10075/?time=1373
https://developer.apple.com/videos/play/wwdc2023/10099/?time=772
RealityKit Trace metrics
Validating instancing is working:
To test, I made a bare visionOS app with an immersive space, replacing the default entity with my test usdz file. I've been profiling with the RealityKit Trace template in Xcode Instruments, in the immersive space with the volume closed. This gives consistent draw call results.
If I have a single sphere mesh with one material I get one draw call, but the number of draw calls grows linearly with mesh count no matter how my entity is configured.
What I've tried
Create a test scene in blender, export with instancing enabled
Create a test scene in Reality Composer Pro using references
Author usda files by hand based on the OpenUSD spec
Programmatically create a MeshResource with Contents at runtime (see also the sketch after this list)
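For concreteness, the kind of shared-resource baseline I would expect to batch looks like this (a hypothetical sketch, not one of the exact test scenes above; clone(recursive:) copies share the underlying MeshResource and material):

import RealityKit

// Sketch: 100 clones of one ModelEntity sharing a single mesh/material.
let base = ModelEntity(
    mesh: .generateSphere(radius: 0.05),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)
let root = Entity()
for i in 0..<100 {
    let copy = base.clone(recursive: false)
    copy.position = [Float(i % 10) * 0.2, Float(i / 10) * 0.2, 0]
    root.addChild(copy)
}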
References
https://openusd.org/release/api/_usd__page__scenegraph_instancing.html
https://developer.apple.com/documentation/realitykit/meshresource
https://developer.apple.com/documentation/realitykit/meshresource/instance
Thank you
Hi everyone
I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap and swipe), the system often fails to recognize the input. This especially affects multi-touch-heavy apps, such as rhythm games on difficult levels.
Steps to Reproduce:
Place four fingers on the screen.
Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
Observe whether the gesture is ignored or detected inconsistently.
Expected Behavior:
Four-finger multi-touch gestures should be recognized regardless of gesture speed, just like in previous iPadOS versions.
Actual Behavior:
Gestures fail to be detected when executed quickly; the same gestures performed more slowly are still recognized. In rhythm games this results in missed notes.
You can check out my posts on Twitter/X and Facebook:
Twitter/X: https://x.com/kokona_fwa/status/1978131164104728949?s=61
Facebook: https://m.facebook.com/groups/idipad/permalink/24438964899058806/?