I am trying to load some PNG data with MTKTextureLoader's newTextureWithData, but the result is wrong in the areas that have alpha.
Here is the code. I have an image URL; after it downloads successfully, I try to use the raw data or UIImagePNGRepresentation(image), but both results are wrong.
// url and device are assumed to be defined earlier; the texture loading
// happens in the download completion handler.
[[[NSURLSession sharedSession] dataTaskWithURL:url completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    UIImage *tempImg = [UIImage imageWithData:data];
    CGImageRef cgRef = tempImg.CGImage;
    MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];
    NSDictionary *options = @{MTKTextureLoaderOptionSRGB: @(NO),
                              MTKTextureLoaderOptionTextureUsage: @(MTLTextureUsageShaderRead),
                              MTKTextureLoaderOptionTextureCPUCacheMode: @(MTLCPUCacheModeWriteCombined)};
    // 1. Texture created directly from the downloaded PNG data.
    id<MTLTexture> temp1 = [loader newTextureWithData:data options:options error:nil];
    // 2. Texture created from the data re-encoded by UIImagePNGRepresentation.
    NSData *tempData = UIImagePNGRepresentation(tempImg);
    id<MTLTexture> temp2 = [loader newTextureWithData:tempData options:options error:nil];
    // 3. Texture created from the CGImage.
    id<MTLTexture> temp3 = [loader newTextureWithCGImage:cgRef options:options error:nil];
}] resume];
Hi,
Introducing Swift Concurrency to my Metal app has been a bit challenging, since Swift Concurrency is limited by the cooperative thread pool.
GPU work is obviously not CPU bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks.
My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way to run Metal-bound tasks for maximum performance.
To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or OperationQueues. Would this be the best way to bridge async tasks with Metal work?
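A sketch of that bridge, assuming a dedicated dispatch queue owns the encoding (the queue label and structure are mine, not from any Apple guidance). The command buffer's completion handler resumes the continuation, so no cooperative-pool thread ever blocks in waitUntilCompleted:

import Foundation
import Metal

let renderQueue = DispatchQueue(label: "com.example.render", qos: .userInteractive)

func renderFrame(commandQueue: MTLCommandQueue) async -> MTLCommandBuffer? {
    await withCheckedContinuation { continuation in
        renderQueue.async {
            guard let commandBuffer = commandQueue.makeCommandBuffer() else {
                continuation.resume(returning: nil)
                return
            }
            // ... encode render passes here ...
            // Resume from the GPU completion handler instead of calling
            // waitUntilCompleted, so no thread blocks on the GPU.
            commandBuffer.addCompletedHandler { completed in
                continuation.resume(returning: completed)
            }
            commandBuffer.commit()
        }
    }
}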
Thanks!
I am working on a custom resolve tile shader for a client. I see a big difference in performance depending on where we write to:
1. The resolve texture of the color attachment
2. A read-write tile shader texture set via [renderEncoder setTileTexture:myResolvedTexture]
Option 2 is more than twice as slow as option 1.
Our compute shader writes to 4 UAVs, so just using the resolve texture entry is not possible.
Why such a big difference when no additional data is being written? Can option 2 be made as fast as option 1?
I can demonstrate the issue in a modified version of the Multisample code sample.
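For reference, the option 2 encoding path looks roughly like this on the host side (a sketch with placeholder names, not the client's actual code):

import Metal

// Bind a read_write texture as a tile texture and dispatch the resolve
// kernel once per tile. Option 1 instead has the tile function write to
// the pass's resolve attachment.
func encodeCustomResolve(encoder: MTLRenderCommandEncoder,
                         tilePipeline: MTLRenderPipelineState,
                         tileSize: MTLSize,
                         myResolvedTexture: MTLTexture) {
    encoder.setRenderPipelineState(tilePipeline)
    // Option 2: the tile function writes resolved samples here.
    encoder.setTileTexture(myResolvedTexture, index: 0)
    encoder.dispatchThreadsPerTile(tileSize)
}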
Apple, please bring back SceneKit.
In SceneKit, when creating an .scn file from a rigged model, the framework created an SCNNode for each bone/joint, so you could add and remove child nodes directly to and from joints, and like any other SCNNode, you could access world position and world orientation for each joint. The analog would be for joints to be accessible as child entities of a ModelEntity in RealityKit. I am unable to proceed with migrating my project from SceneKit because of this, as there does not seem to be a way to even access the true world position of a joint with the current jointNames/jointTransforms paradigm.
The translation information from the given transforms is insufficient to determine the location of a joint at any given time, and other approaches, like creating a GeometricPin for the given joint name and attaching it to another entity, do not seem to work. Attaching an item to the hand of a rigged model, which was trivial in SceneKit, now feels impossible in RealityKit.
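The closest workaround I've found is composing the joint's local transforms myself. A sketch: it assumes jointNames holds full slash-separated ancestor paths (e.g. "root/hips/hand_L") in the same order as jointTransforms, which held for the rigs I've inspected but may not for every asset:

import RealityKit
import simd

extension ModelEntity {
    /// Compose local joint transforms along the path to approximate the
    /// joint's transform in this entity's coordinate space.
    func jointModelTransform(named jointPath: String) -> simd_float4x4? {
        guard jointNames.contains(jointPath) else { return nil }
        var matrix = matrix_identity_float4x4
        let components = jointPath.split(separator: "/")
        // Multiply each ancestor's local transform from the root down.
        for depth in 1...components.count {
            let ancestorPath = components[0..<depth].joined(separator: "/")
            if let index = jointNames.firstIndex(of: ancestorPath) {
                matrix *= jointTransforms[index].matrix
            }
        }
        return matrix
    }
}

The world transform would then come from entity.transformMatrix(relativeTo: nil) multiplied by this matrix, but this is an approximation, not a supported API.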
I am not the first person to notice this, and I am feeling demoralized about proceeding with RealityKit with such a critical piece of functionality blocked:
https://stackoverflow.com/questions/76726241/how-do-i-attach-an-entity-to-a-skeletons-joint-in-realitykit
Will this be addressed in some way?
I'm updating our app to support Metal 4, but the Metal 4 types don't seem to be recognized when targeting the simulator. Is it known whether Metal 4 will be supported in the simulator in the near future, or am I setting up the app wrong?
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5?
I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:
content.camera = .worldTracking
However, device position and orientation were ignored (the camera remained static), and there was no camera pass-through. Then I discovered that the issue disappears when I remove the following lines:
GKLocalPlayer.local.authenticateHandler = { viewController, error in
// ... some more code ...
}
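For context, a minimal sketch of the setup (simplified; the view structure and names are placeholders, not my actual app):

import SwiftUI
import RealityKit
import GameKit

struct TrackingView: View {
    var body: some View {
        RealityView { content in
            // Enable world tracking with camera pass-through on iOS/iPadOS.
            content.camera = .worldTracking

            let box = ModelEntity(mesh: .generateBox(size: 0.1))
            box.position = [0, 0, -0.5]
            content.add(box)
        }
        .onAppear {
            // With this handler set, world tracking stops working on the
            // iOS/iPadOS 18 betas; removing it restores tracking.
            GKLocalPlayer.local.authenticateHandler = { viewController, error in
                // ... some more code ...
            }
        }
    }
}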
So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
Hello
XQuartz is an open-source effort to develop a version of the X.Org X Window System for macOS (https://www.xquartz.org/), widely used to bring graphical support to applications running on remote servers (usually via SSH).
Since macOS Tahoe, XQuartz has failed to refresh properly on window resize (more info here: https://github.com/XQuartz/XQuartz/issues/438#issuecomment-3371409500), leading to severe usability issues.
The XQuartz developers are already aware of the issue, but I’m wondering if there’s anything we can do at the OS level to resolve it and restore the usual behavior from before macOS Tahoe.
Thanks,
KiM
Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins
Confirmed: PolySpatial Doubles the MeshFilter Count – Hard Limit at ~8k Active Objects (15.9k Total)
Project Context & Research Goals
I’m developing an industrial digital twin application for Apple Vision Pro using Unity’s PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with:
Production line equipment (many fragmented mesh objects that need to be merged)
Dynamic product racks (state-switchable assets)
Animated worker avatars
To optimize performance, I’m systematically testing visionOS’s rendering capacity limits. Through controlled stress tests, I’ve identified a critical threshold:
Key Finding
When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests:
PolySpatial’s mirroring mechanism effectively doubles GameObject overhead
An apparent hard limit exists around ~8k active mesh objects in practice
Objectives for This Discussion
Verify if others have encountered similar limits with PolySpatial/RealityKit
Understand whether this is a:
Memory constraint (per-app allocation)
Render pipeline limit (Metal draw calls)
Unity-specific PolySpatial behavior
Explore optimization strategies beyond brute-force object reduction
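To help separate a RealityKit/Metal ceiling from Unity-specific PolySpatial behavior, I've been considering a pure RealityKit baseline along these lines (a sketch with an assumed layout, one shared mesh and material, no PolySpatial mirroring):

import RealityKit

// Hypothetical baseline stress test: N entities sharing one mesh and one
// material, added directly in RealityKit. Comparing where this setup
// degrades vs. the Unity build should indicate how much overhead is
// PolySpatial-specific.
func makeStressTestScene(count: Int) -> Entity {
    let root = Entity()
    let mesh = MeshResource.generateBox(size: 0.05)
    let material = SimpleMaterial(color: .gray, isMetallic: false)
    for i in 0..<count {
        let entity = ModelEntity(mesh: mesh, materials: [material])
        // Lay entities out in a 100-wide grid so they stay visible.
        entity.position = [Float(i % 100) * 0.06, Float(i / 100) * 0.06, -1.0]
        root.addChild(entity)
    }
    return root
}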
Why This Matters
Industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team:
Design safer content guidelines
Prioritize GPU instancing/LOD investments
Potentially contribute back to PolySpatial’s optimization
I’d appreciate insights from engineers who’ve:
Pushed similar large-scale scenes in visionOS
Worked around PolySpatial’s cloning overhead
Discovered alternative capacity limits (vertices/draw calls)
In my project I need to do the following:
At runtime, create a Metal dynamic library from source.
At runtime, create a Metal executable library from source and link it with the previously created dynamic library.
Create a compute pipeline using the two libraries created above.
But I get the following error at the third step:
Error Domain=AGXMetalG15X_M1 Code=2 "Undefined symbols:
_Z5noisev, referenced from: OnTheFlyKernel
" UserInfo={NSLocalizedDescription=Undefined symbols:
_Z5noisev, referenced from: OnTheFlyKernel
}
import Foundation
import Metal

class MetalShaderCompiler {
    let device = MTLCreateSystemDefaultDevice()!
    var pipeline: MTLComputePipelineState!

    // Step 1: compile a dynamic library that exports noise().
    func compileDylib() -> MTLDynamicLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        half3 noise() {
            return half3(1, 0, 1);
        }
        """
        let option = MTLCompileOptions()
        option.libraryType = .dynamic
        option.installName = "@executable_path/libFoundation.metallib"
        let library = try! device.makeLibrary(source: source, options: option)
        let dylib = try! device.makeDynamicLibrary(library: library)
        return dylib
    }

    // Step 2: compile an executable library whose kernel calls noise(),
    // linking against the dynamic library.
    func compileExlib(dylib: MTLDynamicLibrary) -> MTLLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        extern half3 noise();

        kernel void OnTheFlyKernel(texture2d<half, access::read> src [[texture(0)]],
                                   texture2d<half, access::write> dst [[texture(1)]],
                                   ushort2 gid [[thread_position_in_grid]]) {
            half4 rgba = src.read(gid);
            rgba.rgb += noise();
            dst.write(rgba, gid);
        }
        """
        let option = MTLCompileOptions()
        option.libraryType = .executable
        option.libraries = [dylib]
        let library = try! self.device.makeLibrary(source: source, options: option)
        return library
    }

    // Step 3: build the compute pipeline; this is where the undefined
    // symbol error is thrown.
    func runtime() {
        let dylib = self.compileDylib()
        let exlib = self.compileExlib(dylib: dylib)
        let pipelineDescriptor = MTLComputePipelineDescriptor()
        pipelineDescriptor.computeFunction = exlib.makeFunction(name: "OnTheFlyKernel")
        pipelineDescriptor.preloadedLibraries = [dylib]
        pipeline = try! device.makeComputePipelineState(descriptor: pipelineDescriptor, options: .bindingInfo, reflection: nil)
    }
}
Hi! I watched the WWDC25 session "Bring your SceneKit project to RealityKit" which seemed like a great resource for those of us transitioning from the now-deprecated SceneKit framework. The session mentioned that the full sample code for the project would be available to download, but I haven't been able to find it in the Code section of the video page or in the Sample Code Library.
Has the sample code been released yet? Having the project code would make it much easier to follow along with the RealityKit changes shown in the video. Thanks again for the great session.
I'm building a game with a client-server architecture. Using GKMatch.chooseBestHostingPlayer(_:) rarely works. When I started testing it today, it worked once at the very beginning; since then it has always succeeded on one client and returned nil on the other. I'm testing with a Mac and an iPhone. Sometimes it fails on the Mac, sometimes on the iPhone. On the device where it succeeds, the chosen host can be the device itself or the other one.
I created FB9583628 in August 2021, but after the Feedback Assistant team replied that they were not able to reproduce it, the feedback never went anywhere.
import SceneKit
import GameKit

#if os(macOS)
typealias ViewController = NSViewController
#else
typealias ViewController = UIViewController
#endif

class GameViewController: ViewController, GKMatchmakerViewControllerDelegate, GKMatchDelegate {
    var match: GKMatch?
    var matchStarted = false

    override func viewDidLoad() {
        super.viewDidLoad()
        GKLocalPlayer.local.authenticateHandler = authenticate
    }

    private func authenticate(_ viewController: ViewController?, _ error: Error?) {
        #if os(macOS)
        if let viewController = viewController {
            presentAsSheet(viewController)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            GKDialogController.shared().present(viewController)
        }
        #else
        if let viewController = viewController {
            present(viewController, animated: true)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            present(viewController, animated: true)
        }
        #endif
    }

    private func defaultMatchRequest() -> GKMatchRequest {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2
        request.defaultNumberOfPlayers = 2
        request.inviteMessage = "Ciao!"
        return request
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        print("cancelled")
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        print(error)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        self.match = match
        match.delegate = self
        startMatch()
    }

    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("\(player.gamePlayerID) changed state to \(String(describing: state))")
        startMatch()
    }

    func startMatch() {
        let match = match!
        if matchStarted || match.expectedPlayerCount > 0 {
            return
        }
        print("starting match with local player \(GKLocalPlayer.local.gamePlayerID) and remote players \(match.players.map({ $0.gamePlayerID }))")
        match.chooseBestHostingPlayer { host in
            print("host is \(String(describing: host?.gamePlayerID))")
        }
    }
}
Hello,
I've been trying to leverage instanced rendering in RealityKit on visionOS but have not had success.
RealityKit states this is supported:
https://developer.apple.com/documentation/realitykit/validating-usd-files
https://developer.apple.com/videos/play/wwdc2021/10075/?time=1373
https://developer.apple.com/videos/play/wwdc2023/10099/?time=772
RealityKit Trace metrics
Validating instancing is working:
To test, I made a basic visionOS app with an immersive space and replaced the default entity with my test usdz file. I've been profiling with the RealityKit Trace template in Xcode Instruments, in the immersive space with the volume closed. This gives consistent draw call results.
If I have a single sphere mesh with one material, I get one draw call, but the number of draw calls grows linearly with mesh count no matter how my entity is configured.
What I've tried
Create a test scene in Blender and export it with instancing enabled
Create a test scene in Reality Composer Pro using references
Author usda files by hand based on the OpenUSD spec
Programmatically create a MeshResource with Contents at runtime (see the sketch below)
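For the programmatic attempt, this is the shape of it (an illustrative helper, names are mine): one model in Contents, referenced by many Instance entries. Whether RealityKit then batches these into one draw call on visionOS is exactly what the Trace numbers measure.

import RealityKit
import simd

func makeInstancedMesh(from base: MeshResource, count: Int) throws -> MeshResource {
    var contents = base.contents
    guard let modelID = contents.models.map(\.id).first else { return base }
    // One Instance entry per copy, each referencing the same model.
    let instances = (0..<count).map { index -> MeshResource.Instance in
        let transform = Transform(translation: [Float(index) * 0.2, 0, 0])
        return MeshResource.Instance(id: "instance-\(index)", model: modelID, at: transform.matrix)
    }
    contents.instances = MeshInstanceCollection(instances)
    return try MeshResource.generate(from: contents)
}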
References
https://openusd.org/release/api/_usd__page__scenegraph_instancing.html
https://developer.apple.com/documentation/realitykit/meshresource
https://developer.apple.com/documentation/realitykit/meshresource/instance
Thank you
Hello,
I recently watched the WWDC2025 session titled "Combine Metal 4 machine learning and graphics" (https://developer.apple.com/videos/play/wwdc2025/262/), and I'm very excited about the new Metal 4 features that integrate machine learning with graphics, such as neural ambient occlusion, shader-based ML inference, and the use of MTLTensor and MTL4MachineLearningCommandEncoder.
While the session includes helpful code snippets and a compelling debug demo (e.g., the neural ambient occlusion example), the implementation details are not fully shown, and I haven’t been able to find a complete, runnable sample project that demonstrates end-to-end integration of ML and rendering in Metal 4.
Would Apple be able to provide a full, working example, such as an Xcode project, that shows how to:
Export a model to an .mlpackage,
Convert it to an .mtlpackage,
Use MTL4MachineLearningCommandEncoder alongside render passes,
Or embed small neural networks directly in shaders using Shader ML?
Having such a sample would greatly help developers like me adopt these powerful new capabilities correctly and efficiently.
Thank you very much for your time and support!
Best regards,
After updating to macOS 26.1, I've encountered an issue where Roblox freezes quite often, for 10 to 60 seconds at a time. This is really annoying, as I play the game a lot. My theory is that it's something like a driver issue with Metal. I have reinstalled macOS, reinstalled the game, and manually lowered the performance settings, but nothing works.
I'm wondering if you could help, when it will be fixed, and whether others are having the same issue.
Many thanks, William.
I have a Mac Studio 2023 M2 Max
Running Sonoma 14.6.1
Developing in Xcode 16.1
It seems that the NSScreen frame settings may be incorrect. The frame settings received from NSScreen.screens don't seem to match up with the Desktop arrangement settings in the Settings.
Apologies in advance for this long post!
for (i, screen) in NSScreen.screens.enumerated() {
    let name = screen.localizedName
    Globals.logger.debug("Globals initializeScreens - screen \(i + 1) '\(name, privacy: .public)'")
    Globals.logger.debug("Globals initializeScreens - '\(screen.debugDescription, privacy: .public)'")
}
This is what I receive in the log:
Globals initializeScreens - screen 1 'PHL 346E2C'
Globals initializeScreens - '<NSScreen: 0x600000ef4240;
name="PHL 346E2C";
backingScaleFactor=1.000000;
frame={{0, 0}, {3440, 1440}};
visibleFrame={{0, 0}, {3440, 1415}}>'
Globals initializeScreens - screen 2 'Blackmagic (1)'
Globals initializeScreens - '<NSScreen: 0x600000ef42a0;
name="Blackmagic (1)";
backingScaleFactor=1.000000;
frame={{-3840, 0}, {1920, 1080}};
visibleFrame={{-3840, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 3 'Blackmagic (4)'
Globals initializeScreens - '<NSScreen: 0x600000ef4360;
name="Blackmagic (4)";
backingScaleFactor=1.000000;
frame={{-1920, 0}, {1920, 1080}};
visibleFrame={{-1920, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 4 'Blackmagic (2)'
Globals initializeScreens - '<NSScreen: 0x600000ef43c0;
name="Blackmagic (2)";
backingScaleFactor=1.000000;
frame={{5360, 0}, {1920, 1080}};
visibleFrame={{5360, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 5 'Blackmagic (3)'
Globals initializeScreens - '<NSScreen: 0x600000ef4420;
name="Blackmagic (3)";
backingScaleFactor=1.000000;
frame={{3440, 0}, {1920, 1080}};
visibleFrame={{3440, 0}, {1920, 1055}}>'
It looks like the frame settings for Blackmagic (2) and Blackmagic (4) are switched.
The setup has five monitors. Four are using USB-C Digital AV Multiport Adapters. The output of these is streamed into a rack of A/V equipment using Blackmagic Design mini converters and monitors.
My Swift application allows users to open four movies, one for each of the AV Adapters. The movies can then be played back in sync for later processing by the A/V equipment.
Here are some screen captures that show my display settings.
Blackmagic (1) and Blackmagic (2) are to the left of the main screen.
Blackmagic (3) and Blackmagic(4) are to the right of the main screen.
The desktop is hard to see but is correct.
The wallpaper settings are all correct.
The wallpaper is correctly ordered when displayed on the monitors.
After opening the movies and using the NSScreen frame settings, the displays are incorrectly ordered. Test B and Test D are switched, which is what I would expect given the NSScreen frame values.
Any ideas? I've tried re-arranging the desktops, rebooting, etc., but no luck.
The code that changes the screen location is similar to this post on Stack Overflow:
public func setDisplay(screen: NSScreen) {
    Globals.logger.log("MovieWindowController - setDisplay = \(screen.localizedName, privacy: .public)")
    Globals.logger.debug("MovieWindowController - setDisplay - '\(screen.debugDescription, privacy: .public)'")
    let dx = CGFloat(Constants.midX)
    let dy = CGFloat(Constants.midY)
    var pos = NSPoint()
    pos.x = screen.visibleFrame.midX - dx
    pos.y = screen.visibleFrame.midY - dy
    Globals.logger.debug("MovieWindowController - setDisplay - x = '\(pos.x, privacy: .public)', y = '\(pos.y, privacy: .public)'")
    window?.setFrameOrigin(pos)
}
The log shows just what I would expect given the incorrect frame values:
MovieWindowController - setDisplay = Blackmagic (1)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018e8420; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-3840.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (2)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018a10e0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '5360.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (3)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018cc8a0; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '3440.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (4)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018c9ce0; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-1920.000000', y = '-12.500000'
Am I correct? I think this is driving me crazy!
Thanks in advance!
Edit: The mouse behavior is correct when moving across the displays!
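Edit 2: One thing I'm experimenting with as a workaround is matching screens by display ID rather than by position in NSScreen.screens; a sketch:

import AppKit

// Resolve each screen's CGDirectDisplayID from its deviceDescription, so
// windows can be matched to physical displays independent of the order in
// which NSScreen.screens reports them.
func displayID(for screen: NSScreen) -> CGDirectDisplayID? {
    screen.deviceDescription[NSDeviceDescriptionKey("NSScreenNumber")] as? CGDirectDisplayID
}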
I am an AR developer working on Apple Silicon Macs. Currently, Reality Composer Pro does not allow exporting .reality files, and Reality Composer (classic) is not available for Apple Silicon. This creates a gap in the workflow for ARKit/RealityKit developers who need interactive .reality files for use in Xcode projects.
Having the ability to export .reality files directly from Reality Composer Pro on Mac would greatly streamline development and enable a fully native workflow on modern Macs. Alternatively, bringing Reality Composer (classic) to Apple Silicon would also resolve this issue.
I have submitted this as a feature request via Feedback Assistant (FB17900386). I encourage others with similar needs to reply or submit feedback as well.
Thank you!
We have a released game on the App Store with Game Center challenges. The challenges were working well when testing the final version, before the Game Center challenges went live. Since release, the activity that should take the player to the in-game challenge results in the printout below, and our GKGameActivityListener is no longer informed of the activity in our code, as it was before the challenges went live.
“Invalid game activity definition.
Failed to kick activity notification to GameKit. Error: Error Domain=GKErrorDomain Code=17 "(null)"”
Also, Game Center reports that there are no challenges in GKChallengeDefinition.all, even though there are ongoing challenges.
The leaderboard results do get reported to Game Center, though, and show up as challenge entries in the Games app.
At the moment we are trying to figure out if this is a problem on our end or a Game Center server issue.
Hi All!
I'm being asked to migrate an app which utilizes iCloud KVS (Key-Value Storage). This ability is a new-ish feature, and the documentation about it is sparse [1]. Honestly, documentation about the new iCloud transfer functionality seems to be missing entirely. Same with Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail.
Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the entitlements file has the team ID in the key-value store identifier - is that fine?
<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>
Can someone please share their experience?
Thank you!
[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
Hi. The earliest version of macOS that Unity supports is 10.13. However, it seems that running a game that uses the Apple Unity Plugins on 10.13 causes DLL loading exceptions whenever you try to access part of the GameKit API. The errors look like this:
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
at (wrapper managed-to-native) Apple.GameKit.DefaultNSErrorHandler+Interop.DefaultNSErrorHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
at Apple.GameKit.DefaultNSErrorHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:35
(Filename: ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs Line: 35)
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
at (wrapper managed-to-native) Apple.GameKit.DefaultNSExceptionHandler+Interop.DefaultNSExceptionHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
at Apple.GameKit.DefaultNSExceptionHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:14
These errors do not appear on 10.15 or later, which is why I am assuming it's a problem with this particular version of macOS. I have not been able to test 10.14, so I'm not sure how it behaves there.
So, here is my question: what is the earliest version of macOS that the Apple Unity plugins support? It's not documented anywhere on the GitHub page.
// Love