I mean… I want to use defaults rather than launching apps via open with the saved environment variables.
This is pretty easy on iOS and other platforms, so what about on macOS?
Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.
Our app has an integrated 3D feature. To reduce the app's memory footprint in the background, we tear down the 3D content after the app enters the background. However, the teardown itself causes a problem: when it runs in the background, the app briefly triggers a background GPU rendering error with the following code:
IOGPUMetalError: Insufficient Permission (to submit GPU work from background) (00000006:kIOGPUCommandBufferCallbackErrorBackgroundExecutionNotPermitted)
Execution of the command buffer was aborted due to an error during execution. Insufficient Permission (to submit GPU work from background) (00000006:kIOGPUCommandBufferCallbackErrorBackgroundExecutionNotPermitted)
Will this warning cause the system to tighten the app's background permissions, for example by limiting or shortening the app's background lifetime? And will Background App Refresh stop working, causing Bluetooth iBeacon activation to fail?
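For what it's worth, one common pattern is to stop encoding and drain in-flight command buffers before tearing down, so that no GPU work is ever submitted from the background. A minimal sketch, assuming a UIKit app; the class and method names here are ours, not from the poster's project:

import Metal
import UIKit

final class Renderer {
    let device = MTLCreateSystemDefaultDevice()!
    lazy var queue = device.makeCommandQueue()!
    private let inFlight = DispatchGroup()   // tracks unfinished command buffers
    private var isBackgrounded = false
    private var observers: [NSObjectProtocol] = []

    init() {
        let nc = NotificationCenter.default
        observers.append(nc.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                                        object: nil, queue: .main, using: { [weak self] _ in
            guard let self else { return }
            self.isBackgrounded = true
            self.inFlight.wait()   // let in-flight GPU work finish first
            self.teardown3D()      // nothing can hit the GPU from the background now
        }))
        observers.append(nc.addObserver(forName: UIApplication.willEnterForegroundNotification,
                                        object: nil, queue: .main, using: { [weak self] _ in
            self?.isBackgrounded = false
        }))
    }

    func draw() {
        guard !isBackgrounded,   // never encode while backgrounded
              let commandBuffer = queue.makeCommandBuffer() else { return }
        inFlight.enter()
        let group = inFlight
        commandBuffer.addCompletedHandler { _ in group.leave() }
        // ... encode render passes here ...
        commandBuffer.commit()
    }

    func teardown3D() { /* release Metal resources here */ }
}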
Topic:
Graphics & Games
SubTopic:
Metal
I am rewriting an unfinished SceneKit project in RealityKit (non-AR). As far as I can see, RealityKit is missing basic fog functionality?
Fog was simple and easy to implement in SceneKit (fogStartDistance / fogEndDistance / fogDensityExponent / fogColor). Are there any plans to implement something like this in RealityKit?
Are there any simple workarounds?
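For reference, the SceneKit fog being compared against is just a few scene properties. A minimal sketch:

import SceneKit
import UIKit

// Linear fog ramping from 10 to 50 units, as referenced above.
let scene = SCNScene()
scene.fogStartDistance = 10
scene.fogEndDistance = 50
scene.fogDensityExponent = 1     // 1 = linear falloff
scene.fogColor = UIColor.gray    // typed as Any; accepts a UIColor/NSColor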
Topic:
Graphics & Games
SubTopic:
RealityKit
I have 2 planes with textures on them. I want these planes to intersect [ –|– ], and I want the blend mode to be additive. Currently I get z-fighting on the planes, and I can't see how to set blend modes.
I've done this before in Unity and Godot in a fairly straightforward manner.
How do I accomplish this with RealityKit, preferably using code only (my scene is quite dynamic)?
Do I need to do it with a shader manually? How can I stop the z fighting?
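Not an authoritative answer, but here is what appears to be expressible with RealityKit's public material API on iOS. True additive blending doesn't seem to be exposed there, so this sketch falls back to transparent blending; a CustomMaterial with a Metal surface shader is probably the route to a real additive blend. The texture name is a placeholder:

import RealityKit
import simd

// Sketch: two intersecting textured planes with transparent blending.
func makeIntersectingPlanes() throws -> Entity {
    let texture = try TextureResource.load(named: "glow")   // placeholder asset
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    material.blending = .transparent(opacity: .init(floatLiteral: 1.0))

    let root = Entity()
    for i in 0..<2 {
        let plane = ModelEntity(mesh: .generatePlane(width: 1, depth: 1),
                                materials: [material])
        if i == 1 {
            // Rotate the second plane 90° so the two form a [ –|– ] cross.
            plane.orientation = simd_quatf(angle: .pi / 2, axis: [1, 0, 0])
            // If coplanar regions still z-fight, nudge one plane slightly
            // along its normal, e.g. plane.position.y += 0.001
        }
        root.addChild(plane)
    }
    return root
}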
Hi team, I'm looking for the RealityKit debugger in Xcode 26 beta 3. I'm running a RealityKit app on my iPad running iPadOS 26 b3, but the debugger option is not there in Xcode.
Hi — we’re testing our app on iOS 26 and ran into strange behavior with GKLocalPlayer.local.authenticateHandler.
GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
// additional code
}
What happens:
When we assign authenticateHandler on iOS 26 and the user is not signed in to Game Center, the system shows a full-screen Game Center overlay asking the user to sign in.
If the user taps Cancel, nothing further happens — the closure is not invoked again, so we don’t receive an error or any callback. The app never learns whether the auth was cancelled or failed.
In previous iOS versions the closure was called (with viewController / error as appropriate) and the flow worked as expected.
What we tried:
Verified authenticateHandler is being set.
Checked GKLocalPlayer.local.isAuthenticated after the overlay dismisses — it’s unchanged.
Observed system logs: a com.apple.GameOverlayUI scene is created and later removed (so the auth overlay is shown by the system).
Confirmed the same code works on earlier iOS versions.
Question:
Has anyone seen authenticateHandler not being invoked on iOS 26 when the Game Center auth overlay is presented? Could this be a behavioral change in iOS 26 (overlay runs in a separate system process), or a bug? Any suggested workarounds to reliably detect that the user cancelled the sign-in (for example: listening for willResignActive / didBecomeActive, watching for a system overlay, or saving/presenting the viewController manually)?
Thanks in advance for any advice; we'd appreciate pointers or suggested diagnostics.
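Not verified on iOS 26, but a sketch of one of the workarounds floated above: mark sign-in as pending when the handler is installed, note when the app resigns active (the full-screen overlay does this), and re-check isAuthenticated once the app becomes active again. All names here are ours, and the cancel detection is a heuristic:

import GameKit
import UIKit

final class GameCenterAuth {
    private var signInPending = false
    private var overlayLikelyShown = false
    private var observers: [NSObjectProtocol] = []

    func start() {
        signInPending = true   // the system may present its own sign-in overlay
        GKLocalPlayer.local.authenticateHandler = { [weak self] viewController, error in
            guard let self else { return }
            if let vc = viewController {
                _ = vc   // present vc from your root view controller if desired
            } else if GKLocalPlayer.local.isAuthenticated {
                self.signInPending = false        // signed in
            } else if let error {
                self.signInPending = false        // a real failure we can react to
                print("Game Center auth failed: \(error)")
            }
        }

        let nc = NotificationCenter.default
        observers.append(nc.addObserver(forName: UIApplication.willResignActiveNotification,
                                        object: nil, queue: .main, using: { [weak self] _ in
            // The full-screen Game Center overlay deactivates the app.
            if self?.signInPending == true { self?.overlayLikelyShown = true }
        }))
        observers.append(nc.addObserver(forName: UIApplication.didBecomeActiveNotification,
                                        object: nil, queue: .main, using: { [weak self] _ in
            guard let self, self.signInPending, self.overlayLikelyShown else { return }
            // Overlay dismissed without the handler firing again: treat a
            // still-unauthenticated player as a cancelled sign-in.
            if !GKLocalPlayer.local.isAuthenticated {
                self.signInPending = false
                // handle "user cancelled" here
            }
        }))
    }
}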
I'd like to display the screen's refresh rate on the device on an iPhone (14 Pro Max). Does anyone know how to do this?
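One possible approach, sketched on the assumption that deriving the rate from CADisplayLink timestamps is acceptable; the onUpdate hook and any label are yours to wire up:

import UIKit

// Sketch: estimate the effective refresh rate from CADisplayLink timing.
// For the full 120 Hz on ProMotion phones, the Info.plist key
// CADisableMinimumFrameDurationOnPhone must be set to YES.
final class RefreshRateMonitor {
    private var link: CADisplayLink?
    var onUpdate: ((Double) -> Void)?   // e.g. update a UILabel here

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        link.preferredFrameRateRange = CAFrameRateRange(minimum: 1, maximum: 120, preferred: 120)
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    func stop() {
        link?.invalidate()
        link = nil
    }

    @objc private func tick(_ link: CADisplayLink) {
        let hz = 1.0 / (link.targetTimestamp - link.timestamp)
        onUpdate?(hz.rounded())
    }
}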
Topic:
Graphics & Games
SubTopic:
General
Hello,
We are experiencing an issue with apparent lag or latency when interacting with objects in Unity using the XR Interaction Toolkit. Even when interactables are configured with the Movement Type set to Instantaneous, objects exhibit a noticeable delay when following hand movement. This issue is particularly apparent when the hand moves at higher speeds.
We would like to know: is any additional configuration required, or is there something else we need to do to resolve this?
Thank you very much for your assistance.
I’m running into unexpected problems with the Unity GameKit plugin setup with the new Activities. I didn’t see anyone else mentioning these issues, so my guess is that it’s a problem with my setup.
The authentication (GKLocalPlayer.Authenticate()) works as expected, but any call to the new GameActivity functionality (e.g. GKGameActivityDefinition.LoadGameActivityDefinitions() or GKGameActivity.WantsToPlay) gives me an exception with an 'UnsupportedOperationForOSVersion' reason, despite running on an iOS 18.6.1 device.
I'm using a completely empty Unity 2022.3.62f1 project that only contains the official authentication example, followed by an event handler for activities from another example, plus the Unity Core and GameKit plugins.
The setup:
macOS 15.6
Xcode 26 beta 6 (also tried with 5)
Physical iPhone device running iOS 18.6.1
Unity 2022.3.62f1, which satisfies the requirements
Unity plugin, Xcode setup, and build steps:
Followed the official beta-branch build steps for the Unity plugins with python3 build.py -m iOS iPhoneSimulator macOS -p Core GameKit. This ran through after a slight modification for the macOS target, which somehow contained an unknown team reference in the GameKitWrapper project; I changed it to not reference a team and to use 'Sign to Run Locally', as was the case for the other packages. As far as I understand, the macOS version is not strictly necessary anyway just for running on a local iOS device(?)
Imported these as tarball packages into the empty Unity 2022.3.62f1 project per the official instructions, which seems to work as expected
Added a single script with the mentioned example code in a MonoBehaviour.Start
Building in Unity works as expected as well, creating the Xcode project
The Unity-iPhone target has the GameKit framework linked (’do not embed’) and the GameCenter capability was added automatically as expected
The GameKit framework doesn't seem to be added to the UnityFramework target, but I don't think that's necessary? Quickly testing with the GameKit framework added there as well didn't make a difference
The linked GameKit framework is indeed the expected Xcode 26 beta version
I can then build and run this on the physical iPhone iOS 18.6.1 device, where I get an ‘UnsupportedOperationForOSVersion’ as soon as I try to subscribe to deeplinking events (GKGameActivity.WantsToPlay) or use other GameKit Activity functionality from the official examples:
// log showing that it's actually running on iOS 18.6:
[Apple.Core Plug-In Runtime] Availability Runtime Environment: iOS 18.6
Apple.Core.Availability:OnApplicationStart()
// and the exception I get:
GameKitException: Code=-7 Domain=GKErrorDomain Description=The operation couldn’t be completed. (GKErrorDomain error -7.) (UnsupportedOperationForOSVersion)
at Apple.GameKit.DefaultNSErrorHandler.ThrowNSError (System.IntPtr nsErrorPtr) [0x00000] in <00000000000000000000000000000000>:0
Rethrow as TypeInitializationException: The type initializer for 'Apple.GameKit.GKGameActivity' threw an exception.
I unfortunately didn’t find any clues as to why this happens and how to resolve it on this forum or otherwise.
Changing the minimum iOS version (up to 18.6 from the previously used Unity export default of 12.0) for any and all targets did not yield a different result
I'd rather not update the phone to the iOS 26 beta, though as far as I understand this is not necessary
Any pointers to what I might be missing or doing wrong are greatly appreciated!
Thank you very much in advance!
On an iPad running iPadOS 26 beta 4, when tapping the Game Center Access Point, the overlay doesn’t show the configured achievements, leaderboards or challenges.
I should specify this is an in-development app and the achievements and leaderboards are in the “Not Live” state, however they show on other devices running iOS 18 in the Access Point UI.
Anyone else having this issue? If so, how should I test achievements and leaderboards while iOS 26 beta is out?
The UI looks like this on iPadOS 26: [screenshot attached]
In swipe-driven games, a first downward swipe starting near the home indicator can trigger Reachability, even when using preferredScreenEdgesDeferringSystemGestures = .bottom and prefersHomeIndicatorAutoHidden = true. This causes the app to slide down to half screen and breaks gameplay. Please consider an API to temporarily defer Reachability while a custom gesture is active (similar to existing system gesture deferral), without disabling accessibility globally.
Environment:
Devices: iPhones with Home Indicator (Face ID)
Why this matters:
Bottom-origin swipes are core in many games (flick shots, slingshot, physics toss, bottom sheets). Current workarounds degrade UX and discoverability, and players still accidentally trigger Reachability.
Feedback Assistant Post
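For reference, the existing deferral hooks that this request builds on look like this in a UIViewController; Reachability itself is not covered by them today, which is the point of the request:

import UIKit

// The existing system-gesture deferral API: it covers edge swipes and the
// home-indicator gesture, but not Reachability.
final class GameViewController: UIViewController {
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { .bottom }
    override var prefersHomeIndicatorAutoHidden: Bool { true }

    // Call these after anything changes the values the overrides return:
    func refreshSystemGestureState() {
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        setNeedsUpdateOfHomeIndicatorAutoHidden()
    }
}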
Topic:
Graphics & Games
SubTopic:
General
Hello, when I customize the icons on my phone, the applications that sit in the genre groupings are replaced with all-black images. Changing the color of the applications in a group has no effect; they just stay black.
Topic:
Graphics & Games
SubTopic:
General
Hello,
I am experiencing an issue with programmatically capturing a GPU trace using MTLCaptureManager. The .gputrace file that is generated appears to be corrupted, and I'm looking for guidance or a solution.
Description of the Problem:
I am using MTLCaptureManager.sharedCaptureManager to capture a Metal frame and save it to disk.
The generated .gputrace file is consistently reported as 0 bytes in size by the file system.
Crucially, when I compress this 0-byte .gputrace file into a .zip archive, the resulting archive contains the full, expected data. After unzipping, the file can be opened and viewed correctly in Xcode.
However, when inspecting the file's contents using NSFileManager in Objective-C (treating it as a directory), the internal structure is different from a .gputrace file captured directly from Xcode's Metal Debugger.
[Screenshots: trace captured in Xcode vs. trace captured to a file]
Finally, when capturing multiple frames programmatically, the first captured frame contains valid buffer data. However, for subsequent frames (starting from the second frame), the corresponding buffer contents are all zero-filled.
Frame 1: All MTLBuffer data is correctly captured and populated.
Frame 2 and onward: The same MTLBuffer objects are present in the trace, but their contents are entirely 0 (i.e., the data is not captured or is corrupted).
In this case, the on-screen display is normal, but the captured frame is incorrect. The frame captured directly in Xcode is also correct. Only the frame captured to a file is abnormal.
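For comparison, a minimal programmatic capture sketch; the output path and frame boundaries are assumptions, not the poster's exact code. Worth noting: a .gputrace on disk is a package (a directory), which may be why a plain file-size query reports 0 bytes even when the zipped contents are intact:

import Metal

// Sketch: capture one frame's GPU work to a .gputrace document.
func captureFrame(device: MTLDevice, to url: URL, work: () -> Void) throws {
    let manager = MTLCaptureManager.shared()
    let descriptor = MTLCaptureDescriptor()
    descriptor.captureObject = device
    descriptor.destination = .gpuTraceDocument
    descriptor.outputURL = url        // must end in .gputrace
    try manager.startCapture(with: descriptor)
    work()                            // encode and commit the frame's command buffers
    manager.stopCapture()             // stop only after all work is committed
}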
Topic:
Graphics & Games
SubTopic:
Metal
In the CanyonCrosser example project, some RealityKit systems are implemented as classes while others are structs. What’s the reason for using different types?
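Not from the sample's authors, but one plausible reason, illustrated below: RealityKit's System protocol doesn't require a class, and update(context:) is non-mutating, so a system that keeps mutable state across frames has to be a class (or stash its state in components). A hedged sketch:

import RealityKit

// A stateless system works fine as a struct; update(context:) is
// non-mutating, so a struct couldn't change its own stored properties anyway.
struct SpinSystem: System {
    init(scene: Scene) {}
    func update(context: SceneUpdateContext) {
        // query entities and spin them...
    }
}

// A system that accumulates state across frames needs reference semantics.
final class FrameCountingSystem: System {
    private var frameCount = 0
    required init(scene: Scene) {}
    func update(context: SceneUpdateContext) {
        frameCount += 1
    }
}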
I haven't been looking at screen savers for a long time because of Apple's lack of will (or resources?) to provide a public version of the modern private SDK that Apple itself has been using for a very long time now.
I'm now looking at the Screen Saver pane in System Settings (the What-If version of System Preferences in an alternate universe where all screens are in portrait mode).
In macOS Sequoia, it seems 3rd party screen savers are not welcome, considering they're relegated to the "Other" section at the bottom of the list and you have to click Show All to even start seeing them.
I also had a quick look at macOS Tahoe Beta 3, and it looks like all the real screen savers are gone (3rd party and Apple's own: Hello, Message, Flurry, etc.), or at least it takes a Nobel Prize to find them (and the Search field is not useful).
I tried to install a 3rd party screen saver on macOS Tahoe Beta 3; it doesn't show up in the list.
To summarize:
No public access to modern APIs AFAIK.
UI that is hostile to 3rd party screen savers on macOS Sequoia.
Apparently the only screen savers left in macOS Tahoe b3 are slideshows or movies curated by Apple.
Hence the question:
Is there any future for screen savers on macOS?
Because if there's none, I won't waste my time trying to update some old screen savers.
Hello — I shipped an App Store build that signs in to Game Center using the Apple Unity Plugins (GameKit). The login banner appears, but my app still doesn’t show up in Game Center’s “All activity” (You started playing XXX 2d ago)
What I’ve done
Call await GKLocalPlayer.Authenticate();
“Game Center” is enabled for the current version in App Store Connect
Confirmed: other App Store games do appear under “All Activity” on the same device/account
Timeline: This is the first version that enables Game Center (not the app’s first release), and it has been about 2 hours since this build went live.
Questions
Is authentication alone sufficient for “Recently Played,” or is at least one Game Center component (leaderboards, achievements, activities, multiplayer) required?
Is there a typical propagation delay before “Recently Played” starts showing a newly enabled app/version?
Is there anything else I should configure in App Store Connect or entitlements to make “Recently Played” visible?
Thanks for any help.
So I get JPEG data in my app. Previously I was using the higher level NSBitmapImageRep API and just feeding the JPEG data to it.
But now I've noticed that on Sonoma, if I get a JPEG in the CMYK color space, the NSBitmapImageRep renders mostly black and is corrupted. So I'm trying to drop down to the lower-level APIs. Specifically, I grab a CGImageRef and try to use the Accelerate API to convert it to another format (to hopefully work around the issue)...
CGImageRef sourceCGImage = CGImageCreateWithJPEGDataProvider(jpegDataProvider,
                                                             NULL,
                                                             shouldInterpolate,
                                                             kCGRenderingIntentDefault);
Now I call vImageConverter_CreateWithCGImageFormat with the following values for the source and destination formats:
Source format: (derived from sourceCGImage)
bitsPerComponent = 8
bitsPerPixel = 32
colorSpace = (kCGColorSpaceICCBased; kCGColorSpaceModelCMYK; Generic CMYK Profile)
bitmapInfo = kCGBitmapByteOrderDefault
version = 0
decode = 0x000060000147f780
renderingIntent = kCGRenderingIntentDefault
Destination format:
bitsPerComponent = 8
bitsPerPixel = 24
colorSpace = (DeviceRGB)
bitmapInfo = 8197
version = 0
decode = 0x0000000000000000
renderingIntent = kCGRenderingIntentDefault
But vImageConverter_CreateWithCGImageFormat fails with kvImageInvalidImageFormat. If I change the destination format to 32 bitsPerPixel and include alpha in the bitmapInfo, vImageConverter_CreateWithCGImageFormat no longer returns an error, but I get a black image, just like with NSBitmapImageRep.
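In case it's useful, a sketch of the same conversion using the Swift vImage wrappers, with a 32-bpp alpha-skipped RGB destination (mirroring the variant above that did not error); whether it avoids the black output is another question:

import Accelerate

// Sketch: CMYK CGImage -> 8-bit RGBX via the Swift vImage wrappers.
func convertToRGB(_ source: CGImage) throws -> CGImage {
    guard let srcFormat = vImage_CGImageFormat(cgImage: source),
          let dstFormat = vImage_CGImageFormat(
              bitsPerComponent: 8,
              bitsPerPixel: 32,
              colorSpace: CGColorSpaceCreateDeviceRGB(),
              bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipLast.rawValue))
    else { throw vImage.Error.invalidImageFormat }

    let converter = try vImageConverter.make(sourceFormat: srcFormat,
                                             destinationFormat: dstFormat)
    var srcBuffer = try vImage_Buffer(cgImage: source, format: srcFormat)
    defer { srcBuffer.free() }
    var dstBuffer = try vImage_Buffer(width: Int(srcBuffer.width),
                                      height: Int(srcBuffer.height),
                                      bitsPerPixel: 32)
    defer { dstBuffer.free() }
    try converter.convert(source: srcBuffer, destination: &dstBuffer)
    return try dstBuffer.createCGImage(format: dstFormat)
}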
Apple, please bring back SceneKit.
Hi,
I'm rewriting my game from SceneKit to RealityKit, and I'm having trouble implementing the following scenario:
I tap on the iPhone screen to select an Entity that I want to drag.
If an Entity was tapped, it should then be possible to drag it left, right, etc.
SceneKit solution:
func CGPointToSCNVector3(_ view: SCNView, depth: Float, point: CGPoint) -> SCNVector3 {
    let projectedOrigin = view.projectPoint(SCNVector3Make(0, 0, Float(depth)))
    let locationWithz = SCNVector3Make(Float(point.x), Float(point.y), Float(projectedOrigin.z))
    return view.unprojectPoint(locationWithz)
}
and then I was calling:
SCNView().hitTest(location, options: [SCNHitTestOption.firstFoundOnly:true])
the code was called inside of the UIPanGestureRecognizer in my UIViewController.
Could I reuse that code, or should I go with the SwiftUI approach, something like this:
var body: some View {
    RealityView {
        ....
    }
    .gesture(TapGesture().onEnded {
    })
}
?
I already have this code:
@State private var location: CGPoint?
.onTapGesture { location in
self.location = location
}
I'm trying to identify the entity that was tapped within the RealityView like that:
RealityView { content in
let box: ModelEntity = createBox() // for now there is only one box, however there will be many boxes
content.add(box)
let anchor = AnchorEntity(world: [0, 0, 0])
content.add(anchor)
_ = content.subscribe(to: SceneEvents.Update.self) { event in
//TODO: find tapped entity, so that it could be dragged inside of the DragGesture()
}
Any help would be appreciated.
I also noticed that if I create a TapGesture like this:
TapGesture(count: 1)
.targetedToAnyEntity()
and add it to my view using .gesture(), it is not triggered.
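A hedged guess at that last point, in case it helps: for gestures built with .targetedToAnyEntity() to fire, the entity generally needs collision shapes and an InputTargetComponent. A minimal sketch, assuming iOS 18+ RealityView (createBox() is the helper from the post above):

import RealityKit
import SwiftUI

// Sketch: make an entity hittable by targeted SwiftUI gestures.
// Without CollisionComponent + InputTargetComponent the gesture never fires.
struct ContentView: View {
    var body: some View {
        RealityView { content in
            let box = createBox()                        // poster's helper
            box.generateCollisionShapes(recursive: true) // adds a CollisionComponent
            box.components.set(InputTargetComponent())   // opt into input targeting
            content.add(box)
        }
        .gesture(
            TapGesture(count: 1)
                .targetedToAnyEntity()
                .onEnded { value in
                    // value.entity is the tapped entity; the same targeting
                    // works with DragGesture for the drag-to-move case.
                    print("tapped:", value.entity.name)
                }
        )
    }
}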