Hello,
I've been trying to leverage instanced rendering in RealityKit on visionOS but have not had success.
Apple's documentation and WWDC sessions state this is supported:
https://developer.apple.com/documentation/realitykit/validating-usd-files
https://developer.apple.com/videos/play/wwdc2021/10075/?time=1373
https://developer.apple.com/videos/play/wwdc2023/10099/?time=772
RealityKit Trace metrics
Validating instancing is working:
To test, I made a basic visionOS app with an immersive space and replaced the default entity with my test .usdz file. I've been profiling with the RealityKit Trace template in Xcode Instruments while in the immersive space with the volume closed, which gives consistent draw-call results.
If I have a single sphere mesh with one material, I get one draw call, but the number of draw calls grows linearly with the mesh count no matter how my entity is configured.
What I've tried
Create a test scene in Blender and export with instancing enabled
Create a test scene in Reality Composer Pro using references
Author usda files by hand based on the OpenUSD spec
Programmatically create a MeshResource with Contents at runtime (see the sketch below)
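For reference, this is roughly what the programmatic attempt looks like. It is only a minimal sketch (the sphere radius, instance IDs, count, and spacing are placeholders), assuming that MeshResource.Contents with one shared model and multiple MeshResource.Instance entries is the intended instancing path; whether it actually collapses to one draw call is what I am trying to verify with RealityKit Trace.

import RealityKit

// Sketch: one model, many MeshResource.Instance entries referencing it.
func makeInstancedSpheres(count: Int, spacing: Float = 0.25) throws -> MeshResource {
    let sphere = MeshResource.generateSphere(radius: 0.05)
    var contents = sphere.contents                          // keeps the single sphere model
    guard let modelID = contents.models.map({ $0.id }).first else {
        fatalError("generated sphere has no model")
    }

    // One instance per copy, all pointing at the same model id.
    var instances: [MeshResource.Instance] = []
    for i in 0..<count {
        var instance = MeshResource.Instance(id: "sphere-\(i)", model: modelID)
        instance.transform = Transform(translation: [Float(i) * spacing, 0, 0]).matrix
        instances.append(instance)
    }
    contents.instances = MeshInstanceCollection(instances)
    return try MeshResource.generate(from: contents)
}

// Usage: ModelEntity(mesh: try makeInstancedSpheres(count: 100), materials: [SimpleMaterial()])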
References
https://openusd.org/release/api/_usd__page__scenegraph_instancing.html
https://developer.apple.com/documentation/realitykit/meshresource
https://developer.apple.com/documentation/realitykit/meshresource/instance
Thank you
Hi everyone
I'm experiencing an issue with iPadOS 26 regarding multi-touch gesture detection. When performing a quick four-finger gesture (tap and swipe), the system often fails to recognize the input. This especially affects apps that rely on fast multi-touch input, such as rhythm games with difficult charts.
Steps to Reproduce:
Place four fingers on the screen.
Perform a quick tap or a quick horizontal swipe (like the one used to switch apps).
Observe whether the gesture is ignored or detected inconsistently.
Expected Behavior:
4-finger multi-touch gestures should be recognized regardless of gesture speed, just as in previous iPadOS versions.
Actual Behavior:
Gestures fail to be detected when executed quickly; the same gestures performed more slowly still work. In rhythm games this results in missed notes.
You can check out my posts on Twitter/X: https://x.com/kokona_fwa/status/1978131164104728949?s=61
Facebook: https://m.facebook.com/groups/idipad/permalink/24438964899058806/?
This is a Broadcast Extension issue: on the iOS 26.1 beta, the extension never launches. After you tap "Start Broadcast" in the system picker, the countdown disappears after 3 s and no broadcast starts, so every live-streaming app (and every other non-system app that uses a Broadcast Upload Extension) fails to go live; only the built-in screen recording to Photos still works. Is this a known regression, or is a new entitlement required?
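For reference, a minimal sketch of how the broadcast is started from the app side; the extension bundle identifier here is hypothetical, and the failure described above happens after this point, once "Start Broadcast" is tapped in the picker:

import ReplayKit
import UIKit

// Sketch: add the system broadcast picker for a Broadcast Upload Extension.
// "com.example.MyApp.BroadcastExtension" is a placeholder bundle identifier.
func addBroadcastPicker(to view: UIView) {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.MyApp.BroadcastExtension"
    picker.showsMicrophoneButton = true
    view.addSubview(picker)
}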
I have a question I guess more for the Apple team.
But why are there no totally 3D experiences for the Vision Pro lineup?
I know Apple has given us tools to bring Unity 3D games to iPhone, and I guess you can also build them in RealityKit. But why, at this moment, are fully 3D games limited to iPad and iPhone, and why can't you bring them to Vision Pro?
Just to explain: when I say a totally 3D game, I mean games like Gorn. The Vision Pro is definitely powerful enough, but it feels limited to tabletop games and AR games.
Is this something Apple is thinking about implementing?
Topic: Graphics & Games
SubTopic: RealityKit
Tags: ARKit, Reality Composer, RealityKit, Reality Composer Pro
What is the current (most recent) best practice for instancing meshes in RealityKit?
I see both MeshInstanceComponent and MeshInstanceCollection.
My intent is to bind a transform to a Circle Agent (a GameplayKit agent) and feed that result into instancing.
Hello!
Bear with me here, as there is a lot to explain!
I am working on implementing a Game Center high score leaderboard in my game. I have looked around for examples of how to properly implement this code, but have come up short on finding much material. Therefore, I have tried implementing it myself based on information I found in Apple's documentation.
Long story short, I am getting success printed when I update my score, but no scores are actually being posted (or at least no scores show up on the Game Center leaderboard when it is opened).
Before I show the code, one thing I have questioned is the fact that this game is still in development. In AppStoreConnect, the status of the leaderboard is "Not Live". Does this affect scores being posted?
Onto the code. I have created a GameCenter class which handles getting the leaderboards and posting scores to a specific leaderboard. I will post the code in whole, and will discuss below what is happening.
PLEASE VIEW ATTACHED TEXT TO SEE THE GAMECENTER CLASS!
GameCenter class - https://developer.apple.com/forums/content/attachment/0dd6dca8-8131-44c8-b928-77b3578bd970
In a different GameScene, once the game is over, I request to post a new high score to Game Center with this line of code:
GameCenter.shared.submitScore(id: GameCenterLeaderboards.HighScore.rawValue)
Now onto the logic of my code. For the longest time I struggled to figure out how to submit a score. I found that Xcode 12 deprecated a lot of functions that previously worked for me. Now it seems we have to load all leaderboards (or just the ones we want), which is the purpose of the private leaderboards variable in the GameCenter class.
On app startup, I authenticate the player. Once that callback is reached, I call loadLeaderboards, which loads the leaderboards for each string ID in an enum I have elsewhere. Each of these leaderboards is created as a Leaderboard object and saved in the private leaderboards array, so that I have access to them later when I want to submit a score.
Once the game is over, I am calling submitScore with the leaderboard id I want to post to. Right now, I only have a high score, but in the future I may add a parameter to this with the value so it works for other leaderboards as well. Therefore, no value is passed in since I am pulling from local storage which holds the high score.
submitScore will get the leaderboard from the private leaderboard array that has the same id as the one passed in. Once I get the correct leaderboard, I submit a score to that leaderboard. Once the callback is hit, I receive the output "Successfully submitted score to leaderboard". This looks promising, except for the fact that no score is actually posted.
At startup, I am calling updatePlayerHighScore, which is not complete - but for the purpose of my point, retrieves the high score of the player from the leaderboard and is printing it out to the console. It is printing out (0), meaning that no score was posted.
The last thing I have questions about is the context when submitting a score. According to the documentation, this is just metadata that Game Center does not care about but that the developer can use. Therefore, I think I can rule this out as the cause of the problem.
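For comparison, here is a minimal sketch of the submission path I am describing, using the non-deprecated class method; the leaderboard ID is a placeholder and the score comes from wherever local storage holds it:

import GameKit

// Sketch: post a score with the iOS 14+ API. "com.example.highscore" is a placeholder ID.
func submitHighScore(_ score: Int) {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    GKLeaderboard.submitScore(
        score,
        context: 0,                                   // metadata; Game Center does not interpret it
        player: GKLocalPlayer.local,
        leaderboardIDs: ["com.example.highscore"]
    ) { error in
        if let error = error {
            print("Score submission failed: \(error)")
        } else {
            print("Successfully submitted score to leaderboard")
        }
    }
}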
I believe I implemented this correctly, but for some reason, nothing is posting to the leaderboard. This was a lot, but I wanted to make sure I got all my thoughts down.
Any help on why this is NOT posting would be awesome! Thanks so much!
Mark
Hi!
Using ARView in UIKit or through a UIViewRepresentable in SwiftUI, we can do:
arView.debugOptions = [.showPhysics, .showStatistics]
What is the equivalent in RealityView?
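For context, a minimal sketch of the UIViewRepresentable setup referred to above (the wrapper name is illustrative):

import SwiftUI
import RealityKit

// Sketch: wrapping ARView in SwiftUI and enabling the debug options mentioned above.
struct DebugARView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.debugOptions = [.showPhysics, .showStatistics]
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}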
Topic: Graphics & Games
SubTopic: RealityKit
(See the subject line.)
And how, in that case, are the beautiful system dials made, with their smoke effects and other particles?
Topic: Graphics & Games
SubTopic: Metal
Hey there, I'm currently planning to use RealityKit in a new multiplatform app I'm building. Unfortunately, I noticed that RealityKit does not support watchOS, while SceneKit is being deprecated. However, I'd like to maintain the same codebase across platforms. What are my options?
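For illustration, one common approach is to keep the shared game/model code renderer-agnostic and gate only the rendering layer behind conditional compilation. This is a sketch under that assumption, with placeholder types, not a recommendation from Apple:

#if os(watchOS)
import SceneKit
#else
import RealityKit
#endif

// Shared, renderer-independent state used by every platform.
struct GameState {
    var playerPosition = SIMD3<Float>(0, 0, 0)
}

#if os(watchOS)
// watchOS: render the shared state with SceneKit (deprecated, but currently the only option).
func makeNode(for state: GameState) -> SCNNode {
    let node = SCNNode(geometry: SCNSphere(radius: 0.05))
    node.simdPosition = state.playerPosition
    return node
}
#else
// iOS / macOS / visionOS: render the same state with RealityKit.
func makeEntity(for state: GameState) -> ModelEntity {
    let entity = ModelEntity(mesh: .generateSphere(radius: 0.05))
    entity.position = state.playerPosition
    return entity
}
#endif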
We want to integrate Game Center into a game app. Users can create multiple characters in the game. Suppose a user has created two characters, Character 1 and Character 2:
Once the user binds Character 1 to Game Center, its data is reported to Game Center. If the player then unbinds Character 1 from Game Center and binds Character 2 instead, will Character 1's data remain in Game Center, or will it be removed?
Hi team, I'm looking for the RealityKit debugger in Xcode 26 beta 3. I'm running a RealityKit app on my iPad running iPadOS 26 b3, but the debugger option is not there in Xcode.
Hello, I am new to the Apple ecosystem and to Apple development, though I have some coding experience with C#. I'd like to develop my game for iPhone 16 and up (because of the ability to run AI models), but I'm having a hard time figuring out what to use. There are a lot of resources for SceneKit, but its documentation page says it's deprecated, so I looked at the RealityKit docs and tutorials, and they focus strictly on visionOS. I'm really confused, because I can't find tutorials that show how to develop an iOS game with RealityKit without focusing on visionOS. I just want to develop for iPhone 16 and up, but I can't find resources aimed at that.
Topic: Graphics & Games
SubTopic: RealityKit
GKAccessPoint triggerAccessPointWithState handler not invoked on iOS 26.0 and iOS 15.8.4
Incorrect/Unexpected Behaviour:
When calling [GKAccessPoint.shared triggerAccessPointWithState:GKGameCenterViewControllerStateAchievements handler:^{}] on a real device running the iOS 26 beta, the overlay appears as expected, but the handler block is never called. The behavior is also incorrect on earlier iOS versions (tested on iOS 15.8.4).
Steps to Reproduce:
Authenticate GKLocalPlayer
Call triggerAccessPointWithState:handler: with a block that logs or performs logic
Observe that the overlay appears, but the block is not executed
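For reference, a minimal Swift equivalent of the call in step 2, assuming the local player is already authenticated:

import GameKit

// Sketch: show the Game Center achievements overlay via the access point.
// The handler below is the block that is never invoked.
func showAchievementsOverlay() {
    guard GKLocalPlayer.local.isAuthenticated else { return }
    GKAccessPoint.shared.trigger(state: .achievements) {
        print("Access point handler invoked")   // never printed
    }
}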
Behavior:
UI appears correctly
Handler is not invoked at all
Expected Result:
The handler should fire immediately after the dashboard is shown.
Actual Result:
The handler is never called.
Use case:
Since GKGameCenterViewController is deprecated, we are moving to GKAccessPoint, but due to the above issue we are unable to.
Environment:
Device: iPhone 16, iPhone 7
iOS: 26.0 and iOS 15.8.4
Xcode: 26.0 beta and Xcode 16.4
Hi all,
I've developed some code that enables an arcball camera interaction with my scene. I've done this using components and systems. The implementation feels a bit messy: I've got gesture code on my RealityView, and then a bunch of other code that uses those gesture inputs in my component and system.
Is there a demo app, or some example code, that shows a nice way to encapsulate these things into one item for custom cameras, something like Apple's .realityViewCameraControls(.orbit)?
If not, can anyone recommend an approach to take?
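For what it's worth, here is a rough sketch of the kind of encapsulation I mean: a hypothetical OrbitCameraComponent that the RealityView gestures write into, and a System that applies it each frame. The names and math are illustrative, not Apple API:

import Foundation
import RealityKit

// Hypothetical component: gesture handlers on the RealityView only write these values.
struct OrbitCameraComponent: Component {
    var target: SIMD3<Float> = .zero
    var radius: Float = 2
    var azimuth: Float = 0       // radians, driven by horizontal drag
    var elevation: Float = 0.3   // radians, driven by vertical drag
}

// Hypothetical system: positions any entity carrying the component on the orbit sphere.
struct OrbitCameraSystem: System {
    static let query = EntityQuery(where: .has(OrbitCameraComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let orbit = entity.components[OrbitCameraComponent.self] else { continue }
            let x = orbit.radius * cos(orbit.elevation) * sin(orbit.azimuth)
            let y = orbit.radius * sin(orbit.elevation)
            let z = orbit.radius * cos(orbit.elevation) * cos(orbit.azimuth)
            entity.position = orbit.target + SIMD3<Float>(x, y, z)
            entity.look(at: orbit.target, from: entity.position, relativeTo: nil)
        }
    }
}

// Register once, e.g. at app launch:
// OrbitCameraComponent.registerComponent()
// OrbitCameraSystem.registerSystem()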
My iOS app generates PDF files.
Every time my users open the generated PDF files, the AutoFill popup appears, even though my PDFs are not meant to be interactive.
Is there a way to mark my PDF files as "not a form", for example in the metadata or somewhere else?
I am trying to learn the new Metal Performance Primitives APIs. I have added the MetalPerformancePrimitives framework and included the header in my shader code as per the documentation:
#include <MetalPerformancePrimitives/MetalPerformancePrimitives.h>
Unfortunately, Xcode complains that the header cannot be found. How do I include it properly?
I am using Xcode 26 on Tahoe. The MetalPerformancePrimitives framework is present on my machine and I can inspect the headers in the filesystem.
Topic: Graphics & Games
SubTopic: Metal
Hello everyone, and thank you for your time!
I have an issue with uploading several icons to TestFlight. I'm using GameMaker Studio as the engine for my game.
This is the error I'm getting, even when I try to use the old icons for the game that worked in the past. I tried converting the icons using this site, https://makeappicon.com/, but I still get the same validation error. Can you help me fix the issue? Thank you so much!
Hello Apple team,
I'm working on an iOS AR app using SwiftUI and RealityKit, and I was wondering if the Cinematic API can be used with a RealityKit scene. I'd like to achieve a shallow depth of field while keeping the 3D asset in focus, and vice versa.
Thanks!
Hello, I am quite new to using the Metal API and was wondering: if you know that, once a pipeline has been created, you will never need to create another one with the same shaders, is it safe to release the library that was used to reference those shaders? I am asking because this is possible in other APIs, but Apple never mentions (as far as I have found) whether it is safe to do here.
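For illustration, this is the pattern in question, sketched in Swift with placeholder shader function names; dropping the library reference after the pipeline state is built is exactly the step I am unsure about:

import Metal

// Sketch: build a render pipeline, then drop the MTLLibrary reference.
// "vertexMain" / "fragmentMain" are placeholder function names.
func makePipeline(device: MTLDevice) throws -> MTLRenderPipelineState {
    var library: MTLLibrary? = device.makeDefaultLibrary()

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library?.makeFunction(name: "vertexMain")
    descriptor.fragmentFunction = library?.makeFunction(name: "fragmentMain")
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm

    let pipeline = try device.makeRenderPipelineState(descriptor: descriptor)

    // Drop our reference to the library; whether anything still depends on it
    // after pipeline creation is the open question above.
    library = nil
    return pipeline
}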
Topic: Graphics & Games
SubTopic: Metal
After updating to macOS 26.1, I've encountered an issue where Roblox tends to freeze quite often, for 10 to 60 seconds at a time. This is really annoying, as I play the game a lot. My theory is that it's a driver issue with Metal or something similar. I have reinstalled macOS, reinstalled the game, and manually lowered the performance settings, but nothing has worked.
I'm wondering if you could help, when this will be fixed, and whether others are having the same issue.
Many thanks, William.