Hello RealityKit developers,
I'm currently working on physics simulations in my visionOS app and am trying to adapt the concepts from the official sample Simulating physics joints in your RealityKit app.
In the sample, a sphere is connected to the ceiling using a PhysicsRevoluteJoint to create a hinge-like simulation. I've successfully modified this setup to use a PhysicsSphericalJoint instead.
The basic replacement works as expected: pin1 (attached to the sphere) rotates freely around pin0 (attached to the ceiling), much like a ball-and-socket joint should, removing all translational degrees of freedom.
My challenge lies with the PhysicsSphericalJoint's angularLimitInYZ property. The documentation mentions that this property allows limiting the rotation around the Y and Z axes, defining an "elliptical cone shape around the x-axis of pin0." However, I'm struggling to understand how to specify these values to achieve a desired rotational limit.
If I have a sphere that is currently capable of rotating 360 degrees around pin0 (like a free-spinning ball on a string), how would I use angularLimitInYZ to restrict its rotation to a certain height or angular range, preventing it from completing a full circle?
Specifically, I'm trying to achieve a "swing" like behavior where the sphere oscillates back and forth but cannot rotate completely overhead or underfoot. What values or approach should I use for the angularLimitInYZ tuple to define such a restricted pendulum-like motion?
Any insights, code examples, or explanations on how to properly configure angularLimitInYZ for this kind of behavior would be incredibly helpful!
The following code is modified from the sample.
extension MainView {
    func addPinsTo(ballEntity: Entity, attachmentEntity: Entity) throws {
        // Orient each pin's x-axis along the world z-axis, as in the sample.
        let hingeOrientation = simd_quatf(from: [1, 0, 0], to: [0, 0, 1])
        let attachmentPin = attachmentEntity.pins.set(
            named: "attachment_hinge",
            position: .zero,
            orientation: hingeOrientation
        )
        let relativeJointLocation = attachmentEntity.position(
            relativeTo: ballEntity
        )
        let ballPin = ballEntity.pins.set(
            named: "ball_hinge",
            position: relativeJointLocation,
            orientation: hingeOrientation
        )
        // Create a PhysicsSphericalJoint between the two pins.
        let sphericalJoint = PhysicsSphericalJoint(pin0: attachmentPin, pin1: ballPin)
        try sphericalJoint.addToSimulation()
    }
}
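For the limits themselves, here is as far as I've gotten. This is only a guess: I'm assuming the tuple holds half-angles in radians, measured from pin0's x-axis, that define the cone's extent along the y and z directions.

// My attempt, replacing the joint creation above. Assumption: (.pi / 4, .pi / 4)
// would allow the ball to swing up to 45 degrees away from pin0's x-axis in any
// direction, which should prevent it from rotating fully overhead or underfoot.
let limitedJoint = PhysicsSphericalJoint(
    pin0: attachmentPin,
    pin1: ballPin,
    angularLimitInYZ: (.pi / 4, .pi / 4)
)
try limitedJoint.addToSimulation()

Is that roughly the right mental model, or do the two values mean something else entirely?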
The following image is a screenshot of the resulting behavior after changing to PhysicsSphericalJoint.
Thank you in advance for your assistance.
TL;DR: RealityKit and Reality Composer Pro aren't forward or backward compatible with each other, and the resulting error message is terse and unhelpful. (FB14828873)
So far, I've been sticking with Xcode 16.4 for development and only using Xcode 26.0 beta experimentally.
Yesterday, I used xcode-select to switch to Xcode 26.0 beta 3 to test it, but I forgot to switch back.
Consequently, this morning I unintentionally used the future Reality Composer Pro (the version included with Xcode 26) to make a small change to a USD file.
Now I realize that, if I'm unlucky, Reality Composer Pro may have silently introduced a small change into the USD file that makes RealityKit fail to read the file on iOS 18 and visionOS 2. In the past, this kind of problem has cost hours of debugging to track down the source of the failure, often a single line in the USD file that RealityKit can't communicate to me other than with the error "the operation couldn't be completed".
As an analogy, this situation is as if, during regular development (not involving Reality Composer Pro), Xcode didn't warn you about specific API version conflicts, but instead failed with a generic error message, without highlighting the line in your Swift file that was the source of the error.
I am using Unreal Engine 5.6 on a MacBook Pro with an M3 chip and macOS 15.5. I've installed Xcode and accepted the license, but Unreal is not detecting the latest Metal Shader Standard (Metal v3.0). The maximum version Unreal sees is Metal v2.4, even though the hardware and OS should support Metal 3.0. I've also run sudo xcode-select -s /Applications/Xcode.app and accepted the license via Terminal. Is there anything in Xcode settings, SDK availability, or system permissions that could be preventing access to Metal 3.0 features?
Hi,
My application hits the crash backtrace below at a very low reproduction rate among public users; I don't see it correlate with a specific iOS version or iPhone model. The last line from my own code in the trace is a call to the CAMetalLayer nextDrawable API.
From some basic investigation, I suspect it may relate to a wrong CAMetalLayer configuration, such as:
frame property w or h <= 0.0
bounds property w or h <= 0.0
drawableSize w or h <= 0.0, or w or h > the maximum (like 16384)
Is my thinking above right? Could the UIView that my CAMetalLayer is attached to cause such a nextDrawable crash?
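In the meantime, I'm considering a defensive guard along these lines. This is only a sketch based on my assumption above, not a confirmed fix:

import QuartzCore

// Sketch: skip frames whose drawable size looks invalid, on the assumption that
// a zero or oversized drawableSize is what trips Metal's texture validation.
func safeNextDrawable(from layer: CAMetalLayer) -> CAMetalDrawable? {
    let size = layer.drawableSize
    let maxDimension: CGFloat = 16384  // assumed maximum texture dimension
    guard size.width >= 1, size.height >= 1,
          size.width <= maxDimension, size.height <= maxDimension else {
        return nil  // layer not laid out yet or misconfigured; try again next frame
    }
    return layer.nextDrawable()
}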
Thanks a lot
Main Thread - Crashed
libsystem_kernel.dylib   __pthread_kill
libsystem_c.dylib        abort
libsystem_c.dylib        __assert_rtn
Metal                    MTLReportFailure.cold.1
Metal                    MTLReportFailure
Metal                    _MTLMessageContextEnd
Metal                    -[MTLTextureDescriptorInternal validateWithDevice:]
AGXMetalA13              0x245b1a000 + 4522096
QuartzCore               allocate_drawable_texture(id<MTLDevice>, __IOSurface*, unsigned int, unsigned int, MTLPixelFormat, unsigned long long, CAMetalLayerRotation, bool, NSString*, unsigned long)
QuartzCore               get_unused_drawable(_CAMetalLayerPrivate*, CAMetalLayerRotation, bool, bool)
QuartzCore               CAMetalLayerPrivateNextDrawableLocked(CAMetalLayer*, CAMetalDrawable**, unsigned long*)
QuartzCore               -[CAMetalLayer nextDrawable]
SpaceApp                 -[MetalRender renderFrame:] MetalRenderer.mm:167
SpaceApp                 -[FrameBuffer acceptFrame:] VideoRender.mm:173
QuartzCore               CA::Display::DisplayLinkItem::dispatch_(CA::SignPost::Interval<(CA::SignPost::CAEventCode)835322056>&)
QuartzCore               CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long)
QuartzCore               CA::Display::DisplayLink::dispatch_deferred_display_links(unsigned int)
UIKitCore                _UIUpdateSequenceRun
UIKitCore                schedulerStepScheduledMainSection
UIKitCore                runloopSourceCallback
CoreFoundation           __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__
CoreFoundation           __CFRunLoopDoSource0
CoreFoundation           __CFRunLoopDoSources0
CoreFoundation           __CFRunLoopRun
CoreFoundation           CFRunLoopRunSpecific
GraphicsServices         GSEventRunModal
UIKitCore                -[UIApplication _run]
UIKitCore                UIApplicationMain
What is Game Mode?
Game Mode optimizes your gaming experience by giving your game the highest priority access to your CPU and GPU, lowering usage for background tasks. And it doubles the Bluetooth sampling rate, which reduces input latency and audio latency for wireless accessories like game controllers and AirPods.
See Use Game Mode on Mac
See Port advanced games to Apple platforms
How can I enable Game Mode in my game?
Add the Supports Game Mode property (GCSupportsGameMode) to your game's Info.plist and set it to true
Correctly identify your game's Application Category with LSApplicationCategoryType (also in Info.plist); see the example after the notes below
Note:
Enabling Game Mode makes your game eligible but is not a guarantee; the OS decides at runtime whether to enable Game Mode
An app that enables Game Mode but isn’t a game will be rejected by App Review.
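For reference, a minimal sketch of those Info.plist entries; the category value here is just one example:

<key>GCSupportsGameMode</key>
<true/>
<key>LSApplicationCategoryType</key>
<string>public.app-category.games</string>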
How can I disable Game Mode?
Set GCSupportsGameMode to false.
Note: On Mac, Game Mode is automatically disabled if the game isn't running full screen.
Hello,
Thank you for attending today’s Metal & game technologies group lab at WWDC25!
We were delighted to answer many questions from developers and energized by the community engagement.
We hope you enjoyed it and welcome your feedback.
We invite you to carry on the conversation here, particularly if your question appeared in Slido and we were unable to answer it during the lab.
If your question received feedback, let us know if you need clarification.
You may want to ask your question again in a different lab, e.g. the visionOS lab tomorrow.
(We realize that this can be confusing when frameworks interoperate)
We have a lot to learn from each other so let’s get to Q&A and make the best of WWDC25! 😃
Looking forward to your questions posted in new threads.
Hi everyone,
We have encountered a Metal-related crash. The app uses Metal APIs, and during performance testing we enabled Settings > Developer > Show HUD Graphics. After launching the app, the HUD displayed normally, but the app crashed shortly afterward. The key information from the log is as follows:
Incident Identifier: 1F093635-2DB8-4B29-9DA5-488A6609277B
CrashReporter Key: 233e54398e2a0266d95265cfb96c5a89eb3403fd
Hardware Model: iPhone14,3
Process: waimai [16584]
Path: /private/var/containers/Bundle/Application/CCCFC0AE-EFB8-4BD8-B674-ED089B776221/waimai.app/waimai
Identifier:
Version: 61488 (8.53.0)
Code Type: ARM-64
Parent Process: ? [1]
Date/Time: 2025-06-12 14:41:45.296 +0800
OS Version: iOS 18.0 (22A3354)
Report Version: 104
Monitor Type: Mach Exception
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x000000014fffae00
Crashed Thread: 57
Thread 57 Crashed:
0 libMTLHud.dylib esfm_GenerateTriangesForString + 408
1 libMTLHud.dylib esfm_GenerateTriangesForString + 92
2 libMTLHud.dylib Renderer::DrawText(char const*, int, unsigned int) + 204
3 libMTLHud.dylib Overlay::onPresent(id<CAMetalDrawable>) + 1656
4 libMTLHud.dylib CAMetalDrawable_present(void (*)(), objc_object*, objc_selector*) + 72
5 libMTLHud.dylib invocation function for block in void replaceMethod<void>(objc_class*, objc_selector*, void (*)(void (*)(), objc_object*, objc_selector*)) + 56
6 Metal __45-[_MTLCommandBuffer presentDrawable:options:]_block_invoke + 104
7 Metal MTLDispatchListApply + 52
8 Metal -[_MTLCommandBuffer didScheduleWithStartTime:endTime:error:] + 312
9 IOGPU IOGPUNotificationQueueDispatchAvailableCompletionNotifications + 136
10 IOGPU __IOGPUNotificationQueueSetDispatchQueue_block_invoke + 64
11 libdispatch.dylib _dispatch_client_callout4 + 20
12 libdispatch.dylib _dispatch_mach_msg_invoke + 464
13 libdispatch.dylib _dispatch_lane_serial_drain + 368
14 libdispatch.dylib _dispatch_mach_invoke + 456
15 libdispatch.dylib _dispatch_lane_serial_drain + 368
16 libdispatch.dylib _dispatch_lane_invoke + 432
17 libdispatch.dylib _dispatch_lane_serial_drain + 368
18 libdispatch.dylib _dispatch_lane_invoke + 380
19 libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh + 288
20 libdispatch.dylib _dispatch_workloop_worker_thread + 540
21 libsystem_pthread.dylib _pthread_wqthread + 288
We tested several different device models, and only the iPhone 13 Pro Max crashes.
Q1: Why does this crash occur?
Q2: Given identical logic, why does the crash occur only on the iPhone 13 Pro Max?
Looking forward to your answers.
Description:
In the official visionOS 26 Hover Effect sample code project, I encountered an issue where the event.trackingAreaIdentifier returned by onSpatialEvent does not reset as expected.
Steps to Reproduce:
Select an object with trackingAreaID = 6 in the sample app.
Look at a blank space (outside any tracking area) and perform a pinch gesture.
Expected Behavior:
The event.trackingAreaIdentifier should return 0 when interacting with a non-tracking area.
Actual Behavior:
The event.trackingAreaIdentifier still returns 6, even after restarting the app or killing the process. This persists regardless of where the pinch gesture is performed.
Hello,
I'm getting this error when launching a SpriteKit Swift game in iOS 18+ on an iPhone 11 Pro, whose shell is partly damaged in the back:
CHHapticEngine.mm:1206 -[CHHapticEngine doStartWithCompletionHandler:]_block_invoke: ERROR: Player start failed: The operation couldn’t be completed. (com.apple.CoreHaptics error 1852797029.)
Haptics do not work on this device because of the damaged shell, so some error when calling start(completionHandler:) is definitely expected. What is not expected is that the main thread sometimes blocks for up to 5 seconds, even though the method is not called from the main thread; the error itself is always displayed from some other secondary (system) thread. During this time, the main thread does not access the haptics engine at all. On average, it blocks once every four or five launches. In each launch (blocking or not), the 'nope' error is displayed about 5 seconds after trying to start the engine.
After going nuts with all kinds of breakpoints and instrumentation, I'm at a loss as to why the main thread would sometimes block...
Ideas, anyone?
Thank you,
D.
I want to use RealityKit to create a custom material that can use my own shader and support mesh instancing (for rendering 3D Gaussian splatting), but I found that CustomMaterial is not supported on visionOS. Is there any other interface that can achieve this? Where can I find examples?
Topic:
Graphics & Games
SubTopic:
RealityKit
When I take a frame capture of my application in Xcode, it shows a warning that reads "Your application created separate command encoders which can be combined into a single encoder. By combining these encoders you may reduce your application's load/store bandwidth usage."
In the minimal reproduction case I've identified for this warning, I have two render pipeline states: The first writes to the current drawable, the depth buffer, and a secondary color buffer. The second writes only to the current drawable.
Because these are writing to a different set of outputs, I was initially creating two separate render command encoders to handle the draws under each of these states.
My understanding is that Xcode is telling me I could create just one. However, when I try that, I get runtime asserts when attempting to apply the second render pipeline state, since it doesn't have a matching attachment configured for the second color buffer or for the depth buffer, so I can't simply combine the encoders.
Is the only solution here to detect and propagate forward the color/depth attachments from the first state into the creation of the second state?
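That's the direction I've been experimenting with. Here's a sketch of the idea, assuming write masks are the intended mechanism; the shader function names and pixel formats are placeholders for my actual setup:

import Metal

// Sketch: give the second pipeline the same attachment layout as the shared
// render pass, but mask out writes to the attachments it doesn't actually use.
func makeSecondPipelineState(device: MTLDevice, library: MTLLibrary) throws -> MTLRenderPipelineState {
    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "secondPassVertex")
    descriptor.fragmentFunction = library.makeFunction(name: "secondPassFragment")

    // Match every attachment of the shared render pass.
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm   // drawable
    descriptor.colorAttachments[1].pixelFormat = .rgba16Float  // secondary color buffer (assumed format)
    descriptor.depthAttachmentPixelFormat = .depth32Float

    // The second pass only writes the drawable, so disable writes elsewhere.
    descriptor.colorAttachments[1].writeMask = []

    // Pair draws using this pipeline with a depth-stencil state whose
    // isDepthWriteEnabled is false so the depth attachment is left untouched.
    return try device.makeRenderPipelineState(descriptor: descriptor)
}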
Is there any way to suppress this specific warning in Xcode?
Topic:
Graphics & Games
SubTopic:
Metal
Does anyone have a working example of how to play OGG files with Swift?
I've been trying for over a year now. I was able to wrap the C Vorbis library in Swift and used it to parse an OGG file successfully. Then I had to use Objective-C++ to fill the PCM buffer, because this part only seems to be available in C++; that step hangs my app for a good 40 seconds to several minutes depending on the audio file, then it plays for about 2 seconds and crashes.
I can't get the examples on the Vorbis site to work in Objective-C, and I've tried every example on GitHub I could find (most of which are for iOS; I want to play the files on Mac).
I also tried using the Cricket Audio framework below.
https://github.com/sjmerel/ck
It has a Swift example and can play its proprietary soundbank format. It is also supposed to play OGG, but it just doesn't do anything when trying to, as you can see in the posted issue:
https://github.com/sjmerel/ck/issues/3
Right now I believe every player that can play OGGs on Mac is written in Objective-C or C++.
Anyway, any help or advice is appreciated. The OGG format is very prevalent in the gaming community. I could use Unity, which I believe plays OGGs through the Mono framework, but I really, really want to stay in Swift.
The sample code here has code like:
// Create a display link capable of being used with all active displays
cvReturn = CVDisplayLinkCreateWithActiveCGDisplays(&_displayLink);
But that function's doc says it's deprecated, and to use the NSView/NSWindow/NSScreen displayLink instead. That returns a CADisplayLink, not a CVDisplayLink.
Also, the documentation for that displayLink method is completely empty. I'm not sure if I'm supposed to add it to a run loop, or what, after I get it.
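For what it's worth, here's the pattern I've been trying. This is just a sketch assuming the CADisplayLink returned by the macOS 14+ NSView.displayLink(target:selector:) still needs to be scheduled on a run loop, like its iOS counterpart:

import AppKit
import QuartzCore

final class RenderView: NSView {
    private var displayLink: CADisplayLink?

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        displayLink?.invalidate()
        guard window != nil else { return }
        // NSView.displayLink(target:selector:) is available on macOS 14+.
        let link = self.displayLink(target: self, selector: #selector(step(_:)))
        // Assumption: the returned link must still be added to a run loop.
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func step(_ link: CADisplayLink) {
        // Drive per-frame rendering here.
    }
}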
It would be nice to get an updated version of this sample project and/or have some documentation in NSView.displayLink
Topic:
Graphics & Games
SubTopic:
Metal
Hello
I would like to know how to combine two animations with RealityKit (one animation for the arms and one for the legs, for example).
I saw this Apple demo that seems to explain it, but I don't understand at all how to do it...
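The closest I've found is AnimationResource.group(with:), which, if I understand it correctly, plays a collection of clips concurrently. A sketch of what I'm imagining, assuming the two clips target different joints of the same skeleton:

import RealityKit

// Sketch: combine an arms clip and a legs clip so they play simultaneously.
// Assumes armsAnimation and legsAnimation are AnimationResources whose bind
// targets address disjoint joints of the same model.
func playCombinedAnimation(on entity: Entity,
                           armsAnimation: AnimationResource,
                           legsAnimation: AnimationResource) throws {
    let combined = try AnimationResource.group(with: [armsAnimation, legsAnimation])
    entity.playAnimation(combined, transitionDuration: 0.3, startsPaused: false)
}

Is this the right approach, or does grouping not blend per-joint the way I'm hoping?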
Thanks
I'm new to the Mac side of things, but definitely not to UE. On Windows, packaging is a long process, but it can be done. The documentation from Epic and across the internet about exactly how to package a project within UE on Mac is basically nonexistent. I have Xcode installed, which makes sense, agreed to the terms and installed for macOS, and I've been able to work on a project for several weeks now; I want to package a test build for my friends to play on Windows. Now I just get this in the log:
UATHelper: Packaging (Mac): ERROR: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Trace written to file /Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.uba with size 12.6kb
UATHelper: Packaging (Mac): Total time in Unreal Build Accelerator local executor: 8.12 seconds
UATHelper: Packaging (Mac): Result: Failed (OtherCompilationError)
UATHelper: Packaging (Mac): Total execution time: 9.71 seconds
PackagingResults: Error: Failed to finalize the .app with Xcode. Check the log for more information
UATHelper: Packaging (Mac): Took 9.77s to run dotnet, ExitCode=6
UATHelper: Packaging (Mac): UnrealBuildTool failed. See log for more details. (/Users/rileysleger/Library/Logs/Unreal Engine/LocalBuildLogs/UBA-ProjectNightTerror-Mac-Development.txt)
UATHelper: Packaging (Mac): AutomationTool executed for 0h 0m 10s
UATHelper: Packaging (Mac): AutomationTool exiting with ExitCode=6 (6)
UATHelper: Packaging (Mac): RunUAT ERROR: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: AutomationTool was unable to run successfully. Exited with code: 6
PackagingResults: Error: Unknown Error
This absolutely makes no sense to me. Anyone have ideas?
Hi, following the recent deprecation of SceneKit, I'm trying to move a couple of my SceneKit projects to RealityKit.
One thing I can't seem to find is how to change the content scale factor when using a RealityView in SwiftUI. It was really easy to do in SceneKit with just an SCNView property, and it seems it's also possible when using ARView, but I can't find a way to do it with a RealityView. Maybe it's a SwiftUI limitation?
Topic:
Graphics & Games
SubTopic:
RealityKit
Is there any limitation in Vision Pro when loading scenes with large-scale models?
Test Case:
Asset: composite USDA file containing 10 individual models (total triangle count: ~4.2M)
Simulator: Loads and renders correctly
Real Device:
Loads the asset successfully, but fails during the rendering phase:
Environment abruptly dims
System spontaneously reboots
How can we resolve this issue?
Below are excerpted logs preceding the crash:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
Attempted to add ornament: <MRUIPlatterOrnament: 0x10a658f00; _isInternal: YES; _displaceWindowChrome: NO; _canCaptureUI: NO; _isBeingRemoved: NO; contentAnchorPoint3D: "{0.5, 0.5, 0}"; position: <MRUIPlatterOrnamentRelativePosition: 0x105b68e70; anchorPoint: {0.5, 0.5, 1}>; rotation: "{{0, 0, 0}, 0}"; opacity: 1.000000; canFollowUser: YES; effectiveOffset: "{0, 0, 0}"; presentingViewController: 0x0; billboardingBehavior: 0x0; scalingBehavior: 0x0; relativeToParent: NO; nonHeritableDepthDisplacement: 0.000000; order: 0.000000; _window._determinedSize: {0, 0}; _window: (null)> to nil or non-supporting UIScene: <UIWindowScene: 0x10a8a0000; role: UISceneSessionRoleImmersiveSpaceApplication; persistentIdentifier: test.test:SFBSystemService-BA3A21A3-D1AB-42E2-8AF0-AE0AB83BE528; activationState: UISceneActivationStateUnattached>. No action taken.
Failed to set dependencies on asset 2823930584475958382 because NetworkAssetManager does not have an asset entity for that id.
apply fence tx failed (client=0x98490e18) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xa86516e2) [0x10000003 (ipc/send) invalid destination port]
Hello experts, I'm trying to implement a material with custom shader code, but I saw that visionOS doesn't allow you to inject custom Metal functions or use CustomMaterial like iOS/macOS, nor can you directly write Metal Shading Language (.metal) and use it through ShaderGraphMaterial. So my question is: if I want to implement my own shader code, how should I do it?
I am an AR developer working on Apple Silicon Macs. Currently, Reality Composer Pro does not allow exporting .reality files, and Reality Composer (classic) is not available for Apple Silicon. This creates a gap in the workflow for ARKit/RealityKit developers who need interactive .reality files for use in Xcode projects.
Having the ability to export .reality files directly from Reality Composer Pro on Mac would greatly streamline development and enable a fully native workflow on modern Macs. Alternatively, bringing Reality Composer (classic) to Apple Silicon would also resolve this issue.
I have submitted this as a feature request via Feedback Assistant (FB17900386). I encourage others with similar needs to reply or submit feedback as well.
Thank you!
Topic:
Graphics & Games
SubTopic:
RealityKit
Tags:
ARKit
Reality Composer
RealityKit
Reality Composer Pro