Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

Posts under the Spatial Computing topic (each entry lists replies · boosts · views · latest activity):

Entity HoverEffect Fired When inside Another Entity Collider
Hi folks, I’m new to the Vision Pro stack and still learning its nuances. Here is a problem I can’t seem to find an answer to. I placed Entity A (a small sphere, radius 0.02) inside Entity B (a box, size 0.1). Both entities have a HoverEffectComponent, and both have their input component set to .direct. Entity A is NOT a child of Entity B. When I direct-touch Entity B, I noticed that Entity A’s hover effect fires as well. This only happens when Entity A’s position is inside Entity B. A gesture targeted only at Entity A doesn’t work either. I double-checked Entity A’s collider, which sits inside Entity B’s collider; my direct touch shouldn’t have triggered its hover effect. Does having one collider inside another produce unpredictable behavior? Thanks in advance 🙏🙏🙏

Context: I’m trying to create an invisible bound around Entity A, so that when my hand approaches the bound to grab Entity A, a nice spotlight hover effect fires on the bound before my hand reaches Entity A.
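For reference, a minimal sketch of the overlapping-collider setup described above (the component wiring is inferred from the post; sizes are the ones given, and the spotlight style is omitted):

    import RealityKit

    // Reproduces the post's setup: two independent entities whose colliders overlap.
    func makeOverlappingEntities() -> (inner: Entity, outer: Entity) {
        // Entity B: a 0.1 m box that accepts direct touch and shows a hover effect.
        let entityB = Entity()
        entityB.components.set(CollisionComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])]))
        entityB.components.set(InputTargetComponent(allowedInputTypes: .direct))
        entityB.components.set(HoverEffectComponent())

        // Entity A: a sphere with radius 0.02, positioned inside B's collider
        // but deliberately NOT added as a child of B.
        let entityA = Entity()
        entityA.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.02)]))
        entityA.components.set(InputTargetComponent(allowedInputTypes: .direct))
        entityA.components.set(HoverEffectComponent())
        entityA.position = entityB.position // the colliders now overlap

        return (entityA, entityB)
    }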
2 replies · 0 boosts · 349 views · Feb ’25
CompositorServices or RealityKit
I have been concentrating on developing a visionOS application. While I am currently quite familiar with RealityKit, CompositorServices has also captured my attention, though I have not yet learned it. Could you please clarify whether it is essential for me to learn CompositorServices? Additionally, I would appreciate insights into the respective advantages of RealityKit and CompositorServices.
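For context, CompositorServices is the path for driving a fully immersive space with your own Metal renderer instead of RealityKit’s. A minimal sketch of the entry point, assuming the CompositorLayer API; RenderEngine is a hypothetical stand-in for your own render loop:

    import SwiftUI
    import CompositorServices

    struct ContentStageConfiguration: CompositorLayerConfiguration {
        func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                               configuration: inout LayerRenderer.Configuration) {
            // Accept the defaults; customize pixel formats or foveation here if needed.
        }
    }

    @main
    struct MetalImmersiveApp: App {
        var body: some Scene {
            ImmersiveSpace(id: "ImmersiveSpace") {
                // CompositorLayer hands you a LayerRenderer; from here on you own
                // frame pacing and drawing, typically on a dedicated thread.
                CompositorLayer(configuration: ContentStageConfiguration()) { layerRenderer in
                    let renderThread = Thread {
                        RenderEngine(layerRenderer).run() // hypothetical Metal render loop
                    }
                    renderThread.name = "Render Thread"
                    renderThread.start()
                }
            }
        }
    }

In short: RealityKit gives you the system renderer, hover effects, and shared-space support for free, while CompositorServices trades all of that for full per-frame control.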
2 replies · 0 boosts · 771 views · Mar ’25
Look to Scroll
Hello! I’m excited to see that Look to Scroll has been included in the visionOS 26 beta. I’m aiming to build a feature where the user’s gaze at a specific edge automatically scrolls to that position. However, I’ve experimented with ScrollView and haven’t been able to trigger this functionality. Could you advise whether additional API modifiers are necessary? Thank you!
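For reference, gaze-driven scrolling appears to be opt-in per scroll view; a minimal sketch, assuming the scrollInputBehavior(_:for:) modifier introduced with visionOS 26:

    import SwiftUI

    struct LookToScrollExample: View {
        var body: some View {
            ScrollView {
                LazyVStack(alignment: .leading) {
                    ForEach(0..<50) { index in
                        Text("Row \(index)")
                    }
                }
            }
            // Opt in to Look to Scroll: gazing near the scroll view's edge
            // scrolls the content toward that edge.
            .scrollInputBehavior(.enabled, for: .look)
        }
    }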
1 reply · 0 boosts · 559 views · Jul ’25
Accessing pupil diameter in visionOS
Previously I developed software using SMI eye trackers, both screen-mounted and their mobile glasses, for unique therapeutic and physiology applications. Sadly, after SMI was bought by Apple, their hardware and software were taken off the market, and it is now very difficult to find secondhand systems. The Apple Vision Pro integrates the SMI hardware. While I can use ARKit to get gaze position, I do not see a way to access information that was previously available on the SMI hardware, particularly dwell time and pupil diameter. I am hopeful (or asking) that, if a user has a properly set up Optic ID and opts in, it might be possible on the present or a future version of visionOS to access the data streams for dwell times and pupil diameter. Pupil diameter is particularly important because it is a very good physiological measure of how much stress a person is encountering, which is critical to some of the therapeutic applications for which we formerly used SMI hardware. Any ideas, or, if this is not possible, proposing this to the visionOS team, would be appreciated!
2 replies · 0 boosts · 251 views · Jul ’25
ShaderGraphMaterial on entity
Hi, I am trying to make a 360° stereo viewer, and I have made a ShaderGraphMaterial in Reality Composer Pro. I’m trying to use that material on an inverted sphere which is generated in Swift. When I try to attach the material I get this error: "Type of expression is ambiguous without a type annotation". Here is the code (sorry, I’m a noob =) ):

    import SwiftUI
    import RealityKit
    import RealityKitContent
    import PhotosUI

    struct ImmersiveView: View {
        @Environment(AppModel.self) var appModel

        var body: some View {
            RealityView { content in
                // Add the initial RealityKit content
                guard let skyBoxEntity = await createSkybox() else { return }
                content.add(skyBoxEntity)
            }
        }
    }

    private func createSkybox() async -> Entity? {
        var matX = try? await ShaderGraphMaterial(named: "/Root/Mat_Stereo360", from: "360Stereo.usda", in: realityKitContentBundle)
        let sphere = await MeshResource.generateSphere(radius: 1000)
        let entity = await Entity()
        entity.components.set(ModelComponent(mesh: sphere, materials: [matX])). // ERROR HERE: Type of expression is ambiguous without a type annotation
        //entity.scale *= .init(x: -1, y: 1, z: 1)
        return entity
    }

I hope someone can help me =) Best regards, Kim
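A likely fix, sketched under the assumption that the ambiguity comes from passing the optional produced by try? straight into ModelComponent (note also the stray period after set(...)): unwrap the material first.

    private func createSkybox() async -> Entity? {
        // try? yields ShaderGraphMaterial?; unwrap it before building the
        // component, otherwise the compiler cannot type-check the materials array.
        guard let material = try? await ShaderGraphMaterial(
            named: "/Root/Mat_Stereo360",
            from: "360Stereo.usda",
            in: realityKitContentBundle
        ) else { return nil }

        let sphere = await MeshResource.generateSphere(radius: 1000)
        let entity = await Entity()
        entity.components.set(ModelComponent(mesh: sphere, materials: [material]))
        entity.scale *= .init(x: -1, y: 1, z: 1) // invert the sphere so the material faces inward
        return entity
    }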
2 replies · 0 boosts · 587 views · Jan ’25
WorldAnchors added and removed immediately
I can add a WorldAnchor to a WorldTrackingProvider. The next time I start my app, the WorldAnchor is added back, and then is immediately removed:

    dataProviderStateChanged(dataProviders: [WorldTrackingProvider(0x0000000300bc8370, state: running)], newState: running, error: nil)
    AnchorUpdate(event: added, timestamp: 43025.248134708, anchor: WorldAnchor(id: C0A1AE95-F156-45F5-9030-895CAABF16E9, isTracked: true, originFromAnchorTransform: <translation=(0.048458 0.000108 -0.317565) rotation=(0.00° 15.44° -0.00°)>))
    AnchorUpdate(event: removed, timestamp: 43025.348131208, anchor: WorldAnchor(id: C0A1AE95-F156-45F5-9030-895CAABF16E9, isTracked: false, originFromAnchorTransform: <translation=(0.000000 0.000000 0.000000) rotation=(-0.00° 0.00° 0.00°)>))

It always leaves me with zero anchors in .allAnchors... the ARKitSession is still active at this point.
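For reference, a minimal sketch of the persistence pattern being described (the identity transform is a placeholder):

    import ARKit
    import simd

    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()

    func persistAndObserveAnchors() async throws {
        try await session.run([worldTracking])

        // Persist a new anchor; on later launches it should be re-delivered
        // as an `added` update once the provider relocalizes.
        let anchor = WorldAnchor(originFromAnchorTransform: matrix_identity_float4x4)
        try await worldTracking.addAnchor(anchor)

        for await update in worldTracking.anchorUpdates {
            print(update.event, update.anchor.id, update.anchor.isTracked)
        }
    }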
8 replies · 0 boosts · 741 views · Jan ’25
Person Occlusion on the Vision Pro
I am currently creating an app where two people share an instance of an immersive space so that they can point to certain things in the immersive space. Right now, other people are hidden behind the immersive space, and even with people awareness enabled for everything, people are still too difficult to see. I’ve found this documentation (https://developer.apple.com/documentation/arkit/occluding-virtual-content-with-people), which describes what I want to do, but it is only listed as working on iOS and iPadOS. Is there anything similar to this that will work on visionOS?
2 replies · 0 boosts · 133 views · Mar ’25
Releasing a TextureObject from memory
Hi there! I’m trying to make a 360° image carousel in RealityView/SwiftUI with very large textures. I’ve managed to load one 12K 360° image and show it on an inverted sphere with a ShaderGraphMaterial made in Reality Composer Pro. When I try to load the next image I get an out-of-memory error. The carousel works fine with smaller textures. My question is: how do I release the memory from the current texture before loading the next? In theory the garbage collector should erase it eventually? Hope someone can help =) Thanks in advance! Best regards, Kim
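Worth noting: Swift uses ARC, not a tracing garbage collector, so a texture is freed as soon as nothing holds a strong reference to it, including the material still bound to the sphere. A minimal sketch of swapping textures one at a time; the parameter name "ImageTexture" is an assumption about the Reality Composer Pro graph:

    import RealityKit

    // Rebind the sphere's shader-graph texture so the old TextureResource
    // loses its last strong reference and ARC can free it.
    func swapTexture(on sphere: Entity, toImageNamed name: String) throws {
        guard var model = sphere.components[ModelComponent.self],
              var material = model.materials.first as? ShaderGraphMaterial
        else { return }

        let texture = try TextureResource.load(named: name)
        try material.setParameter(name: "ImageTexture",
                                  value: .textureResource(texture))

        // Writing the mutated material back replaces the old binding.
        model.materials = [material]
        sphere.components.set(model)
    }

If even two 12K textures alive at once exceed memory, bind a small placeholder texture before loading the next large image, so the previous one is released first.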
6 replies · 0 boosts · 708 views · Jan ’25
CameraFrameProvider distortion correction
Hi, I’m trying to correct the lens distortion in frames provided by the Enterprise API camera frame provider. The frames provided seem to carry only intrinsics/extrinsics info, but not the distortion lookup table. Is there some setting or function to do that (I can’t seem to find anything like this)? Or is there a way to use AVCameraCalibrationData together with the provider?
2 replies · 0 boosts · 414 views · Mar ’25
Index out of range crash: internal framework
Hi there, I’m developing a visionOS app that uses the anchor points and mesh from SceneReconstructionProvider anchor updates. I load an ImmersiveSpace using a RealityView, apply a ShaderGraphMaterial (from a Shader Graph in Reality Composer Pro) to the mesh, and call setParameter to dynamically update the material at a very rapid frequency. The mesh is locked (no more updates) before the calls to setParameter. This process works for a few minutes, but eventually I get the following error in the console:

    assertion failure: Index out of range (operator[]:line 789) index = 13662, max = 1

With the following stack trace:

    Thread 1 Queue : com.apple.main-thread (serial)
    #0  0x00000002880f90d0 in __abort_with_payload ()
    #1  0x000000028812a6dc in abort_with_payload_wrapper_internal ()
    #2  0x000000028812a710 in abort_with_payload ()
    #3  0x0000000288003f40 in _os_crash_msg ()
    #4  0x00000001dc9ff624 in re::ecs2::ComponentBucketsBase::addComponent ()
    #5  0x00000001dc9ffadc in re::ecs2::ComponentBucketsBase::moveComponent ()
    #6  0x00000001dc8b0278 in re::ecs2::MaterialParameterBlockArrayComponentStateImpl::processPreparingComponents ()
    #7  0x00000001dc8b05e4 in re::ecs2::MaterialParameterBlockArraySystem::update ()
    #8  0x00000001dd008744 in re::Scheduler::executePhase ()
    #9  0x00000001dc032ec4 in re::Engine::executePhase ()
    #10 0x0000000248121898 in RCPSharedSimulationExecuteUpdate ()
    #11 0x00000002264e488c in __59-[MRUISharedSimulation _doJoinWithConnectionContext:error:]_block_invoke.44 ()
    #12 0x0000000268c5fe9c in _UIUpdateSequenceRunNext ()
    #13 0x00000002696ea540 in schedulerStepScheduledMainSectionContinue ()
    #14 0x000000026af8d284 in UC::DriverCore::continueProcessing ()
    #15 0x00000001a1bd4e6c in CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION ()
    #16 0x00000001a1bd4db0 in __CFRunLoopDoSource0 ()
    #17 0x00000001a1bd44f0 in __CFRunLoopDoSources0 ()
    #18 0x00000001a1bd3640 in __CFRunLoopRun ()
    #19 0x00000001a1bce284 in _CFRunLoopRunSpecificWithOptions ()
    #20 0x00000001eff12d2c in GSEventRunModal ()
    #21 0x00000002697de878 in -[UIApplication _run] ()
    #22 0x00000002697e33c0 in UIApplicationMain ()
    #23 0x00000001b56651e4 in closure #1 (Swift.UnsafeMutablePointer<Swift.Optional<Swift.UnsafeMutablePointer<Swift.Int8>>>) -> Swift.Never in SwiftUI.KitRendererCommon(Swift.AnyObject.Type) -> Swift.Never ()
    #24 0x00000001b5664f08 in SwiftUI.runApp<τ_0_0 where τ_0_0: SwiftUI.App>(τ_0_0) -> Swift.Never ()
    #25 0x00000001b53ad570 in static SwiftUI.App.main() -> () ()
    #26 0x0000000101bc7b9c in static MetalRendererApp.$main() ()
    #27 0x0000000101bc7bdc in main ()
    #28 0x0000000197fd0284 in start ()

Any advice on how to solve this or prevent the error? Thanks!
1 reply · 0 boosts · 303 views · Jul ’25