Discuss Spatial Computing on Apple Platforms.

Posts under the General subtopic. Each post is followed by its reply, boost, and view counts and its latest activity date.

Disable Object Occlusion on iOS
I’m developing an app using RealityKit and RealityView. On newer iPhones, such as the iPhone 15 Pro, object occlusion appears to be enabled by default, which causes my 3D entities to be hidden behind real-world objects in the scene. I need to disable this behavior to ensure my 3D content renders correctly. The issue does not occur on older devices like the iPhone 13, where the app works as intended. I haven’t found a way to explicitly disable object occlusion for RealityView on the newer devices. Any guidance or suggestions would be greatly appreciated. Thanks!
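For reference, the pre-RealityView API had an explicit switch for this; a minimal sketch of the ARView-era workaround (what I'm looking for is the RealityView equivalent):

import RealityKit
import UIKit

// ARView exposes scene understanding options directly; removing
// .occlusion stops real-world geometry from hiding virtual content.
let arView = ARView(frame: .zero)
arView.environment.sceneUnderstanding.options.remove(.occlusion)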
1 reply · 1 boost · 570 views · Dec ’24
Test Vision Pro App on Vision Pro
This might be a very silly question; I tried many approaches and didn't find a solution. My Mac mini M4 is the base model with only 16 GB of memory, which is very tight for developing a Vision Pro application and testing in the simulator (CleanMacX keeps warning me about low memory). I want to debug and test the application directly on the Vision Pro instead of the simulator (my app also needs two-handed gestures, which the simulator might not support). Are there proper instructions on how to test/debug/run the app directly on the Vision Pro device instead of the simulator, in favor of speed?
1 reply · 0 boosts · 486 views · Dec ’24
Reality Composer Pro timelines management, but using code
Is it possible to manage timeline behavior entirely from code? I am exploring the Compose interactive 3D content in Reality Composer Pro sample project after watching the related video, but the example only uses Behaviors in RCP to trigger timeline actions. I was wondering whether it is possible to retrieve some kind of timeline controller that gives me access to its state, the way AnimationPlaybackController does for single animations. What I would like to achieve is the ability to play/pause a timeline and read its timestamp, so I can synchronize it between different users over SharePlay.
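The closest I have found is triggering a timeline from code through a Notification behavior set up in RCP, roughly like this (the identifier string is a placeholder for whatever is set on the behavior, and this only fires the timeline; it still gives me no controller to pause or scrub):

import Foundation
import RealityKit

// Posts the notification that a Reality Composer Pro "Notification"
// trigger listens for; "StartTimeline" is a placeholder identifier.
func fireTimeline(in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": "StartTimeline"
        ]
    )
}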
1 reply · 0 boosts · 585 views · Oct ’24
[Vision, visionOS] Is it possible to use the Vision framework on visionOS for body tracking?
Hello, I checked the following documentation: Vision | Apple Developer Documentation, and Discover Swift enhancements in the Vision framework - WWDC24 - Videos - Apple Developer. I saw that the Vision framework is available on visionOS, so I want to know whether it can be used on visionOS to track human and animal body poses, or whether there are limits to using it there.
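For concreteness, this is the kind of request I have in mind; the API compiles for visionOS, but whether it returns useful results there is exactly my question (the CGImage source is a stand-in, since live camera access is restricted on visionOS):

import Vision
import CoreGraphics

// Runs a human body pose request on a still image.
func detectBodyPose(in image: CGImage) throws -> [VNHumanBodyPoseObservation] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results ?? []
}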
1 reply · 0 boosts · 474 views · Jan ’25
How to handle tasks when the Vision Pro is taken off?
I have a gRPC server running inside a task. When the user takes the headset off and puts it back on, the gRPC server no longer works. I would like to detect this event so that I can cancel the task (which effectively closes the gRPC server). I also use a visual indicator to show whether the server is running, but it does not accurately reflect the server's state after the headset is removed and put back on.
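A sketch of what I am trying, assuming that taking the headset off drives scenePhase out of .active (runGRPCServer is a stand-in for my actual server code):

import SwiftUI

struct ServerHostView: View {
    @Environment(\.scenePhase) private var scenePhase
    @State private var serverTask: Task<Void, Never>?

    var body: some View {
        Text(serverTask == nil ? "Server stopped" : "Server running")
            .onChange(of: scenePhase) { _, newPhase in
                if newPhase == .active {
                    // Restart the server when the scene becomes active again.
                    serverTask = Task { await runGRPCServer() }
                } else {
                    serverTask?.cancel()
                    serverTask = nil
                }
            }
    }

    // Placeholder for the real gRPC server loop.
    func runGRPCServer() async {}
}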
1 reply · 0 boosts · 275 views · Mar ’25
I want to know the update principle of RealityView.
Here are the code snippets.

struct RealityViewTestView: View {
    @State private var texts: [String] = []

    var body: some View {
        RealityView { content, attachments in
        } update: { content, attachments in
            for text in texts {
                if let textEntity = attachments.entity(for: text) {
                    textEntity.position.x = Float.random(in: -0.1...0.1)
                    content.add(textEntity)
                }
            }
        } attachments: {
            ForEach(texts, id: \.self) { text in
                Attachment(id: text) {
                    Text(text)
                        .padding()
                        .glassBackgroundEffect()
                }
            }
        }
        .toolbar {
            ToolbarItem {
                Button("Add") {
                    texts.append(String(UUID().uuidString.prefix(6)))
                }
            }
            ToolbarItem {
                Button("Remove") {
                    texts.remove(at: Int.random(in: 0..<texts.count))
                }
            }
        }
    }
}

struct RealityViewTestView: View {
    @State private var texts: [String] = []
    @State private var entities: [Entity] = []

    var body: some View {
        RealityView { content, attachments in
        } update: { content, attachments in
            // for text in texts {
            //     if let textEntity = attachments.entity(for: text) {
            //         textEntity.position.x = Float.random(in: -0.1...0.1)
            //         content.add(textEntity)
            //     }
            // }
            for entity in entities {
                content.add(entity)
            }
        } attachments: {
            ForEach(texts, id: \.self) { text in
                Attachment(id: text) {
                    Text(text)
                        .padding()
                        .glassBackgroundEffect()
                }
            }
        }
        .toolbar {
            ToolbarItem {
                Button("Add") {
                    // texts.append(String(UUID().uuidString.prefix(6)))
                    let m = ModelEntity(
                        mesh: .generateSphere(radius: 0.1),
                        materials: [SimpleMaterial(color: .white, isMetallic: false)]
                    )
                    m.position.x = Float.random(in: -0.2...0.2)
                    entities.append(m)
                }
            }
            ToolbarItem {
                Button("Remove") {
                    // texts.remove(at: Int.random(in: 0..<texts.count))
                    entities.removeLast()
                }
            }
        }
    }
}

In the first snippet, when I remove an element from texts, why does content automatically remove the corresponding attachment entity? In the second snippet, content does not automatically remove the corresponding entity. I am very curious about the difference.
1 reply · 0 boosts · 412 views · Jan ’25
Missing Properties in BillboardComponent
In an earlier beta, BillboardComponent had rotationAxis and upDirection properties that allowed fine-grained control over how an entity rotates toward the camera. Currently, it is only possible to orient the entity's z axis. Looking at the robot in the documentation, rotating its z axis causes its feet to lift off the ground. Previously, it was possible to restrict the rotation to one axis (y, for example) so that the robot's feet stayed on the ground:

billboard.upDirection = [0, 1, 0]
billboard.rotationAxis = [0, 1, 0]

Is there an alternative way to achieve this? Are these properties (or something similar) coming back?
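If the properties are gone for good, I imagine a custom System could approximate the old y-axis-only behavior; an untested sketch (the camera position is hard-coded here, and both types would need to be registered with registerComponent()/registerSystem()):

import RealityKit

// Hypothetical marker component for entities that should face the camera
// while rotating around the y axis only.
struct YBillboardComponent: Component {}

struct YBillboardSystem: System {
    static let query = EntityQuery(where: .has(YBillboardComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // Assumption: the camera position would come from ARKit's
        // DeviceAnchor on visionOS; a fixed head height stands in here.
        let cameraPosition = SIMD3<Float>(0, 1.5, 0)
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            let position = entity.position(relativeTo: nil)
            var target = cameraPosition
            target.y = position.y  // ignore pitch so feet stay planted
            entity.look(at: target, from: position, relativeTo: nil)
        }
    }
}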
1 reply · 0 boosts · 278 views · Mar ’25
RealityView Limits on visionOS
I'm wondering what the limits of RealityView are. For example, is it possible to place an entity at position (x = 800 m, y = 0, z = -900 m)? What happens if I walk from my origin (0, 0, 0) toward that point, will I see the entity? Does anyone know where the limits are?
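For concreteness, the placement I have in mind:

import RealityKit

// A 1 m sphere placed 800 m to the right and 900 m ahead of the origin.
let farEntity = ModelEntity(mesh: .generateSphere(radius: 1))
farEntity.position = SIMD3<Float>(800, 0, -900)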
1 reply · 0 boosts · 299 views · Oct ’24
App crashes after requesting PhotoLibrary limited access
My visionOS app requires access to the user's photos. The trigger mechanism is: when the user first opens a FooView, a task attached to that view calls let status = PHPhotoLibrary.authorizationStatus(for: .readWrite); if the status is .notDetermined, it then calls PHPhotoLibrary.requestAuthorization(for: .readWrite, handler: authCompletionHandler) so that visionOS presents a window requesting photo access. However, the app crashes every time the user selects Limited Access and the system tries to present the photo library picker. (By the way, I have set Prevent limited photos access alert to Yes, but I don't think that affects the behavior here.) The debugger shows:

*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Presentations are not permitted within volumetric window scenes.'

However, the window this view belongs to is a .plain style window (though 3D objects appear in another view of the same WindowGroup).

Here is my code snippet in case it helps. checkAndUpdatePhotoAuthorization is just a wrapper around PHPhotoLibrary.authorizationStatus(for: .readWrite):

private func checkAndUpdatePhotoAuthorization() -> PHAuthorizationStatus {
    let currentStatus = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    switch currentStatus {
    case .authorized:
        print("Photo library access authorized.")
        isPhotoGalleryAuthorized = true
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    case .limited:
        print("Photo library access limited.")
        isPhotoGalleryLimited = true
        isPhotoGalleryAuthorized = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    case .notDetermined:
        isPhotoGalleryDetermined = false
        print("Photo library access not determined.")
    case .denied:
        print("Photo library access denied.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        showSettingsAlert = true
        isPhotoGalleryDetermined = true
    case .restricted:
        print("Photo library access restricted.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = true
        showPhotoAuthExplainationAlert = true
        isPhotoGalleryDetermined = true
    @unknown default:
        print("Photo library: unknown authorization status.")
        isPhotoGalleryAuthorized = false
        isPhotoGalleryLimited = false
        isPhotoGalleryAccessRestricted = false
        isPhotoGalleryDetermined = true
    }
    return currentStatus
}

And then FooView attaches a task that fires checkAndUpdatePhotoAuthorization():

var body: some View {
    EmptyView()
        .task {
            try? await Task.sleep(for: .seconds(1.0))
            let status = self.checkAndUpdatePhotoAuthorization()
            if status == .notDetermined {
                DispatchQueue.main.async {
                    PHPhotoLibrary.requestAuthorization(for: .readWrite, handler: authCompletionHandler)
                }
            }
        }
}

Another thing worth mentioning: sometimes it doesn't crash in a debug build, but it does crash on TestFlight. Any other ideas? Big thanks in advance.

Xcode version: 16.2 beta 3
visionOS version: 2.2
1 reply · 0 boosts · 602 views · Jan ’25
Run to Device (on WLAN)?
I was trying to use a local Wi-Fi router to connect the Vision Pro and my Mac for development. From Unity, the host on the Vision Pro wouldn't open; from Xcode I can see the Vision Pro is paired, but no device is listed next to the Run button. Any ideas? Thanks. /Ruiying
1 reply · 0 boosts · 301 views · Jan ’25
Palm Menu Button Issue
Hi, our app has an immersive space, and we thought the palm menu button was unavailable in immersive spaces, but when I look at my hand and tap, the menu button appears. Is it possible to keep it hidden? We have a hand-tracking feature on the palm, and when the user tries to press a button overlapping the palm, it triggers the system menu button; if the user then presses again by mistake, the application is sent to the background. This is very important for us because we would like to release this hand-tracking feature as soon as possible. Here is a link to a video showing the problem: https://drive.google.com/file/d/1cfOcdzF19h_mbmpvkVNCJjXEBJecVeJL/view?usp=sharing
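Would something like persistentSystemOverlays(.hidden) on the immersive content be expected to help here? A sketch of what I mean (HandTrackingView is a stand-in for our actual content, and as I understand it this is only a preference the system may ignore):

import SwiftUI
import RealityKit

@main
struct HandTrackingApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "handTracking") {
            HandTrackingView()
                .persistentSystemOverlays(.hidden)  // a request, not a guarantee
        }
    }
}

// Stand-in for our actual hand-tracking content.
struct HandTrackingView: View {
    var body: some View { RealityView { _ in } }
}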
1 reply · 0 boosts · 604 views · Sep ’24
AudioPlaybackController stop playing when .plain window is closed
Suppose there is an immersive space, and an Entity() is added to the space as a child entity of the content. This entity is responsible for playing background music: it calls prepareAudio, obtains a controller, and plays the music (see the basic code below). While the music plays, a .plain window and the immersive space are both presented. I believe the immersive space holds the handle to the controller, so as long as the immersive space is open, the music shouldn't stop. However, if I close the .plain window (via the system-level close button), the music stops even though the immersive space is still open. If I then check the value of controller.isPlaying, it is still true, but the music is no longer audible.

To reproduce, open a visionOS App template project, select Volume and Full immersive, and replace some code in ImmersiveView.swift with the code below. Drag in any .mp3 file and replace the AudioFileResource's name, and you can reproduce the bug.

RealityView { content in
    // Add the initial RealityKit content
    if let immersiveContentEntity = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
        content.add(immersiveContentEntity)

        // Put skybox here. See example in World project available at
        // https://developer.apple.com/
        if let audioResource = try? await AudioFileResource(named: "anyMP3file.mp3") {
            let ent = Entity()
            immersiveContentEntity.addChild(ent)
            let controller = ent.prepareAudio(audioResource)
            controller.play()
        }
    }
}

I wonder why this happens. How should I keep the music playing when I close the .plain window? Thanks!
1 reply · 1 boost · 503 views · Jan ’25
Spatial streaming from iPhone
Hi, I am trying to stream spatial video in real time from my iPhone 16. I am able to record spatial video to a file using:

let videoDeviceOutput = AVCaptureMovieFileOutput()

However, when I try to grab the raw sample buffers, they don't include any spatial information:

let captureOutput = AVCaptureVideoDataOutput()

// when setting up the camera session
session.addOutput(captureOutput)
captureOutput.setSampleBufferDelegate(self, queue: sessionQueue)

// finally
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // use the sample buffer (but no spatial data is available here)
}

Is this how it's supposed to work, or am I missing something? This video, https://developer.apple.com/videos/play/wwdc2023/10071, gives a clue about setting up spatial streaming, and I have the backend ready for 3D HLS streaming. Now I am only stuck on how to send the video stream to my server.
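For completeness, the file-based recording path that does work looks roughly like this (as I understand the iOS 18 API, spatial capture is currently tied to the movie file output, which would explain why nothing spatial shows up in AVCaptureVideoDataOutput):

import AVFoundation

let session = AVCaptureSession()
let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}
// Only valid when the active device/format supports spatial capture.
if movieOutput.isSpatialVideoCaptureSupported {
    movieOutput.isSpatialVideoCaptureEnabled = true
}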
1 reply · 0 boosts · 768 views · Oct ’24
What's the relation between SwiftUI frame sizes and RealityKit entity sizes?
I want to recreate a window similar to the system window in an ImmersiveSpace, but RealityKit only works in meters. I created a plane entity, and I don't know what size in meters to give it so that it exactly matches the size of a system window. I would also like to know the y and z position of the system window in the immersive space.
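Going the other way, at least, seems possible: as I understand it, @PhysicalMetric converts a physical length in meters into points for the current scene, something like:

import SwiftUI

struct HalfMeterView: View {
    // Points corresponding to 0.5 m in this scene.
    @PhysicalMetric(from: .meters) private var halfMeter = 0.5

    var body: some View {
        Rectangle()
            .frame(width: halfMeter, height: halfMeter)
    }
}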
1 reply · 0 boosts · 322 views · Jan ’25
RealityView Gestures for iOS
I started a new project using RealityKit and RealityView, intended as an AR app on iPhone and iPad, and eventually visionOS as well. I'm finding it challenging because much of the recent documentation, WWDC videos, etc., covers features that are visionOS-only. Right now, I would simply like to create gesture functionality similar to the AR Quick Look defaults: drag to reposition, two fingers to rotate or zoom. In the past, this would be implemented with something like:

arView.installGestures([.all], for: entity)

However, with RealityView I don't know how (or whether it's possible) to access an ARView. I have found this RealityKit doc: https://developer.apple.com/documentation/realitykit/transforming-realitykit-entities-with-gestures. However, many of the features in that article are visionOS-only, and I've found no good documentation on the topic that is specific to, or at least compatible with, iOS. I know reverting to an ARView is an option, but I want to use RealityView if at all possible, as I see it as more forward-looking.
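On visionOS I would write something like the sketch below; my question is whether the same targetedToAnyEntity pattern works in an iOS RealityView (either way, the entity needs collision shapes and an InputTargetComponent for hit testing):

import SwiftUI
import RealityKit

struct DraggableSphereView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.components.set(InputTargetComponent())
            sphere.generateCollisionShapes(recursive: true)
            content.add(sphere)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Move the hit entity to follow the drag.
                    if let parent = value.entity.parent {
                        value.entity.position = value.convert(
                            value.location3D, from: .local, to: parent)
                    }
                }
        )
    }
}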
1 reply · 0 boosts · 373 views · Jan ’25