Discuss Spatial Computing on Apple Platforms.

Post · Replies · Boosts · Views · Activity

visionOS: How to detect whether the user closes a window vs. the window going into the background
For context, we have a fully immersive application running on visionOS. The application starts with a standard view/menu, and when the user selects an option, it takes you into fully immersive mode with a small floating toolbar window that you can move and interact with as you move around the virtual space. When the user taps the small x button below the window, we intercept the scenePhase .background event, exit immersive mode, and display the main menu. This all works fine. The problem happens if the user turns around and doesn't look at the floating window for a couple of minutes. The system decides that the window should go into the background, and the same scenePhase .background event is fired, causing the app to exit immersive mode without warning. There seems to be no way of preventing this, and no way to distinguish it from the user tapping the close button. Is there a reliable way to detect whether the user intentionally tapped the close button vs. the window going into the background through lack of use? onDisappear doesn't trigger. Thanks in advance.
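A minimal sketch of the scene-phase handling described above, assuming a SwiftUI toolbar window and the standard dismissImmersiveSpace environment action (the view name and its content are illustrative):

```swift
import SwiftUI

struct FloatingToolbar: View {
    @Environment(\.scenePhase) private var scenePhase
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Text("Toolbar") // placeholder for the real toolbar content
            .onChange(of: scenePhase) { _, newPhase in
                // Fires both when the user taps the close (x) button and when
                // the system backgrounds the unattended window, so the two
                // cases look identical from this callback alone.
                if newPhase == .background {
                    Task { await dismissImmersiveSpace() }
                }
            }
    }
}
```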
Replies: 1 · Boosts: 0 · Views: 290 · Activity: 3w
The app “SGKJ_3DRealisticModel” has been killed by the operating system because it is using too much memory.
Hello! I encountered a problem while developing a visionOS project. I added a model file of 294.8 MB to the project and tried to find and load it using FileManager.default.contentsOfDirectory(atPath: resourcesPath).filter { $0.hasSuffix(".usdz") }. However, when the app loaded the model, it was killed by the system for using too much memory. How can I solve this problem?
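A minimal sketch of that loading path, reusing the resourcesPath string from the post and RealityKit's async Entity initializer (the function name is illustrative). Loading one model at a time this way, rather than everything up front, keeps peak memory lower, though a model whose decompressed meshes and textures exceed the app's memory limit will still be a problem:

```swift
import Foundation
import RealityKit

func loadFirstUSDZModel(from resourcesPath: String) async throws -> Entity? {
    // Enumerate the bundled .usdz files, as in the original code.
    let fileNames = try FileManager.default
        .contentsOfDirectory(atPath: resourcesPath)
        .filter { $0.hasSuffix(".usdz") }

    guard let fileName = fileNames.first else { return nil }
    let url = URL(fileURLWithPath: resourcesPath).appendingPathComponent(fileName)

    // Async load; decompressed meshes and textures can be much larger than
    // the 294.8 MB file on disk, which is often what exceeds the memory limit.
    return try await Entity(contentsOf: url)
}
```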
Replies: 0 · Boosts: 0 · Views: 108 · Activity: 3w
Apple Vision Pro app stops working after a while
Hello! I have developed an application using Unity that runs on the Vision Pro. I built it correctly, installed it on the device, and it works. The spatial computing application continues to work for several days (pretty normal: I launch the app and it works; it doesn't use any external services). After several weeks, a month or so, I launch the same app again and it no longer works. The only way I can make it work again is to rebuild and reinstall it. What am I missing here? Why does an application built and installed a few weeks ago suddenly stop working on the Vision Pro?
Replies: 2 · Boosts: 0 · Views: 219 · Activity: 3w
Transparency dial in Immersive mode
In Progressive mode, you can turn the Digital Crown to reveal your surroundings by limiting or expanding the field of view of your immersive scene. I'm trying to create a different sort of behavior where your immersive scene remains in 360° mode, but adjusting a dial (it doesn't have to be the Crown; it could be an in-app dial or slider) adjusts the transparency of the scene. My users aren't quite satisfied with the native features that help ensure you aren't about to run into a wall or furniture, and they want a way to quickly adjust the transparency on the fly. Is that possible?
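One possible approach, as a sketch rather than a definitive answer: keep the scene fully immersive and drive a RealityKit OpacityComponent on the scene's root entity from an in-app slider, so the passthrough shows through as the scene fades. The view name, attachment id, and slider placement below are assumptions.

```swift
import SwiftUI
import RealityKit

struct FadableImmersiveView: View {
    @State private var sceneOpacity: Float = 1.0
    @State private var sceneRoot: Entity?

    var body: some View {
        RealityView { content, attachments in
            let root = Entity()
            // ... build or load the 360° scene under `root` here ...
            content.add(root)
            sceneRoot = root

            if let panel = attachments.entity(for: "opacityControl") {
                panel.position = [0, 1.2, -1] // floating slider in front of the user
                content.add(panel)
            }
        } update: { _, _ in
            // OpacityComponent fades the entity and all of its descendants.
            sceneRoot?.components.set(OpacityComponent(opacity: sceneOpacity))
        } attachments: {
            Attachment(id: "opacityControl") {
                Slider(value: $sceneOpacity, in: 0...1)
                    .frame(width: 300)
            }
        }
    }
}
```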
Replies: 0 · Boosts: 1 · Views: 172 · Activity: 2w
GroupSessionJournal attachment loading error on Vision Pro
Hi all, I'm currently working on a SharePlay feature where users pull data from a remote source and can share it in a volumetric window with others on the FaceTime call. However, I'm running into an issue where the group activity/session seems to throw an error on the recipient of the journal's attachment, with the description notSupported. As I understand it, we use GroupSessionJournal for larger pieces of data like images (as in the Drawing Together example) and, in my case, 3D models. The current flow goes as follows: the user launches the app and fetches a model from remote. The user can start a SharePlay instance, in which the system captures the volumetric window for users to join and see. At this point, only the original user can see the model. The user can press a button to share this model with the other participants using try await journal.add(modelData), where modelData is serialized Data. In the group session configuration, I already have a task listening with for await attachments in journal.attachments { for attachment in attachments { ... } }. This task attempts to load the data via let modelData = try await attachment.load(Data.self), and this is where the notSupported error is thrown. I expect the attachment.load(Data.self) call to deliver the model data properly, but instead I receive this error. I have also attempted to wrap the model data in an enclosing struct with a name and a data property and conform that struct to Transferable, but that continued to throw the notSupported error. Is there something I'm doing wrong, or is this simply a bug in GroupSessionJournal? Please let me know if more information is required for debugging and resolution. Thanks!
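For reference, a condensed sketch of the send/receive flow described above (the function names are illustrative; journal is the session's GroupSessionJournal, and the calls mirror the ones quoted in the post):

```swift
import Foundation
import GroupActivities

// Sender: serialize the model and add it to the journal as an attachment.
func sendModel(_ modelData: Data, to journal: GroupSessionJournal) async throws {
    try await journal.add(modelData)
}

// Receiver: observe the journal and load each attachment back as Data.
func receiveModels(from journal: GroupSessionJournal) async {
    for await attachments in journal.attachments {
        for attachment in attachments {
            do {
                // This is the call that reportedly throws `notSupported`.
                let modelData = try await attachment.load(Data.self)
                // ... deserialize and display the model ...
                _ = modelData
            } catch {
                print("Failed to load attachment: \(error)")
            }
        }
    }
}
```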
Replies: 0 · Boosts: 0 · Views: 236 · Activity: 2w
Receiving main camera stream
Hello, I received the entitlement for the Enterprise API this week. Although I added the license file and the entitlement to the project, I can't get any frames from cameraFrameUpdates. Here are the logs of the authorization and of cameraFrameUpdates: [cameraAccess: allowed] CameraFrameUpdates(stream: Swift.AsyncStream<ARKit.CameraFrame>(context: Swift.AsyncStream<ARKit.CameraFrame>._Context)). Could anyone point out what I'm doing wrong in the process?
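For comparison, a sketch of the main-camera flow as I understand it from the Enterprise API documentation; the format selection and error handling are kept minimal, and the details should be treated as assumptions to verify against your own setup:

```swift
import ARKit

func readMainCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Requires the main-camera-access entitlement and user authorization.
    try await session.run([provider])

    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        // Left sample of the main camera; process the pixel buffer here.
        if let sample = frame.sample(for: .left) {
            _ = sample.pixelBuffer
        }
    }
}
```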
Replies: 1 · Boosts: 0 · Views: 199 · Activity: 2w
Disable reverb effect in immersive spaces
I'm developing an app where a user can bring a video or content from a WKWebView into an immersive space using SwiftUI attachments on a RealityView. This works just fine, but I'm having some trouble configuring how the audio from the web content should sound in an immersive space. In windowed mode, content plays just fine and sounds very natural; the spatial audio effect with head tracking is pronounced and adds depth to content with multichannel or Dolby Atmos audio. When I move the same web view into an immersive space, however, the audio becomes excessively echoey, as if a large amount of reverb has been applied. The spatial audio effect is also decreased, and while still there, it is nowhere near as immersive. I've tried the following: setting all entities in my space to use channel audio, including the web view attachment: for entity in content.entities { entity.channelAudio = ChannelAudioComponent(); entity.ambientAudio = nil; entity.spatialAudio = nil }. I've also tried changing the AVAudioSessionSpatialExperience, and every soundstage size and anchoring strategy; large works the best, but it doesn't remove that reverb: let experience = AVAudioSessionSpatialExperience.headTracked(soundStageSize: .large, anchoringStrategy: .automatic); try? AVAudioSession.sharedInstance().setIntendedSpatialExperience(experience). I'm also aware of ReverbComponent in visionOS 2 (which I haven't updated to just yet), but ideally I need a way to configure this for visionOS 1 users too. Am I missing something? Surely there's a way for developers to stop the system from altering the audio and applying these effects? A few of my users have complained that the audio sounds considerably worse in my cinema immersive space compared to in a window.
Replies: 2 · Boosts: 0 · Views: 295 · Activity: 2w
3DoF Tracking on Vision Pro
Hello everyone! It seems that Vision Pro supports 6DoF tracking, but is it possible to switch to 3DoF tracking? The reason for my question is that I would like to use it while riding in a car, but 6DoF tracking does not seem to work well in that situation. I was wondering if switching to 3DoF tracking might solve the issue. Note: I am already using Travel Mode.
Replies: 0 · Boosts: 0 · Views: 188 · Activity: 2w
Spatial Video Capturing on iPhone 15 Pro
Hi all, I tried isSpatialVideoCaptureEnabled with AVCaptureMovieFileOutput, as mentioned in WWDC24 "Build compelling spatial photo and video experiences", and it works. But I have some issues and questions. In the code below, change.newValue is always nil, so the observation doesn't seem to work: let observation = videoDevice.observe(\.spatialCaptureDiscomfortReasons) { (device, change) in guard let newValue = change.newValue else { return } if newValue.contains(.subjectTooClose) { // Guide user to move back } if newValue.contains(.notEnoughLight) { // Guide user to find a brighter environment } }. Also, AVCaptureMovieFileOutput supports spatial video capturing. May I ask whether AVCaptureVideoDataOutput will also support spatial video capturing?
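A sketch of the same observation with explicit KVO options. Without .new in the options list, change.newValue is nil by default, which may be what is happening above (this is an assumption to verify):

```swift
import AVFoundation

// Keep a strong reference to the returned observation for as long as you need callbacks.
func observeDiscomfortReasons(on videoDevice: AVCaptureDevice) -> NSKeyValueObservation {
    videoDevice.observe(\.spatialCaptureDiscomfortReasons,
                        options: [.initial, .new]) { _, change in
        guard let reasons = change.newValue else { return }
        if reasons.contains(.subjectTooClose) {
            // Guide the user to move back.
        }
        if reasons.contains(.notEnoughLight) {
            // Guide the user to find a brighter environment.
        }
    }
}
```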
Replies: 0 · Boosts: 2 · Views: 190 · Activity: 2w
3D Object Capture not working on iPhone 12 Pro
The 3D Object Capture feature doesn't seem to work on my iPhone 12 Pro. The circle that is supposed to show up when you begin to move around the object doesn't appear, so object capture doesn't even begin. It just says 'more light..' or 'move closer', but this doesn't happen on my iPhone 14 Pro; it works perfectly fine on that, even with the same lighting. How can this be fixed?
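A quick check worth running on the iPhone 12 Pro, as a sketch using the iOS 17 ObjectCaptureSession API (the surrounding function is illustrative): if isSupported is false on that device, the guided capture UI will never start there.

```swift
import RealityKit

@MainActor
func startCaptureIfSupported() -> ObjectCaptureSession? {
    guard ObjectCaptureSession.isSupported else {
        print("Object Capture is not supported on this device.")
        return nil
    }
    let session = ObjectCaptureSession()
    // ... call session.start(imagesDirectory:configuration:) and present
    //     ObjectCaptureView(session: session) as usual ...
    return session
}
```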
Replies: 1 · Boosts: 0 · Views: 181 · Activity: 2w
How to set the world alignment to gravity and heading for RoomPlan?
In the WWDC23 video on the RoomPlan enhancements, it says that it is now possible to set a custom ARSession for the RoomCaptureSession. But how do you actually set the configuration for that custom ARSession? init() { let arConfig = ARWorldTrackingConfiguration(); arConfig.worldAlignment = .gravityAndHeading; arSession = ARSession(); roomCaptureView = RoomCaptureView(frame: CGRect(x: 0, y: 0, width: 42, height: 42), arSession: arSession); sessionConfig = RoomCaptureSession.Configuration(); roomCaptureView.captureSession.delegate = self; roomCaptureView.delegate = self } However, I keep getting an error that self is used in a property access before being initialised. What can I do to fix it?
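A sketch of one way to order that initializer, modeled loosely on the kind of view controller Apple's RoomPlan sample uses (the class name is illustrative, and running the custom configuration on the ARSession you pass in is an assumption about the intended usage). The key point is that every stored property is set, and super.init() is called, before self is referenced for the delegate assignments:

```swift
import ARKit
import RoomPlan
import UIKit

final class RoomScanViewController: UIViewController, RoomCaptureViewDelegate, RoomCaptureSessionDelegate {
    private let arSession: ARSession
    private let roomCaptureView: RoomCaptureView
    private let sessionConfig: RoomCaptureSession.Configuration

    init() {
        let arConfig = ARWorldTrackingConfiguration()
        arConfig.worldAlignment = .gravityAndHeading

        // Initialize all stored properties first.
        arSession = ARSession()
        roomCaptureView = RoomCaptureView(frame: .zero, arSession: arSession)
        sessionConfig = RoomCaptureSession.Configuration()

        super.init(nibName: nil, bundle: nil)

        // Only now is it safe to reference self.
        roomCaptureView.captureSession.delegate = self
        roomCaptureView.delegate = self

        // Assumption: the custom world alignment is applied by running the
        // configuration on the session that was handed to RoomCaptureView.
        arSession.run(arConfig)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // RoomCaptureViewDelegate
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool { true }
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {}

    // RoomCaptureSessionDelegate
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {}
}
```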
Replies: 0 · Boosts: 0 · Views: 163 · Activity: 2w
EnvironmentResource.generate(fromEquirectangular:) does not compile with Xcode 16.0 beta 2
Hi! It seems that Xcode 16 beta 2 thinks that EnvironmentResource.generate(fromEquirectangular:) is unavailable even when the minimum deployment target remains set to visionOS 1.0 (it is deprecated, but Xcode reports a build error). The only way I was able to keep this in place for visionOS 1.x while compiling with Xcode 16 was the following: var environment: EnvironmentResource? if #available(visionOS 2.0, *) { environment = try? await EnvironmentResource(equirectangular: skyBoxWithSun()) } else { fatalError("EnvironmentResource.generate(fromEquirectangular:) does not compile with Xcode 16.0 beta 2.") } #else let environment = try? await EnvironmentResource.generate(fromEquirectangular: skyBoxWithSun()) #endif This will build with both Xcode 15.4 and 16 beta 2, but it will obviously crash when built with Xcode 16 and run on visionOS 1.x. Do I have any better options? I would like to adopt some visionOS 2.0 features (e.g., replace my custom skybox with the new dynamic lighting) but prefer to maintain backward compatibility for now.
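For readability, here is that pattern written out as a sketch with an explicit compiler-version check, so Xcode 16 only ever compiles the availability-checked branch while Xcode 15.4 keeps using the deprecated API. The function name, the CGImage parameter, and the exact compiler version to test against are assumptions:

```swift
import CoreGraphics
import RealityKit

func loadSkyboxEnvironment(from image: CGImage) async -> EnvironmentResource? {
#if compiler(>=6.0) // building with Xcode 16
    if #available(visionOS 2.0, *) {
        return try? await EnvironmentResource(equirectangular: image)
    } else {
        // visionOS 1.x built with Xcode 16: no non-deprecated equivalent here.
        return nil
    }
#else // building with Xcode 15.4
    return try? await EnvironmentResource.generate(fromEquirectangular: image)
#endif
}
```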
Replies: 1 · Boosts: 0 · Views: 192 · Activity: 1w
Can't get OrbitAnimation() to work in my project
DESCRIPTION OF PROBLEM I have an Apple Vision Pro App Store app called Starship SE Corps. I'm trying to add an animation to my app so that the starship entity orbits the Earth entity. I'm trying to use OrbitAnimation as discussed in the WWDC23 session "Build Spatial Experiences with RealityKit" (https://developer.apple.com/wwdc23/10080). However, I can't get the animation to work. STEPS TO REPRODUCE I created a sample test app called "SampleOrbitAnimationApp" to focus on the code I'm having trouble with. When I build and run my sample test app, the app runs on both the visionOS 1.2 simulator and on my real Apple Vision Pro device running visionOS 1.2. However, my starship entity is static and does not animate/orbit around my Earth entity. I tried putting my OrbitAnimation code in the RealityView update: closure. Doing that, however, causes some property scope errors, because the entity I refer to in the OrbitAnimation code is the entity I create in the RealityView code block, so the update: closure can't see that entity property. Trying to make the entity reference more global at the top of the ImmersiveView (so the update: closure sees the entity property) causes other parameter issues in the .app file's call to the ImmersiveView and in the #Preview call to the ImmersiveView. Maybe that should be expected and I would need to work around it (but I couldn't find a sensible way to do so). If this is the right approach, I need help on how to resolve this across the project files. I did find some example code online where a developer put the OrbitAnimation code directly in the RealityView code block without having an update: or attachments: closure at all. I tried that approach but couldn't get it to work either. The test sample app targets the OrbitAnimation and ImmersiveView code I'm struggling with (i.e., I can't get the starship to move and orbit around the Earth). It uses the same production app package for the Starship and Earth entities, built in Reality Composer Pro. Those entities, included in my sample test app, work fine in my latest production App Store release, so I think they are fine. The issue is how to write the OrbitAnimation code for those entities. I realize new capabilities are coming in visionOS 2, but I would like to make OrbitAnimation work now in my visionOS 1.2 app.
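A sketch adapted from the pattern shown in the WWDC23 "Build spatial experiences with RealityKit" session, using the starship and Earth entity names described above (the function, and parenting the starship to the Earth so the orbit is centered on it, are assumptions). Calling something like this from the RealityView make closure, right after both entities are added to the content, also sidesteps the update: closure scoping issue mentioned in the post:

```swift
import RealityKit

func startOrbit(starship: Entity, around earth: Entity) {
    // Parent the orbiting entity to the body it should orbit.
    earth.addChild(starship, preservingWorldTransform: true)

    let orbit = OrbitAnimation(
        name: "Orbit",
        duration: 30,
        axis: [0, 1, 0],
        startTransform: starship.transform,
        bindTarget: .transform,
        repeatMode: .repeat
    )

    if let animation = try? AnimationResource.generate(with: orbit) {
        starship.playAnimation(animation)
    }
}
```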
Replies: 9 · Boosts: 0 · Views: 339 · Activity: 1w
App Environment SkyDome's UV values
I started a visionOS app using Apple's new "App Environment" template, and when I looked at the UV mapping for the half SkyDome, the bottom edge had a UV 'Y' value of 0.318. Naively, I had assumed the bottom edge of a half dome would have a UV 'Y' value of 0.5 (halfway up the texture map). Is this the standard UV mapping for half a SkyDome? It has caused some issues when I've applied some HDRIs.
Replies: 0 · Boosts: 0 · Views: 170 · Activity: 1w
Animation with TabView
In visionOS, the tab bar of a TabView sits outside the window by default. If my app switches from a page without a TabView to a page that needs a TabView, the tab bar suddenly appears on the left side of the screen without any animation. I would like it to animate when it appears (for example easeIn, or a move transition). I tried adding animation-related modifiers such as .animation to the Tab and to the View, but there is no animation on the tab bar. Only the view inside the tab animates, which is not what I want. What I want is for the tab bar outside the window to animate. What should I do?
Replies: 1 · Boosts: 0 · Views: 213 · Activity: 1w