Discuss Spatial Computing on Apple Platforms.


Size of AVPlayerViewController in immersive space
I am trying to make the immersive version of AVPlayerViewController bigger, but I can't find any information on how to go about it. It seems that if I want to change the immersive video viewing experience, the only option is to use a VideoMaterial applied to a ModelEntity built with .generatePlane. Is there a way to change the video size in immersive mode for AVPlayerViewController?
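
For illustration, a minimal sketch of the VideoMaterial route the poster mentions, assuming an immersive RealityView to add the entity to (the URL and plane dimensions are placeholders):

```swift
import AVFoundation
import RealityKit

// Placeholder URL; use your own video asset.
let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)
let videoMaterial = VideoMaterial(avPlayer: player)

// A 16:9 "screen", sized in meters; make it as large as you like.
let screen = ModelEntity(
    mesh: .generatePlane(width: 4.0, height: 2.25),
    materials: [videoMaterial]
)
// Add `screen` to your RealityView content, position it, then:
player.play()
```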
Replies: 0 · Boosts: 0 · Views: 128 · Activity: 1w

Capturing Perspective Camera View in RealityKit and Integrating SceneKit with RealityKit
Hello, I am currently developing an application using RealityKit and I've encountered a couple of challenges that I need assistance with:

1. Capturing a PerspectiveCamera's view: I am trying to render or capture the view from a PerspectiveCamera in RealityKit/RealityView. My goal is to save this view of a 3D model as an image or video using a virtual camera. However, I'm unsure how to access or redirect the rendered output from a PerspectiveCamera within RealityKit. Is there an existing API or a recommended approach to achieve this?

2. Integrating SceneKit with RealityKit: I've also experimented with using SCNNode and SCNCamera to capture the camera's view, but I'm wondering if SceneKit is directly compatible within a RealityKit scene, specifically within a RealityView. I would like to leverage the advanced features of RealityKit for managing 3D models. Is saving the virtual view of a camera supported, and if so, what are the best practices?

Any guidance, sample code, or references to documentation would be greatly appreciated. Thank you in advance for your help!
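
As far as I know there is no public API to redirect a PerspectiveCamera's rendered output inside a RealityView. On platforms that provide ARView (iOS/macOS), a snapshot-based sketch might look like this; untested, and the virtual-camera setup assumes a non-AR scene:

```swift
import RealityKit

// Untested sketch for platforms with ARView. On iOS, construct the view
// with ARView(frame:cameraMode:automaticallyConfigureSession:) and
// cameraMode .nonAR so the virtual camera is used; macOS is always non-AR.
let arView = ARView(frame: CGRect(x: 0, y: 0, width: 1920, height: 1080))

// Place a virtual camera in the scene.
let camera = PerspectiveCamera()
camera.position = [0, 1, 3]
let cameraAnchor = AnchorEntity(world: .zero)
cameraAnchor.addChild(camera)
arView.scene.addAnchor(cameraAnchor)

// Capture the current render (UIImage on iOS, NSImage on macOS).
arView.snapshot(saveToHDR: false) { image in
    // Save `image`, or call repeatedly to assemble video frames.
}
```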
Replies: 3 · Boosts: 0 · Views: 225 · Activity: 2w

AVCam modified for spatial video capturing in WWDC24
I just followed the video and added the code, but when I switch to spatial video capturing, the videoPreviewLayer shows black, and the console logs:

```
<<<< FigCaptureSessionRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSessionRemote.m:405) - (err=0)
<<<< FigCaptureSessionRemote >>>> captureSessionRemote_getObjectID signalled err=-16405 (kFigCaptureSessionError_ServerConnectionDied) (Server connection was lost) at FigCaptureSessionRemote.m:405
<<<< FigCaptureSessionRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSessionRemote.m:421) - (err=-16405)
<<<< FigCaptureSessionRemote >>>> Fig assert: "msg" at bail (FigCaptureSessionRemote.m:744) - (err=0)
```

Did I miss something?
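
Not a confirmed fix, but a sketch of the preconditions worth checking (property names are from iOS 17.2+, which the WWDC24 session builds on); enabling spatial capture on an unsupported device or format is one way to end up with a dead capture session like the log above:

```swift
import AVFoundation

// Sketch, assuming a session already configured per the WWDC24 AVCam
// walkthrough (e.g. with the back dual wide camera as input).
func enableSpatialVideo(on output: AVCaptureMovieFileOutput,
                        device: AVCaptureDevice) {
    guard device.activeFormat.isSpatialVideoCaptureSupported else {
        print("Active format does not support spatial video capture")
        return
    }
    guard output.isSpatialVideoCaptureSupported else {
        print("Movie output does not support spatial video capture")
        return
    }
    // Set this inside a beginConfiguration()/commitConfiguration() pair
    // on the session before recording starts.
    output.isSpatialVideoCaptureEnabled = true
}
```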
Replies: 4 · Boosts: 0 · Views: 248 · Activity: 2w

Scene phase issue with visionOS 2.0
Hello, I am new to SwiftUI and visionOS, but I developed an app with a window and an ImmersiveSpace. I want the immersive space to be dismissed when the window/app is closed. I have the code below using ScenePhase; it was working fine in visionOS 1.1, but it stopped working with visionOS 2.0. Any idea what I am doing wrong? Is there another way to handle the dismissal of the ImmersiveSpace when my main window is closed?

```swift
@main
struct MyApp: App {
    @State private var viewModel = ViewModel()

    var body: some Scene {
        @Environment(\.scenePhase) var scenePhase
        @Environment(\.dismissImmersiveSpace) var dismissImmersiveSpace

        WindowGroup {
            SideBarView()
                .environment(viewModel)
                .frame(width: 1150, height: 700)
                .onChange(of: scenePhase) { oldValue, newValue in
                    if newValue == .inactive || newValue == .background {
                        Task {
                            await dismissImmersiveSpace()
                            viewModel.immersiveSpaceIsShown = false
                        }
                    }
                }
        }
        .windowResizability(.contentSize)

        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView(area: viewModel.currentModel)
                .environment(viewModel)
        }
    }
}
```
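
One workaround worth trying, sketched under two assumptions: that property wrappers declared as locals inside the Scene `body` (as above) don't receive SwiftUI updates reliably, and that with two scenes the app-level scenePhase is an aggregate of both. Moving the observation into the window's own root view sidesteps both; `SideBarContainer` is a hypothetical wrapper around the poster's SideBarView:

```swift
import SwiftUI

// Hypothetical wrapper; use it as the WindowGroup's root view:
// WindowGroup { SideBarContainer(viewModel: viewModel)
//     .frame(width: 1150, height: 700) }
struct SideBarContainer: View {
    @Environment(\.scenePhase) private var scenePhase
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    var viewModel: ViewModel

    var body: some View {
        SideBarView()
            .environment(viewModel)
            .onChange(of: scenePhase) { _, newValue in
                if newValue == .inactive || newValue == .background {
                    Task {
                        await dismissImmersiveSpace()
                        viewModel.immersiveSpaceIsShown = false
                    }
                }
            }
    }
}
```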
Replies: 3 · Boosts: 0 · Views: 148 · Activity: 2w

Error Loading USDZ File in Vision Pro Application
Hi everyone, I'm working on a Vision Pro application and encountering an issue while trying to load a USDZ file. Here are the details:

File path: /Users/siddharthpatel/Library/Developer/CoreSimulator/Devices/31F10013-50B6-4CEF-9388-9094087FAEBF/data/Containers/Data/Application/EB260F0A-A84F-4E95-876D-08199D2A4998/Documents/hive1.usdz

Code:

```swift
do {
    modelEntityForCollider = try await ModelEntity(contentsOf: fileURL!)
} catch {
    print("Error loading model: \(error)")
}
```

Error:

Thread 1: Fatal error: Failed to import entity from "/Users/siddharthpatel/Library/Developer/CoreSimulator/Devices/31F10013-50B6-4CEF-9388-9094087FAEBF/data/Containers/Data/ ... ve1.usdz"

I've verified that the file path is correct and the USDZ file exists at the specified location. What could be causing this error and how can I resolve it? Thanks in advance for your help! Siddharth
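
Worth noting: simulator container paths change between runs, so an absolute path saved earlier can point at a container that no longer exists. A hedged sketch that rebuilds the URL in the current container and checks readability before importing (the async Entity initializer stands in for the poster's ModelEntity call):

```swift
import Foundation
import RealityKit

// Hypothetical helper: resolve hive1.usdz in the *current* Documents
// container rather than trusting a stored absolute path.
func loadHive() async -> Entity? {
    let fileURL = URL.documentsDirectory.appending(path: "hive1.usdz")
    guard FileManager.default.isReadableFile(atPath: fileURL.path) else {
        print("No readable file at \(fileURL.path)")
        return nil
    }
    do {
        return try await Entity(contentsOf: fileURL)
    } catch {
        print("Error loading model: \(error)")
        return nil
    }
}
```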
Replies: 2 · Boosts: 0 · Views: 142 · Activity: 2w

RealityKit for macOS example
I would like to code some RealityViews to run on my Mac first (and then incorporate them into a visionOS project) so that my code/test loop is faster, but I have not been able to find a simple example that supports the Mac. Is it possible to have volumes on a Mac? Is there support for using a game controller to move around the RealityView, like in the visionOS simulator?
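
A minimal sketch, assuming macOS 15+, where RealityView became available outside visionOS. Volumetric windows remain a visionOS-only scene type; on the Mac the same RealityView renders in an ordinary window through a virtual camera (game-controller navigation isn't built in, though the GameController framework could be used to drive the camera entity):

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Use a virtual (non-AR) camera on the Mac.
            content.camera = .virtual
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            content.add(sphere)
        }
    }
}
```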
Replies: 1 · Boosts: 0 · Views: 166 · Activity: 2w

SharePlay Button
I followed the WWDC video to learn SharePlay. I understood the initial creation of seats, but I couldn't follow some of the later content very well, so I hope you can give me some example code. The situation is as follows: I have already defined the seats.

```swift
struct TeamSelectionTemplate: SpatialTemplate {
    let elements: [any SpatialTemplateElement] = [
        .seat(position: .app.offsetBy(x: 0, z: 4)),
        .seat(position: .app.offsetBy(x: 1, z: 4)),
        .seat(position: .app.offsetBy(x: -1, z: 4)),
        .seat(position: .app.offsetBy(x: 2, z: 4)),
        .seat(position: .app.offsetBy(x: -2, z: 4)),
    ]
}
```

I would like a SharePlay button that, when pressed, assigns all users in the FaceTime call to the seats defined in TeamSelectionTemplate. Thank you very much.
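
A hedged sketch of such a button using GroupActivities; `TeamSelectionActivity` is a hypothetical activity type, and the template is applied through the system coordinator's configuration when a session arrives (visionOS 2 APIs):

```swift
import GroupActivities
import SwiftUI

// Hypothetical activity type; adjust the metadata to your app.
struct TeamSelectionActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Team Selection"
        meta.type = .generic
        return meta
    }
}

struct SharePlayButton: View {
    var body: some View {
        Button("Start SharePlay") {
            Task {
                // Offers the activity to the current FaceTime call.
                _ = try? await TeamSelectionActivity().activate()
            }
        }
        .task {
            // When a session arrives, apply the seating template, then join.
            for await session in TeamSelectionActivity.sessions() {
                if let coordinator = await session.systemCoordinator {
                    var config = SystemCoordinator.Configuration()
                    config.spatialTemplatePreference = .custom(TeamSelectionTemplate())
                    coordinator.configuration = config
                }
                session.join()
            }
        }
    }
}
```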
Replies: 2 · Boosts: 0 · Views: 239 · Activity: 2w

TabletopKit game between players at the same location
Does the current version of TabletopKit support two or more game players at the same physical location? In that case, the players would not want to see a FaceTime Persona across the table; instead, they should see the physical player. Any other remote players would still see the Personas of those players, since they are not at that location. There are a couple of issues in this scenario (shared position of the board, players' locations around the table, etc.), but they should be solvable. Thank you!
Replies: 0 · Boosts: 1 · Views: 120 · Activity: 2w

API for turning regular photos into spatial photos?
With quite some excitement, I read about visionOS 2's new feature that automatically turns regular 2D photos into spatial photos using machine learning. It's briefly mentioned in this WWDC video: https://developer.apple.com/wwdc24/10166 My question: can developers use this feature via an API, so we can turn any image into a spatial image, even if it is not in the device photo library? We would like to download an image from our server, convert it on the Vision Pro on the fly, and display it as a spatial photo.
Replies: 3 · Boosts: 1 · Views: 202 · Activity: 2w

How to read back, from a spatial image encoded as HEIC, which image at which index is left or right?
In the example https://developer.apple.com/documentation/imageio/writing-spatial-photos, we see that each image encoded into the photo includes the following information:

```swift
kCGImagePropertyGroups: [
    kCGImagePropertyGroupIndex: 0,
    kCGImagePropertyGroupType: kCGImagePropertyGroupTypeStereoPair,
    (isLeft ? kCGImagePropertyGroupImageIsLeftImage : kCGImagePropertyGroupImageIsRightImage): true,
    kCGImagePropertyGroupImageDisparityAdjustment: encodedDisparityAdjustment
],
```

This identifies which image is left and which is right, plus the group type (stereo pair). Now, how do you read those back? I tried a straightforward read with CGImageSourceCopyPropertiesAtIndex, and that did not work; I only get back "No property groups found."

```swift
func tryToReadThose() {
    guard let imageData = try? Data(contentsOf: outputImageURL),
          let source = CGImageSourceCreateWithData(imageData as NSData, nil) else {
        print("cannot read")
        return
    }
    for i in 0..<CGImageSourceGetCount(source) {
        guard let imageProperties = CGImageSourceCopyPropertiesAtIndex(source, i, nil) as? [String: Any] else {
            print("cannot read options")
            continue
        }
        if let propertyGroups = imageProperties[String(kCGImagePropertyGroups)] as? [Any] {
            // Process the property groups as needed
            print(propertyGroups)
        } else {
            print("No property groups found.")
        }
    }
}
```

I assume CGImageSourceCopyPropertiesAtIndex may expect something as its third parameter, but in the "Specifying the Read Options" section of https://developer.apple.com/documentation/imageio/cgimagesource I don't see anything related to that.
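
An untested guess rather than a confirmed answer: the stereo-pair groups may be surfaced at the image-source (container) level rather than per image index, so checking CGImageSourceCopyProperties seems worth a try:

```swift
// Inside tryToReadThose(), after creating `source`:
if let containerProperties = CGImageSourceCopyProperties(source, nil) as? [String: Any],
   let propertyGroups = containerProperties[String(kCGImagePropertyGroups)] as? [Any] {
    print(propertyGroups)
} else {
    print("No groups at the container level either.")
}
```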
Replies: 1 · Boosts: 0 · Views: 191 · Activity: 2w

Drag gesture in immersive spaces with RealityKit
I've been trying to get the drag gesture up and running so I can move my 3D model around in my immersive space, but for some reason I am not able to move it. The model shows up in my visionOS 1.0 simulator, but I can't seem to get it to move around. Would love some help with this, and some resources too. Here's a snippet of my RealityView code:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct GearRealityView: View {
    static var modelEntity = Entity()

    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "LandingGear", in: realityKitContentBundle) {
                GearRealityView.modelEntity = model
                content.add(model)
            }
        }
        .gesture(
            DragGesture()
                .targetedToEntity(GearRealityView.modelEntity)
                .onChanged { value in
                    GearRealityView.modelEntity.position = value.convert(value.location3D, from: .local, to: GearRealityView.modelEntity.parent!)
                }
        )
    }
}
```
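
The usual culprit with entity-targeted gestures: they only resolve against entities that carry both an InputTargetComponent and a CollisionComponent. A sketch of the load closure with both added (assuming the USD scene doesn't already include them):

```swift
RealityView { content in
    if let model = try? await Entity(named: "LandingGear", in: realityKitContentBundle) {
        // Entity-targeted gestures need both of these to hit-test.
        model.components.set(InputTargetComponent())
        model.generateCollisionShapes(recursive: true)
        GearRealityView.modelEntity = model
        content.add(model)
    }
}
```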
Replies: 2 · Boosts: 0 · Views: 203 · Activity: 2w

Best practices for live-streaming MV-HEVC content?
I was wondering if anyone had guidance on how to "livestream" MV-HEVC content. More specifically, I have a left- and right-eye view for stereoscopic content (for example, the views might come from a stereoscopic video being passed through an AVPlayer). I know, based on sample code, that I can convert the stereoscopic video into an MV-HEVC file using AVAssetWriter. However, how would I take the stereoscopic video and encode it, in realtime, to a stream that could then leverage the HLS tools to deliver to clients? Is AVFoundation capable of this directly? Or is there an API within VideoToolbox that can help with this?
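
AVAssetWriter is file-oriented, so realtime delivery likely means dropping down to VideoToolbox. A hedged sketch of multi-image HEVC encoding (macOS 14.4+/iOS 17.4+ API); packaging the encoded samples into HLS segments is a separate step that this sketch leaves to a callback:

```swift
import CoreMedia
import VideoToolbox

// Create an MV-HEVC compression session (error handling elided).
var session: VTCompressionSession?
VTCompressionSessionCreate(
    allocator: nil, width: 1920, height: 1080,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: nil, imageBufferAttributes: nil,
    compressedDataAllocator: nil, outputCallback: nil, refcon: nil,
    compressionSessionOut: &session
)

if let session {
    // Two layers/views: 0 = left eye, 1 = right eye.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MVHEVCVideoLayerIDs, value: [0, 1] as CFArray)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MVHEVCViewIDs, value: [0, 1] as CFArray)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_MVHEVCLeftAndRightViewIDs, value: [0, 1] as CFArray)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
}

// Per frame: submit the left/right pixel buffers as tagged buffers.
// `left`, `right`, and `pts` are assumed to come from your capture path
// or an AVPlayer video output.
func encodeStereoFrame(_ left: CVPixelBuffer, _ right: CVPixelBuffer,
                       pts: CMTime, session: VTCompressionSession) {
    let taggedBuffers: [CMTaggedBuffer] = [
        .init(tags: [.videoLayerID(0), .stereoView(.leftEye)], pixelBuffer: left),
        .init(tags: [.videoLayerID(1), .stereoView(.rightEye)], pixelBuffer: right)
    ]
    VTCompressionSessionEncodeMultiImageFrame(
        session, taggedBuffers: taggedBuffers,
        presentationTimeStamp: pts, duration: .invalid,
        frameProperties: nil, infoFlagsOut: nil
    ) { _, _, sampleBuffer in
        // Hand each encoded CMSampleBuffer to your HLS packager here.
    }
}
```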
Replies: 0 · Boosts: 1 · Views: 160 · Activity: 2w

Dev documentation search is not accurate/complete
Posting here as I did not see a section for the Dev Documentation portal. Using the search box in the documentation portal, I searched for "frustum", hoping to find any APIs that gave me control over frustum culling: https://developer.apple.com/search/?q=frustum&type=Documentation The search came up empty for hits in RealityKit. Hours later I found the boundsMargin API, which explains how it affects frustum culling. I went back and tried the search again to verify that the documentation search results were incomplete. site:developer.apple.com/documentation/realitykit frustum on Google worked fine. Fixing this can save everyone time and stress.
Replies: 2 · Boosts: 0 · Views: 160 · Activity: 2w