Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

visionOS simulator on an iPadOS device
I started testing the visionOS SDK on an existing project that runs fine on iPad (iOS 17) with Xcode 15. It can be configured to run in the visionOS simulator on an M1 MacBook without any change to the project's Build Settings in Xcode. However, the Apple Vision Pro simulator doesn't appear when I run Xcode 15 on an Intel MacBook Pro unless I add visionOS to the SUPPORTED_PLATFORMS key in the project's Build Settings. Although I understand that a MacBook Pro with an M1/M2 chip is the intended platform for the visionOS simulator, it would be much better if we could run it on iPadOS: the iPad has the same arm64 architecture and all the relevant hardware, including cameras, GPS, and LiDAR. The Mac is not a good simulator host, even with an M1/M2 chip:

- It doesn't have dual-facing cameras (front and back)
- It doesn't have LiDAR
- It doesn't have GPS
- It doesn't have a 5G cellular radio
- It's not portable enough for developers to design use cases around spatial computing

Last but not least, it's unclear to me how to simulate ARKit with actual camera frames in the Vision Pro simulator, whereas I'd expect this could be simulated well on iPadOS. My suggestion is to provide developers with a simulator that runs on iPadOS; that would increase developer adoption and improve the design and prototyping phase for apps targeting the actual Vision Pro device.
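
For reference, a minimal sketch of the Build Settings override described above, written as an xcconfig file. The platform identifiers here ("xros" for device, "xrsimulator" for the simulator) are assumptions based on the visionOS SDK naming and should be checked against your installed SDK:

```
// Hedged sketch: opt an existing iOS target into the visionOS simulator.
// "xros" / "xrsimulator" are assumed to be the visionOS platform identifiers.
SUPPORTED_PLATFORMS = iphoneos iphonesimulator xros xrsimulator
```
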
Replies: 4 · Boosts: 2 · Views: 2.3k · Jun ’23

Object Capture API
Now that we have the Vision Pro, I really want to start using Apple's Object Capture API to transform real objects into 3D assets. I watched the latest Object Capture video from WWDC23 and noticed they were using a "sample app". Does Apple provide this sample app to visionOS developers, or do we have to build our own iOS app? Thanks and cheers!
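
Independent of the sample app question, the reconstruction side is available directly through RealityKit's PhotogrammetrySession (macOS 12+/iOS 17+ on supported hardware). A minimal sketch; both URLs are placeholders and error handling is reduced to prints:

```swift
import Foundation
import RealityKit

// Minimal sketch: turn a folder of captured images into a USDZ model.
func reconstruct(imagesAt input: URL, outputTo output: URL) async throws {
    let session = try PhotogrammetrySession(input: input)
    try session.process(requests: [.modelFile(url: output, detail: .reduced)])
    for try await result in session.outputs {
        switch result {
        case .processingComplete:
            print("Model written to \(output.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break // progress and other intermediate outputs
        }
    }
}
```
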
Replies: 3 · Boosts: 0 · Views: 1.2k · Jun ’23

RealityView: 3D objects behind physical objects
I have been playing with RealityKit and ARKit. One thing I can't figure out is whether it's possible to place an object, say on the floor behind a couch, and then not see it when viewing that area from the other side of the couch. If that's confusing, I apologize. Basically, I want to "hide" objects in a closet or behind other physical objects. Are we just not there yet with this stuff, or is there a particular way to do it that I'm missing? It just seems odd that when I place an object, I then see it "on top of" the couch from the other side. Thanks! Brandon
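
On LiDAR-equipped iOS devices, this is what scene-understanding occlusion is for. A minimal sketch with ARView (the visionOS simulator is a different story, as the next post notes):

```swift
import ARKit
import RealityKit

// Sketch: let reconstructed real-world geometry occlude virtual content,
// so an entity placed behind a couch is hidden from the other side.
let arView = ARView(frame: .zero)
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh // requires a LiDAR device
}
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.session.run(config)
```
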
Replies: 2 · Boosts: 0 · Views: 697 · Jun ’23

Scene understanding missing from visionOS simulator?
SceneReconstructionProvider.isSupported and PlaneDetectionProvider.isSupported both return false when running in the simulator (Xcode 15 beta 2), and there is no mention of this in the release notes. It seems this makes any AR app that depends on scene understanding impossible to run in the simulator. For example, the code described in this article cannot run there: https://developer.apple.com/documentation/visionos/incorporating-surroundings-in-an-immersive-experience Am I missing something, or is this really the current state of the simulator? Does this mean that if we want to build mixed-immersion apps, we have to wait for access to Vision Pro hardware?
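
In the meantime, a minimal sketch of guarding on provider support at runtime, so the same build can still launch in the simulator with mesh-based features disabled:

```swift
import ARKit

// Sketch: only start scene reconstruction where it is supported,
// and degrade gracefully in the simulator.
func startSceneUnderstanding(with session: ARKitSession) async throws {
    guard SceneReconstructionProvider.isSupported else {
        print("Scene reconstruction unsupported here; skipping mesh features.")
        return
    }
    let provider = SceneReconstructionProvider()
    try await session.run([provider])
    // Consume provider.anchorUpdates elsewhere to build occlusion/physics meshes.
}
```
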
Replies: 11 · Boosts: 11 · Views: 2.9k · Jun ’23

How to reduce the size of a scanned ARReferenceObject file?
While building an ARKit object-detection application, I found that a scanned object (ARReferenceObject) ends up at 5–20 MB if it is to be detected smoothly. Is there a way to reduce this size? Why it matters: I have more than 200 objects to detect, and at 5 MB per object, almost 1 GB would be occupied by my application alone, which seems inappropriate.
Replies: 2 · Boosts: 0 · Views: 422 · Jul ’23

Bug: visionOS and the storyboard keys in the plist?
I have an older app that is a mix of Swift and Objective-C, with two groups of storyboards, for iPhone and iPad, using storyboard references. There seems to be a bug: when using the simulator, it loads the storyboard specified by the key "Main storyboard file base name" instead of the one under "Main storyboard file base name (iPad)". When I changed the first key to point at the iPad storyboard, it worked as expected in the visionOS simulator. The raw keys are UIMainStoryboardFile and UIMainStoryboardFile~ipad. What should I do?
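
For reference, the two Info.plist entries involved look like this; the storyboard names are placeholders, and per the report the visionOS simulator appears to read only the first key, ignoring the ~ipad variant:

```
<!-- Storyboard names below are placeholders for your own storyboards. -->
<key>UIMainStoryboardFile</key>
<string>Main_iPhone</string>
<key>UIMainStoryboardFile~ipad</key>
<string>Main_iPad</string>
```
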
Replies: 0 · Boosts: 0 · Views: 571 · Jul ’23

Moving a rigged character with armature bones
Is there a way to move a rigged character with its armature bones in ARKit/RealityKit? When I try to move the USDZ robot provided in https://developer.apple.com/documentation/arkit/arkit_in_ios/content_anchors/capturing_body_motion_in_3d using joint transforms, I get unexpected results. I see the documentation on character rigging, but is movement through armature bones only available through third-party software, or can it be done in RealityKit/ARKit/RealityView? https://developer.apple.com/documentation/arkit/arkit_in_ios/content_anchors/rigging_a_model_for_motion_capture
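
One path that exists entirely in RealityKit is the jointNames/jointTransforms pair on ModelEntity. A minimal sketch; the joint-name suffix is a placeholder for whatever the asset's skeleton actually contains:

```swift
import RealityKit
import simd

// Sketch: rotate one joint of a rigged ModelEntity in place.
// Joint names are full paths like "root/hips/spine", so match by suffix.
func rotateJoint(of model: ModelEntity, suffix: String, angle: Float) {
    guard let index = model.jointNames.firstIndex(where: { $0.hasSuffix(suffix) }) else {
        print("Joint not found; available joints: \(model.jointNames)")
        return
    }
    var transforms = model.jointTransforms
    transforms[index].rotation *= simd_quatf(angle: angle, axis: [0, 0, 1])
    model.jointTransforms = transforms // reassign to apply the change
}
```
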
Replies: 0 · Boosts: 0 · Views: 739 · Jul ’23

How to display stereo images in Apple Vision Pro?
Hi community, I have a pair of stereo images, one for each eye. How should I render them on visionOS? I know that for 3D videos, AVPlayerViewController can display them in fullscreen mode, but I couldn't find any docs on 3D stereo images. I guess my question can be put more generally: is there any method to render different content for each eye? This could also be helpful to someone who has sight in only one eye.
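
One community approach is a ShaderGraph material authored in Reality Composer Pro whose graph uses a Camera Index Switch node to select a left- or right-eye texture. A hedged sketch of the loading side only; the material path, scene file, and the realityKitContentBundle from the visionOS app template are all assumptions to adapt to your project:

```swift
import RealityKit
import RealityKitContent

// Sketch: load an RCP material that switches textures per eye via a
// Camera Index Switch node, and put it on a plane.
func makeStereoPlane() async throws -> ModelEntity {
    let material = try await ShaderGraphMaterial(
        named: "/Root/StereoImageMaterial",   // placeholder material path
        from: "Immersive.usda",               // placeholder RCP scene file
        in: realityKitContentBundle
    )
    return ModelEntity(
        mesh: .generatePlane(width: 1.6, height: 0.9),
        materials: [material]
    )
}
```
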
Replies: 7 · Boosts: 0 · Views: 2.5k · Jul ’23

How can I change a material, like diffuse, on a 3D model (.usdz or .obj)?
Hi all. I am new to Swift and AR. I'm working on an AR project and ran into a problem: I can't change the material on my models. With geometry such as a sphere or a cube, everything is simple. What am I doing wrong? My simple code:

```swift
@IBOutlet var sceneView: ARSCNView!
var modelNode: SCNNode!

override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true

    let scene = SCNScene(named: "art.scnassets/jacket.usdz")!
    modelNode = scene.rootNode.childNode(withName: "jacket", recursively: true)

    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "art.scnassets/58.png")
    modelNode.childNodes[0].geometry?.materials = [material]

    sceneView.scene = scene
}
```
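
One likely culprit, offered as a guess: USDZ files often nest their geometry several levels deep, so childNodes[0] may not be the node that actually carries the geometry. A minimal sketch of a more robust pass over the subtree:

```swift
// Sketch: apply the material to every geometry-bearing node in the
// subtree instead of assuming childNodes[0] holds the geometry.
modelNode.enumerateChildNodes { child, _ in
    child.geometry?.materials = [material]
}
```
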
Replies: 0 · Boosts: 0 · Views: 626 · Jul ’23