Discuss Spatial Computing on Apple Platforms.


Presenting immersive content in a UIKit app
I have a UIKit app and would like to provide a spatial experience when it runs on visionOS. I added visionOS support, but I'm not sure how to present an immersive view. All the tutorials are in SwiftUI, but my app is in UIKit. This is an example from a SwiftUI project, but how do I declare this ImmersiveView in UIKit?

    struct VirtualApp: App {
        var body: some Scene {
            WindowGroup {
                ContentView()
            }.windowStyle(.volumetric)

            ImmersiveSpace(id: "ImmersiveSpace") {
                ImmersiveView()
            }
        }
    }

And in UIKit, how do I make the call to open the ImmersiveView?
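A minimal sketch of one commonly suggested bridge, assuming you can move the app's entry point to a SwiftUI App: declare both scenes in SwiftUI, wrap the existing UIKit root in a UIViewControllerRepresentable, and let the UIKit side request the space through a notification. ExistingRootViewController, UIKitRootView, BridgeView, and the notification name are placeholders, not API:

    import SwiftUI
    import UIKit

    // Placeholder for the app's existing UIKit root view controller.
    final class ExistingRootViewController: UIViewController {}

    struct UIKitRootView: UIViewControllerRepresentable {
        func makeUIViewController(context: Context) -> ExistingRootViewController {
            ExistingRootViewController()
        }
        func updateUIViewController(_ uiViewController: ExistingRootViewController,
                                    context: Context) {}
    }

    extension Notification.Name {
        // Posted by UIKit code when it wants the immersive space opened.
        static let openImmersive = Notification.Name("openImmersive")
    }

    struct BridgeView: View {
        @Environment(\.openImmersiveSpace) private var openImmersiveSpace

        var body: some View {
            UIKitRootView()
                .onReceive(NotificationCenter.default.publisher(for: .openImmersive)) { _ in
                    Task { await openImmersiveSpace(id: "ImmersiveSpace") }
                }
        }
    }

    @main
    struct VirtualApp: App {
        var body: some Scene {
            WindowGroup { BridgeView() }
            ImmersiveSpace(id: "ImmersiveSpace") { ImmersiveView() }
        }
    }

From anywhere in the UIKit code, posting the notification asks the SwiftUI layer to open the space:

    NotificationCenter.default.post(name: .openImmersive, object: nil)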
5 replies · 1 boost · 1.9k views · Jul ’23
How to display stereo images in Apple Vision Pro?
Hi community, I have a pair of stereo images, one for each eye. How should I render them on visionOS? I know that for 3D videos, AVPlayerViewController can display them in fullscreen mode, but I couldn't find any docs on 3D stereo images. I guess my question can be put more generally: is there any method to render different content for each eye? This could also help someone who has sight in only one eye.
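One direction that may work (a sketch, with assumed names throughout): RealityKit's shader graph on visionOS includes a Camera Index Switch node, which outputs a different input per eye. If a Reality Composer Pro material wires that node to two texture parameters, it can be loaded as a ShaderGraphMaterial and applied to a plane. "Immersive", "/Root/StereoMaterial", "LeftImage"/"RightImage", and the texture names below are all placeholders for your own assets:

    import RealityKit
    import RealityKitContent

    // Sketch: assumes a Reality Composer Pro scene "Immersive" containing a
    // material "/Root/StereoMaterial" whose shader graph uses a Camera Index
    // Switch node to pick between the "LeftImage" and "RightImage" parameters.
    func makeStereoPlane() async throws -> ModelEntity {
        var material = try await ShaderGraphMaterial(named: "/Root/StereoMaterial",
                                                     from: "Immersive",
                                                     in: realityKitContentBundle)
        try material.setParameter(name: "LeftImage",
                                  value: .textureResource(try TextureResource.load(named: "left_eye")))
        try material.setParameter(name: "RightImage",
                                  value: .textureResource(try TextureResource.load(named: "right_eye")))
        return ModelEntity(mesh: .generatePlane(width: 1, height: 1),
                           materials: [material])
    }

Since each eye sees only its own texture, the same mechanism answers the general per-eye-content question.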
8 replies · 0 boosts · 4.1k views · Jul ’23
Force quit apps on Vision Pro simulator
How do you force quit apps on the Vision Pro simulator? I've read online that you can double-press the digital crown on a real Vision Pro device, but there is no such button in the simulator. So far I've tried:

    Pressing the home button twice (⇧⌘H)
    Pressing the Siri button twice (⌥⇧⌘H)

Neither of them works. I'm on Xcode 15.0 beta 5 (15A5209g) and visionOS 1.0 beta 2 (21N5207e).
6 replies · 2 boosts · 2.8k views · Jul ’23
Using Vision Pro in multiple rooms
Suppose I want to use the Vision Pro device in multiple rooms in my home. I wore the device when I entered my home, checked some notifications on it, and closed the apps. With the device still on my head, I moved to my bedroom. Now I want to open some other application without removing the headset and putting it on again. Is this possible?
1 reply · 0 boosts · 738 views · Jan ’24
visionOS: Continuously Rotate a 3D Object
I want to open a view in my app that contains a Model3D view, and I want that object to rotate continuously around the Y axis while it is visible. Is it possible to animate the rotation of a Model3D view? I've tried this code, but the object just sits there and doesn't rotate.

    import RealityKit
    import RealityKitContent
    import SwiftUI

    struct QuantumComputerArea: View {
        @State var degreesRotating = 0.0

        var body: some View {
            VStack {
                Model3D(named: "quantumComputer") { phase in
                    switch phase {
                    case .empty:
                        ProgressView()
                    case let .failure(error):
                        Text(error.localizedDescription)
                    case let .success(model):
                        model
                            .resizable()
                            .scaledToFit()
                            .offset(x: -75, y: 0)
                            .rotation3DEffect(.degrees(degreesRotating),
                                              axis: (x: 0, y: 1, z: 0))
                    @unknown default:
                        fatalError()
                    } // phase
                } // Model3D
                .onAppear {
                    withAnimation(Animation.linear(duration: 10).repeatForever(autoreverses: false)) {
                        degreesRotating = 360
                    }
                }
            } // VStack
        } // body
    } // View

I'm probably missing something simple, but if anyone has any suggestions (including using a RealityView) I'd be grateful for the advice.
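One workaround that sidesteps the @State animation entirely (a sketch, not the official answer): derive the angle from the clock with TimelineView(.animation), so the rotation no longer depends on onAppear racing the asynchronous model load:

    import RealityKit
    import SwiftUI

    // Sketch: recompute the angle each frame from the current time instead of
    // relying on a repeatForever animation started in onAppear.
    struct SpinningQuantumComputer: View {
        let period: TimeInterval = 10 // seconds per full revolution

        var body: some View {
            TimelineView(.animation) { context in
                let t = context.date.timeIntervalSinceReferenceDate
                let degrees = t.truncatingRemainder(dividingBy: period) / period * 360
                Model3D(named: "quantumComputer") { phase in
                    switch phase {
                    case .empty:
                        ProgressView()
                    case let .failure(error):
                        Text(error.localizedDescription)
                    case let .success(model):
                        model
                            .resizable()
                            .scaledToFit()
                            .rotation3DEffect(.degrees(degrees),
                                              axis: (x: 0, y: 1, z: 0))
                    @unknown default:
                        EmptyView()
                    }
                }
            }
        }
    }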
2 replies · 0 boosts · 1k views · Feb ’24
Non-convex collision?
Hi! I think this should be a pretty normal use of ARKit / RealityKit. I have a static mesh for my environment that I want to have static collision properties. My options for making this interact with dynamic bodies are:

    ShapeResource.generateConvex(...) -- which overshoots my shape dramatically.
    Entity.generateCollisionShapes(...) -- which also overshoots.

I notice additional APIs around ShapeResource: ShapeResource.generateStaticMesh(positions:faceIndices:) seems to be exactly what I need. So far, I haven't been able to invoke it successfully to set my collision shape. Questions: Isn't this a completely normal thing for developers to want to do? Why is there no out-of-the-box support for it in RealityKit/ARKit? Everywhere I've read says that to support this in my app, I need to parse the .obj of my terrain manually, find the triangulated faces, and pipe them into this function. That feels like a very standardized process -- and given that RealityKit already forces me to use .usdz, why shouldn't this be part of the SDK? Regardless, I triangulated my terrain mesh and have been working on parsing code to get the positions and faceIndices for this set up (as an extension on Entity). Is this the right approach? Am I missing something more obvious? Thanks, Justin
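For what it's worth, one way to avoid parsing the .obj at all (a sketch, assuming the terrain is already loaded as a ModelEntity and has fewer than 65,536 vertices, since faceIndices are UInt16) is to read the positions and triangle indices back out of the entity's own MeshResource:

    import RealityKit

    // Sketch: build a static-mesh collision shape from the entity's mesh data.
    func applyStaticCollision(to terrain: ModelEntity) async throws {
        guard let mesh = terrain.model?.mesh else { return }

        var positions: [SIMD3<Float>] = []
        var faceIndices: [UInt16] = []

        for model in mesh.contents.models {
            for part in model.parts {
                // Each part's indices are local to that part, hence the offset.
                let base = UInt16(positions.count)
                positions.append(contentsOf: part.positions.elements)
                if let indices = part.triangleIndices?.elements {
                    faceIndices.append(contentsOf: indices.map { UInt16($0) + base })
                }
            }
        }

        let shape = try await ShapeResource.generateStaticMesh(positions: positions,
                                                               faceIndices: faceIndices)
        terrain.components.set(CollisionComponent(shapes: [shape], mode: .default))
        terrain.components.set(PhysicsBodyComponent(mode: .static))
    }

Whether generateStaticMesh accepts a given mesh still depends on it being fully triangulated.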
6 replies · 0 boosts · 1.1k views · Mar ’24
Apple Vision Pro - Apple Store iPad Demo App
Hi! I'm currently showing the Apple Vision Pro to my clients. Sharing the screen is challenging with Guest Mode on the AVP. You can share the screen, but you need to tether the AVP and MBP to an iPhone, and then you can AirPlay. A lot of big corporate Wi-Fi networks may not allow AirPlay, so tethering is a better option. Does anyone know if Apple is making their AVP demo app available to developers? This would be super helpful for showing off the AVP's capabilities. Thanks! JB
0 replies · 0 boosts · 601 views · Mar ’24
Meet Object Capture for iOS
Hi,
We are using the Scanning objects using Object Capture sample app provided by Apple. It was working fine, but it suddenly started crashing while scanning an object with bounding box settings.
We're getting the device log, but it shows different reasons for the crash.
Device: iPad Pro / iPhone 15 Pro
iOS: 17.4
Attaching the device log for the crash; waiting for your response.
-------------------------------------
Translated Report (Full Report Below)
-------------------------------------
Incident Identifier: 0797ADAF-D653-4C92-8AA0-300AA167002B
CrashReporter Key: 48724a3b30ef15e069f513afff7d1aa2e935a520
Hardware Model: iPhone16,1
Process: GuidedCapture [918]
Path: /private/var/containers/Bundle/Application/ACCE6C58-98F0-4DD7-AA6E-732190E0FD30/GuidedCapture.app/GuidedCapture
Identifier: com.example.apple-samplecode.GuidedCaptureH28X75MLUY
Version: 1.0 (1)
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Coalition: com.example.apple-samplecode.GuidedCaptureH28X75MLUY [743]
Date/Time: 2024-03-26 18:23:07.8269 +0530
Launch Time: 2024-03-26 18:20:02.5964 +0530
OS Version: iPhone OS 17.4.1 (21E236)
Release Type: User
Baseband Version: 1.55.04
Report Version: 104
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000001, 0x000000022e6577a4
Termination Reason: SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [918]
Triggered by Thread: 33
Thread 33 name: Dispatch queue: com.apple.coreoc.queues.serial.session
Thread 33 Crashed:
0 CoreOC 0x22e6577a4 0x22e657400 + 932
1 CoreOC 0x22e657588 0x22e657401 + 391
2 CoreOC 0x22e697628 0x22e697221 + 1031
3 CoreOC 0x22e684864 0x22e684431 + 1075
4 CoreOC 0x22e6831ec 0x22e6828d5 + 2327
5 CoreOC 0x22e6c1fb4 0x22e6c1f5d + 87
6 CoreOC 0x22e5ef128 0x22e5ef105 + 35
7 libdispatch.dylib 0x19291113c _dispatch_call_block_and_release + 31
8 libdispatch.dylib 0x192912dd4 _dispatch_client_callout + 19
9 libdispatch.dylib 0x19291a400 _dispatch_lane_serial_drain + 747
10 libdispatch.dylib 0x19291af30 _dispatch_lane_invoke + 379
11 libdispatch.dylib 0x192925cb4 _dispatch_root_queue_drain_deferred_wlh + 287
12 libdispatch.dylib 0x192925528 _dispatch_workloop_worker_thread + 403
13 libsystem_pthread.dylib 0x1e69f8f20 _pthread_wqthread + 287
14 libsystem_pthread.dylib 0x1e69f8fc0 start_wqthread + 7
0 replies · 0 boosts · 738 views · Mar ’24
Searching for APIs compatible with iOS and visionOS
Is there a method for finding APIs that are compatible with both iOS and visionOS (e.g. hoverStyle)? I'm encountering difficulties developing for visionOS, although I can successfully build for 'Apple Vision (Designed for iPad)'. Are there any methods for discovering APIs that support both platforms? I'm looking to enhance my application and would appreciate any guidance on where to find such APIs. Additionally, I'm interested in changing the background to a glass style. However, it seems that this feature may not be supported by the available APIs, particularly those designed for visionOS. Any suggestions or insights would be greatly appreciated.
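For discovery, each symbol's page in the Apple Developer documentation lists the platforms and minimum versions it supports, and Xcode flags unavailable APIs once visionOS is a run destination. In code, platform differences can be isolated with conditional compilation. A small sketch (my example, not from the post) that uses visionOS's glassBackgroundEffect() where available and falls back to a translucent material on iOS:

    import SwiftUI

    struct AdaptiveCard: View {
        var body: some View {
            #if os(visionOS)
            // Real glass, available only on visionOS.
            Text("Hello")
                .padding()
                .glassBackgroundEffect()
            #else
            // Closest stand-in on iOS: a translucent material background.
            Text("Hello")
                .padding()
                .background(.ultraThinMaterial, in: RoundedRectangle(cornerRadius: 12))
            #endif
        }
    }

For runtime rather than compile-time checks, if #available(visionOS 1.0, *) works the same way it does on other platforms.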
1 reply · 0 boosts · 547 views · Apr ’24
Integrating visionOS support into an existing SwiftUI iOS app that uses CocoaPods
I am attempting to integrate visionOS support into my existing iOS app, which uses SwiftUI and CocoaPods. However, upon adding visionOS as a supported platform and attempting to run the app, I encounter two errors:

    "'jot/jot.h' file not found" at "/Users/xxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxxxx/xxxxxxxxx-Bridging-Header.h:17:9"
    "Failed to emit precompiled header" at "/Users/xxxxxx/Library/Developer/Xcode/DerivedData/xxxxxxxxx-bnhvaxypgfhmvqgklzjdnxxbrdhu/Build/Intermediates.noindex/PrecompiledHeaders/xxxxxxxxx-Bridging-Header-swift_6TTOG1OAZB5F-clang_21TRHDW14EDOZ.pch" for bridging header "/Users/xxxxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxx/xxxxxxxx-Bridging-Header.h"

I'm seeking assistance with resolving these errors. Below is my Podfile configuration:

    source 'https://github.com/CocoaPods/Specs.git'
    platform :ios, '15.0'

    target 'xxxxxxxxxx' do
      use_frameworks!

      pod 'RealmSwift'
      pod 'JGProgressHUD'
      pod 'BadgeLabel'
      pod 'jot'
      pod 'MaterialComponents/Chips'
      pod 'GoogleMaps'
      pod 'Firebase/Crashlytics'
      pod 'Firebase/Analytics' # Firebase pod for Google Analytics
      # Add pods for any other desired Firebase products
      # https://firebase.google.com/docs/ios/setup#available-pods
    end

    post_install do |installer|
      installer.pods_project.targets.each do |target|
        target.build_configurations.each do |config|
          config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.0'
        end
      end
    end

Any assistance in resolving these errors would be greatly appreciated.
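A possible direction, not a confirmed fix: the first error suggests that the jot pod (and likely others here) has no visionOS build, so its header disappears when compiling for that platform. One sketch is to compile the import out on visionOS and stub the dependent feature behind the same check; TARGET_OS_VISION comes from TargetConditionals.h, and the header path matches the error above:

    // xxxxxxxx-Bridging-Header.h (sketch)
    #include <TargetConditionals.h>

    // Only import jot where the pod actually builds.
    #if !TARGET_OS_VISION && __has_include(<jot/jot.h>)
    #import <jot/jot.h>
    #endif

Any Swift code that uses the pod's types would then need a matching #if !os(visionOS) guard. CocoaPods' visionOS support has also lagged behind Xcode's, so verifying that each pod declares a visionOS (xros) platform, or building visionOS from a separate target that omits the unsupported pods, may be necessary as well.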
0 replies · 0 boosts · 626 views · Apr ’24