I was starting to test the visionOS SDK on an existing project that has been running fine on iPad (iOS 17) with Xcode 15. It can be configured to run on the visionOS simulator on a MacBook with an M1 chip without any change in the Xcode project's Build Settings.
However, the Apple Vision Pro simulator doesn't appear when I run Xcode 15 on an Intel MacBook Pro unless I change the SUPPORTED_PLATFORMS key in the project's Build Settings to include visionOS.
Although I understand that a MacBook Pro with an M1/M2 chip would be the ideal platform to run the visionOS simulator, it would be much better if we could run it on iPadOS: the iPad has the same arm64 architecture and all the hardware needed, including camera, GPS, and LiDAR.
Even with an M1/M2 chip, the Mac is not a good simulator host:
It doesn’t have a dual facing camera (front and back)
It doesn’t have LiDAR
It doesn’t have GPS
It doesn’t have a 5G cellular radio
It’s not portable enough for developers to design use cases around spatial computing
Last but not least, it is not clear to me how to simulate ARKit with actual camera frames in the Vision Pro simulator, whereas I would expect this to work well on an iPad.
My suggestion is to provide us developers with a simulator that runs on iPadOS; that would increase developer adoption and improve the design and prototyping phase for apps targeting the actual Vision Pro device.
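For reference, the Build Settings change mentioned above amounts to adding the visionOS platform identifiers to the setting (`xros` for device, `xrsimulator` for simulator, as of Xcode 15; treat the exact list as a sketch for your project):

```
// Sketch of an .xcconfig entry: keep the iOS platforms and add the
// visionOS device and simulator platform identifiers.
SUPPORTED_PLATFORMS = iphoneos iphonesimulator xros xrsimulator
```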
iPad and iOS apps on visionOS
Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.
96 Posts
Hi There,
The same data shows a different line drawing each time. What am I missing here? Every time I open the chart, it is drawn differently.
Chart {
    ForEach(data) { series in
        ForEach(series.data) { entry in
            LineMark(
                x: .value("Day", entry.day),
                y: .value("Amount", entry.amount)
            )
            .foregroundStyle(by: .value("Doc", series.Doc))
            .symbol(by: .value("Doc", series.Doc))
            PointMark(
                x: .value("Day", entry.day),
                y: .value("Sales", entry.amount)
            )
            .foregroundStyle(.purple)
            .symbolSize(symbolSize)
        }
        .lineStyle(StrokeStyle(lineWidth: lineWidth))
    }
}
.chartLegend(position: .top)
.chartYScale(domain: .automatic(includesZero: false))
.chartXAxis {
    AxisMarks(values: .automatic(roundLowerBound: false, roundUpperBound: false)) { value in
        if value.index < value.count - 1 {
            AxisValueLabel()
        }
        AxisTick()
        AxisGridLine()
    }
}
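One common cause of a chart that redraws differently from the same data (an assumption here, since the model types aren't shown) is that `data` comes from an unordered source such as a `Dictionary` or `Set`, so `ForEach` hands the marks to `Chart` in a different order on each view rebuild. Sorting once before plotting makes the drawing deterministic; `Series` and `Entry` below are hypothetical stand-ins for the poster's model:

```swift
import Foundation

// Hypothetical stand-ins for the poster's model types.
struct Entry: Identifiable {
    let id = UUID()
    let day: Int
    let amount: Double
}
struct Series: Identifiable {
    let id = UUID()
    let Doc: String
    var data: [Entry]
}

// Sort series and points once so Chart connects them in a stable order.
func stabilized(_ input: [Series]) -> [Series] {
    input
        .sorted { $0.Doc < $1.Doc }
        .map { series in
            var series = series
            series.data.sort { $0.day < $1.day }
            return series
        }
}
```

Feeding `stabilized(data)` to the `Chart` (or sorting at the point where the data is loaded) should make every redraw identical.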
Hi,
I have an existing iPhone/iPad app that I'm considering bringing to Vision Pro. It's an informational app that shares information with the watch app.
Which would be better:
Add a target for the Vision Pro, or
Make a separate app so I can use all of the Vision Pro's features?
What are YOU doing to support the Vision Pro?
Thanks,
Dan
How do you force quit apps on the Vision Pro simulator?
I've read online that you can double-press the Digital Crown on a real Vision Pro device, but there is no such button on the simulator. So far I've tried:
Pressing the home button twice (⇧⌘H)
Pressing the Siri button twice (⌥⇧⌘H)
Neither of them works.
I'm on Xcode 15.0 beta 5 (15A5209g) & visionOS 1.0 beta 2 (21N5207e)
I downloaded the latest Xcode Version 15.0 beta 5 (15A5209g) and tried to run our existing iOS app in the Vision Pro simulator, but it errors out at run time. The Vision Pro version is 1.0, and our project's minimum supported version is iOS 10.0, so the project currently cannot run. Have any other developers hit this problem? I need your help.
Apple Vision Pro runs visionOS 1.0, which is lower than 小商品城’s minimum deployment target of 10.0. Change your project’s minimum deployment target or upgrade Apple Vision Pro’s version of visionOS.
My team is planning to apply for the Developer Lab in Tokyo for the September sessions, and I would very much appreciate some advice!
How long does it take for Apple to give approval/acceptance after applying for a session?
How much time am I allowed for each session?
It says the Developer Lab sessions will be held from 10 AM to 5 PM; does that mean I get to test my app on the Vision Pro during the entire period?
How does the session usually take place?
I got the impression that someone from Apple will be assigned to guide and assist me during the session; is this true? Also, how many teams are allowed to visit in each session?
If anyone knows anything, please let me know, because I am clueless.
Will Apple Vision Pro support medical imaging diagnostic software such as Horos or OsiriX? I am really curious whether radiologists will be able to view, manipulate, and interpret exams with this headset with ease, use the built-in microphone for voice recognition/dictation while using their hands to manipulate the imaging, and simultaneously view the report in their field of view. This could unlock some major potential for interpreting at least CT, ultrasound, and MRI exams. I don't think the resolution will be high enough to interpret diagnostic x-rays, and definitely not mammograms, due to MQSA regulations and physicist inspections requiring more detail and information. However, I want to be at the forefront of bringing Vision Pro headsets into the medical imaging space with real utilization in clinical practices. It may also be beneficial for patients who are curious to see their medical imaging, or even as headsets to use while undergoing outpatient biopsy procedures for ******/etc. to help put them at ease during the procedures. This could really improve patient satisfaction, and I think we are only scratching the surface of a world of possibilities in healthcare with devices like this. Upon utilization, I would like to create a presentation series and share information with my Radiology colleagues at national/international meetings.
I want to prevent screenshots of confidential information, but it is not working on the iOS 17 beta.
For earlier versions I am using isSecureTextEntry, but it is unstable on the recent beta. I can still detect the screenshot via UIApplication.userDidTakeScreenshotNotification, but not prevent it.
Is there any way to achieve this?
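For context, the `isSecureTextEntry` approach people use here is a community workaround, not a documented API: content hosted inside a secure text field's layer is blanked out in screenshots and recordings. A sketch of the usual pattern follows; it leans on UITextField's private internal layer hierarchy, which is exactly why it becomes unstable across OS betas, and Apple offers no supported way to block screenshots (only detection via `UIApplication.userDidTakeScreenshotNotification`).

```swift
import UIKit

extension UIView {
    // Community workaround (undocumented, may break in any OS update):
    // re-parent this view's layer under a secure UITextField's internal
    // layer so the system treats it as secure content in screenshots.
    func hideFromScreenshots() {
        let field = UITextField()
        field.isSecureTextEntry = true
        field.isUserInteractionEnabled = false
        field.frame = bounds
        addSubview(field)
        layer.superlayer?.addSublayer(field.layer)
        field.layer.sublayers?.last?.addSublayer(layer)
    }
}
```

Because this depends on private view internals, it needs re-testing on every beta; the only stable alternative is redacting sensitive content yourself when the app resigns active.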
I am trying to implement a feature to play video in a full immersive space but am unable to achieve the desired result. When I run the app in a full immersive space, it shows the video in the center of the screen/window. See screenshot below.
Can you please guide me on how to play video in a full immersive space, just like the image below?
Regards,
Yasir Khan
There is a comment at https://developer.apple.com/forums/thread/732457?login=true that one user had success getting the Apple Vision Pro Simulator to run on his iPad.
I saw this and tried it but failed.
How is this possible?
I have an app that is now compatible to run on visionOS in iPad mode, and I want to show an immersive 360° video playing as the background of that app.
This app was originally built for the iOS platform and is now compatible with visionOS,
and I want to change the background view on the visionOS platform without creating a separate visionOS target.
Regards,
Yasir Khan
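One caveat: an app running in "Designed for iPad" compatibility mode is built against the iOS SDK, so it cannot call visionOS-only APIs. To vary the background per platform within a single target, the existing target needs visionOS added as a real run destination; then conditional compilation is enough. A sketch (the view names `VideoBackgroundView` and `PlayerControls` are made up for illustration):

```swift
import SwiftUI

// Sketch: one code base, one target, a different background per platform.
struct RootView: View {
    var body: some View {
        ZStack {
            #if os(visionOS)
            // visionOS build: place the 360° video player behind the UI.
            VideoBackgroundView()   // hypothetical 360° player view
            #else
            // iOS build keeps the original flat background.
            Color.black.ignoresSafeArea()
            #endif
            PlayerControls()        // hypothetical shared UI
        }
    }
}
```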
I read that RoomPlan does not work on Apple Vision Pro: even if you port an iOS app that uses it as a window app on the headset, it will not run?
From what I understand, RoomPlan uses LiDAR, which AVP will have, so the hardware is there. What's the reason for not making this available on AVP, and is there any actual plan to have it in a future OS release post-launch?
I can't run my visionOS project with Xcode 15 RC: the Apple Vision Pro target has disappeared, I can't install it from the location settings tab, and if I try to create a new project the visionOS app template doesn't appear.
Any solution?
PS: My visionOS project ran successfully on betas 6 and 8.
Hello! Could someone please help me with this question: in Xcode I can see there are two possible destinations for Apple Vision, "Apple Vision" vs. "Apple Vision (Designed for iPad)". While testing my app, I noticed that it can crash on the Apple Vision destination (visionOS SDK) but run fine on the Designed for iPad version, and the two look different when launched.
Does this mean that if I want to be sure my app works correctly on a real device, I should test it on the Apple Vision destination with the visionOS SDK?
iOS 16 and above
0   CoreFoundation        ___exceptionPreprocess + 164
1   libobjc.A.dylib       _objc_exception_throw + 60
2   Foundation            -[NSMutableDictionary(NSMutableDictionary) initWithContentsOfFile:]
3   UIKitCore             ___UIKIT_DID_NOT_RECEIVE_A_REMOTE_CACONTEXT_FROM_COREANIMATION_INDICATING_A_POSSIBLE_BACKBOARDD_CRASH + 484
4   UIKitCore             ___UIKIT_IS_REQUESTING_A_CACONTEXT_FROM_COREANIMATION + 64
5   UIKitCore             +[_UIContextBinder createContextForBindable:withSubstrate:] + 400
6   UIKitCore             -[_UIContextBinder _contextForBindable:] + 112
7   UIKitCore             -[_UIContextBinder updateBindableOrderWithTest:force:] + 304
8   UIKitCore             -[_UIContextBinder createContextsWithTest:creationAction:] + 80
9   UIKitCore             -[UIWindowScene _prepareForResume] + 148
10  UIKitCore             -[UIScene _emitSceneSettingsUpdateResponseForCompletion:afterSceneUpdateWork:] + 756
11  UIKitCore             -[UIScene scene:didUpdateWithDiff:transitionContext:completion:] + 244
12  UIKitCore             -[UIApplicationSceneClientAgent scene:handleEvent:withCompletion:] + 336
13  FrontBoardServices    -[FBSScene updater:didUpdateSettings:withDiff:transitionContext:completion:] + 420
14  FrontBoardServices    ___94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke_2 + 152
15  FrontBoardServices    -[FBSWorkspace _calloutQueue_executeCalloutFromSource:withBlock:] + 168
16  FrontBoardServices    ___94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke + 344
17  libdispatch.dylib     __dispatch_client_callout + 20
18  libdispatch.dylib     __dispatch_block_invoke_direct + 264
19  FrontBoardServices    ___FBSSERIALQUEUE_IS_CALLING_OUT_TO_A_BLOCK__ + 52
NSInternalInconsistencyException
Failed to create remote render context
Lifecycle
09-16 12:49:47.944 UIApplication WillResignActive
09-16 12:49:48.484 UIApplication DidEnterBackground
09-16 12:53:38.998 UIApplication WillEnterForeground
It seems the app becomes inactive and then crashes before it reaches willEnterForeground; I have no idea what is happening.
Hi, I have an iOS app with a minimum iOS target of 15.0 that I'm trying to make compatible with Vision Pro. I have a single target and use more than 10 third-party frameworks (Facebook, Google Sign-In, and a few others) that don't support visionOS yet.
My question is: how can I strip out the third-party libraries that aren't compatible with visionOS but are included/imported for iOS? Currently, stripping out libraries and adding conditional checks in Swift files based on target would make a mess of the whole app.
Should I create a separate target/project for visionOS and develop the app without those unsupported libraries? Will Apple allow me to upload separate apps for iOS and visionOS? That approach would involve a lot of code duplication and be very difficult to scale.
Any thoughts on the best approach to handle this use case, please.
Regards,
Yasir
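One pattern that avoids scattering conditional checks through every file is to funnel each unsupported SDK behind a single facade, so only one file per SDK needs conditional compilation while the rest of the app calls the facade unconditionally. A sketch, where `ThirdPartyAnalytics` is a hypothetical stand-in for any framework that ships no visionOS slice:

```swift
// Sketch: isolate an unsupported SDK behind one facade. Only this file
// needs #if checks; the rest of the app calls Analytics.logEvent(_:).
#if canImport(ThirdPartyAnalytics)
import ThirdPartyAnalytics   // hypothetical iOS-only framework
#endif

enum Analytics {
    static func logEvent(_ name: String) {
        #if canImport(ThirdPartyAnalytics)
        ThirdPartyAnalytics.log(name)   // hypothetical SDK call on iOS
        #else
        // visionOS build: no-op, or a lightweight replacement.
        print("analytics disabled on this platform: \(name)")
        #endif
    }
}
```

The binaries themselves still have to be excluded from the visionOS build (via platform filters in Swift Package Manager or per-platform linking settings), but with a facade per SDK the source-level churn stays contained.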
I'm trying to create a feature in my visionOS app where I show a RealityView and toggle different entities on a button click (show them on one click, hide them on the next).
Is this possible in visionOS? If so, how can I do it?
All I have done so far is instantiate my scene, which contains a car 3D model, red tyres, and blue tyres.
On a button click I'm trying to show the blue tyres instead of the red ones.
Is this possible?
Thank you,
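Toggling visibility is supported: every RealityKit `Entity` has an `isEnabled` flag, so you can flip entities on and off without loading separate scenes. A sketch, assuming the scene is named "CarScene" and the tyre entities are named "RedTyres" and "BlueTyres" (those names are guesses about the poster's scene hierarchy):

```swift
import SwiftUI
import RealityKit

struct CarView: View {
    @State private var showBlue = false

    var body: some View {
        RealityView { content in
            // Load the scene once; "CarScene" is an assumed asset name.
            if let car = try? await Entity(named: "CarScene") {
                content.add(car)
            }
        } update: { content in
            // Re-runs when `showBlue` changes: find the tyre entities
            // by name and flip their visibility.
            if let car = content.entities.first {
                car.findEntity(named: "RedTyres")?.isEnabled = !showBlue
                car.findEntity(named: "BlueTyres")?.isEnabled = showBlue
            }
        }
        Toggle("Blue tyres", isOn: $showBlue)
            .toggleStyle(.button)
    }
}
```

Disabled entities are excluded from rendering and simulation, so this is cheaper than removing and re-adding them.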
Hello,
I am developing a visionOS app that uses various data providers such as Image Tracking, Plane Detection, and Scene Reconstruction, but these are not supported in the visionOS simulator. What is the workaround for this issue?
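There is no way to feed those providers real data in the simulator, but each visionOS data provider exposes a static `isSupported` flag, so the app can branch to placeholder behavior instead of failing at session start. A sketch:

```swift
import ARKit

// Sketch: only run the providers the current environment supports;
// in the simulator the unsupported ones are simply skipped, and the
// app can substitute canned/placeholder data for development.
func makeSupportedProviders() -> [any DataProvider] {
    var providers: [any DataProvider] = []
    if PlaneDetectionProvider.isSupported {
        providers.append(PlaneDetectionProvider(alignments: [.horizontal, .vertical]))
    }
    if SceneReconstructionProvider.isSupported {
        providers.append(SceneReconstructionProvider())
    }
    // ImageTrackingProvider.isSupported can be checked the same way;
    // its reference images would come from your asset catalog.
    return providers
}
```

On-device testing (or a Developer Lab session) remains the only way to exercise the real tracking paths.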
Hey everybody!
Does anyone know if there's a UIKit API that's equivalent to SwiftUI's View/offset(z:)?
https://developer.apple.com/documentation/swiftui/view/offset(z:)
I'd like to translate a UIView forward in space a little bit on visionOS.
❤️
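I'm not aware of a documented UIKit one-to-one equivalent of SwiftUI's `offset(z:)`, but two Core Animation properties get close: a `CATransform3D` translation along z on the view's layer, or the layer's `zPosition`. A sketch of both (treat the choice as an assumption about what the effect should be, not a confirmed UIKit counterpart):

```swift
import UIKit

// Sketch: translate a UIView forward in space on visionOS.
func lift(_ view: UIView, by z: CGFloat) {
    // Option 1: a real 3D translation of the layer along the z axis.
    view.layer.transform = CATransform3DMakeTranslation(0, 0, z)

    // Option 2: adjust the layer's z position instead.
    // view.layer.zPosition = z
}
```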