iPad and iOS apps on visionOS


Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.

Posts under iPad and iOS apps on visionOS tag

96 Posts

[VisionOS] Entities spawned at runtime won't respond to my gesture
I'm creating an immersive experience with RealityView (think of it as a Fruit Ninja-like experience). Say I have some randomly generated fruits spawned by certain criteria inside a System.update function, and I want to interact with these generated fruits using whatever hand gesture. It simply doesn't work: the gesture's onChanged closure isn't fired as I expected. I put both an InputTargetComponent and a CollisionComponent on them to make them detectable in an immersive view. It works fine if I set these fruits up in the scene with Reality Composer Pro before the app runs.

Here is what I did. First I load the fruit template:

```swift
let tempScene = try await Entity(named: "fruitPrefab.usda", in: realityKitContentBundle)
fruitTemplate = tempScene.findEntity(named: "fruitPrefab")
```

Then I clone it during the System.update(context) function; parent is an invisible object placed at .zero in my loaded immersive scene:

```swift
let fruitClone = fruitTemplate!.clone(recursive: true)
fruitClone.position = pos
fruitClone.scale = scale
parent.addChild(fruitClone)
```

I attached my gesture to the RealityView:

```swift
.gesture(
    DragGesture(minimumDistance: 0.0)
        .targetedToAnyEntity()
        .onChanged { value in
            print("dragging")
        }
        .onEnded { tapEnd in
            print("dragging ends")
        }
)
```

I wondered whether a runtime-generated entity simply isn't tracked by RealityView, but since I added it as a child of a placeholder entity that is in the scene, it should be fine... right? Or do I just need to put a new AnchorEntity there? Thanks for any advice in advance; I've been trying to figure this out all day.
Replies: 2 · Boosts: 1 · Views: 740 · Activity: Jan ’24
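A minimal sketch of one thing worth checking for the question above, assuming (as the post hints) that the components were authored on the scene root in Reality Composer Pro rather than on the "fruitPrefab" child that findEntity(named:) returns: re-apply collision and input-target components to each clone so SwiftUI's hit-testing can find it. spawnFruit and its parameters are hypothetical names.

```swift
import RealityKit

// Hypothetical helper: make a freshly cloned entity hittable by SwiftUI gestures.
// If the template child found via findEntity(named:) never carried the
// InputTargetComponent/CollisionComponent (they may live on the scene root),
// none of its clones will receive gestures either.
func spawnFruit(from template: Entity, at position: SIMD3<Float>, in parent: Entity) -> Entity {
    let clone = template.clone(recursive: true)
    clone.position = position
    // Rebuild collision shapes from the clone's own mesh hierarchy...
    clone.generateCollisionShapes(recursive: true)
    // ...and mark the whole subtree as a gesture target.
    clone.components.set(InputTargetComponent())
    parent.addChild(clone)
    return clone
}
```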
Combine Master-Detail View and Tab View on iPad
Hello, I want to create a UI with 4 main tabs, each representing an individual view/screen (just like the App Store app). For some of the screens, I want to show the content using a master-detail view, just like the iPad Messages app, where the message list sits in the master view and the conversation in the detail view. The TabView must remain visible so users can navigate through all 4 views/screens. To show both the TabView and the master-detail view on one screen, I see 2 options:

1. Show the TabView inside the master view section. In my opinion, this is fundamentally bad practice from a UX perspective. Because the TabView is a shortcut to all 4 screens and users rely on it to navigate between them, we must not confine it to any other view; the TabView must sit at the top-most level of the application's hierarchy.

2. Keep the TabView at the top-most level, shown full-width as in the App Store app, and when the user taps any tab, the view that appears contains the master-detail view. My concern here is that I'm bending layout conventions and may end up with worse UX.

I am looking for the right way to implement such functionality with good UI/UX. Thank you.
Replies: 0 · Boosts: 0 · Views: 467 · Activity: Oct ’23
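For what it's worth, a minimal SwiftUI sketch of option 2 from the post above, assuming SwiftUI rather than UIKit, with placeholder tab and conversation names: each tab owns its own NavigationSplitView, so the tab bar stays at the root of the hierarchy while a tab presents the master-detail pair.

```swift
import SwiftUI

struct RootView: View {
    var body: some View {
        TabView {
            MessagesSplitView()
                .tabItem { Label("Messages", systemImage: "message") }
            Text("Today")
                .tabItem { Label("Today", systemImage: "doc.text.image") }
            // ... two more tabs
        }
    }
}

// One tab's content: a sidebar list (master) driving a detail pane.
struct MessagesSplitView: View {
    @State private var selection: String?
    private let conversations = ["Alice", "Bob", "Carol"] // placeholder data

    var body: some View {
        NavigationSplitView {
            List(conversations, id: \.self, selection: $selection) { name in
                Text(name)
            }
            .navigationTitle("Messages")
        } detail: {
            if let selection {
                Text("Chat with \(selection)")
            } else {
                Text("Select a conversation")
            }
        }
    }
}
```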
How do you load animated USDZs via Reality Composer Pro versus just doing it in code with raw files?
Creating an Entity and then switching between different animations (walk, run, jump, etc.) is pretty straightforward when you have individual USDZ files in your Xcode project and simply create an array of AnimationResource. However, I'm trying to do it via the new Reality Composer Pro app, because the docs state it's much more efficient than individual files, but I'm having a heck of a time figuring out exactly how to do it.

Do you have one scene per USDZ file (and does that erase any advantage over just loading individual files)? One scene with multiple entities? Something else altogether? If I try one scene with multiple entities in it, then whenever I try to change animation I get "Cannot find a BindPoint for any bind path" logged in the console, and the animation never actually plays. This is with the same files that animate perfectly when creating an array of AnimationResource manually from the individual/raw USDZ files.

Anyone have any experience doing this?
Replies: 0 · Boosts: 1 · Views: 748 · Activity: Oct ’23
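For comparison, a minimal sketch of the raw-file approach the post above describes, with hypothetical file names ("walk", "run", "jump"): each USDZ contributes one clip, and a clip only plays on an entity whose skeleton it can bind to, which is the mismatch the "Cannot find a BindPoint for any bind path" log points at.

```swift
import RealityKit

// Sketch of the individual-USDZ approach: load each clip file, harvest its
// first AnimationResource, and play clips on one base model whose skeleton
// matches. File names here are hypothetical.
func loadClips(_ files: [String]) async throws -> [String: AnimationResource] {
    var clips: [String: AnimationResource] = [:]
    for file in files {                            // e.g. ["walk", "run", "jump"]
        let source = try await Entity(named: file) // loads walk.usdz, run.usdz, ...
        if let clip = source.availableAnimations.first {
            clips[file] = clip
        }
    }
    return clips
}

// Usage: play a clip on the shared character entity. If the entity's skeleton
// paths don't match the clip's bind paths, RealityKit logs
// "Cannot find a BindPoint for any bind path" and nothing animates.
// character.playAnimation(clips["run"]!.repeat(), transitionDuration: 0.25)
```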
Downgrade iPadOS
I have several iPads that have been upgraded to 17.0.3, but I need to roll them back to 16.6.1. We have apps that currently do not work on 17. I have downloaded the 16.6.1 .ipsw file, but every time I try to use it I get: OS cannot be restored on "iPad". Personalization failed. Is there any way to get an OS file that would work?
Replies: 0 · Boosts: 0 · Views: 376 · Activity: Oct ’23
iOS Xcode - ABPKPersonIDTracker not supported on this device
I am trying to use the Vision framework in iOS but am getting the error below in the logs. I could not find any resources in the Developer Forums. Any help would be appreciated!

```
ABPKPersonIDTracker not supported on this device
Failed to initialize ABPK Person ID Tracker
```

```swift
public func runHumanBodyPose3DRequest() {
    let request = VNDetectHumanBodyPose3DRequest()
    let requestHandler = VNImageRequestHandler(url: filePath!)
    do {
        try requestHandler.perform([request])
        if let returnedObservation = request.results?.first {
            self.humanObservation = returnedObservation
            print(humanObservation)
        }
    } catch let error {
        print(error.localizedDescription)
    }
}
```
Replies: 3 · Boosts: 0 · Views: 625 · Activity: Nov ’23
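This initialization failure is often reported when the request runs somewhere without the required hardware support, the simulator in particular. A small hedged guard for the code above, assuming a physical device is what's needed, keeps the call site honest:

```swift
import Vision

// Sketch: skip the 3D body-pose request where it reportedly cannot run.
// (Assumption: ABPKPersonIDTracker needs a physical device; in the simulator
// the request fails to initialize with the logs quoted above.)
func runHumanBodyPose3DRequest(on url: URL) {
    #if targetEnvironment(simulator)
    print("VNDetectHumanBodyPose3DRequest: run this on a physical device")
    #else
    let request = VNDetectHumanBodyPose3DRequest()
    let handler = VNImageRequestHandler(url: url)
    do {
        try handler.perform([request])
        if let observation = request.results?.first {
            print(observation)
        }
    } catch {
        print(error.localizedDescription)
    }
    #endif
}
```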
Build setting variant for Designed for iPad
We're building one of our iPad apps for the "Apple Vision Pro (Designed for iPad)" simulator and need certain build settings to differ for that simulator variant compared to the default iOS simulator variant.

It seems both variants use the iphonesimulator SDK, and in the Xcode build settings the "Any iOS Simulator SDK" conditional of a setting is what gets used, even when building for "Apple Vision Pro (Designed for iPad)".

We tried adding a variant condition to the "Any iOS Simulator SDK" conditional of one of the settings by editing the project file directly, using every variant name we could think of. The added variant does show in the Xcode UI, but it shows as "Any iOS Simulator SDK" as well, and when building for "Apple Vision Pro (Designed for iPad)" Xcode still uses the original conditional, as if it doesn't see that the added variant matches the build destination.

An example, where we want to exclude arm64 for iOS simulator builds, but not for "Apple Vision Pro (Designed for iPad)" builds. Original "Any iOS Simulator SDK" conditional:

```
"EXCLUDED_ARCHS[sdk=iphonesimulator*]" = "arm64";
```

Added variant:

```
"EXCLUDED_ARCHS[sdk=iphonesimulator*][variant=Apple Vision Pro]" = "";
```

We have tried variant names such as 'xr', 'xors', 'vision', 'visionos' and of course 'Apple Vision Pro' to no avail: the variant shows in the UI but the build doesn't use it.

Does anyone know of a variant name (or other property, or another way of distinguishing this destination) that Xcode will recognize and use when building for "Apple Vision Pro (Designed for iPad)"? Thanks.
Replies: 1 · Boosts: 5 · Views: 665 · Activity: Nov ’23
View frame / bounds incorrect (iOS app on visionOS)
I'm running into an issue with the frame bounds of a Metal-based iOS app on the visionOS simulator. Here's a snapshot: [screenshot] That's the result of downloading Apple's sample code and running it in the simulator (Apple Vision Pro (Designed for iPad)). Is it a bug in the simulator / iOS-to-visionOS emulation, or is that sample code doing something odd that isn't compatible with visionOS? Thanks! Eddy
Replies: 1 · Boosts: 0 · Views: 459 · Activity: Nov ’23
I am unable to run my iPad app on the Vision Pro simulator
My device has an M2 Max chip, and I am using Xcode version 15.1 Beta 3. My app runs normally in iOS and iPad simulators, but when I attempt to run it in the Vision Pro simulator, even though the compilation is successful, a dialog box appears stating, 'AppName's architectures (Intel 64-bit) include none that Apple Vision Pro can execute (arm64).' Consequently, the app is not successfully installed in the Vision Pro simulator. Additionally, my project uses CocoaPods for dependency management. I would appreciate any help, thank you!
Replies: 5 · Boosts: 1 · Views: 2.6k · Activity: Jan ’24
Will I ever receive a rejection notice for the "Apple Vision Pro developer kit"?
I submitted an application for the "Apple Vision Pro developer kit" about a month ago (~11/1/23) and have not heard anything back. If my application is rejected, will I ever receive a rejection notice, or will there just be radio silence? Part of me wonders whether my application was ever actually submitted, or whether the application form doesn't work dependably in Firefox, and whether I should submit again. If I haven't received a rejection, might that mean my application is still being considered? It would be great to know my application was received.
Replies: 2 · Boosts: 0 · Views: 576 · Activity: Dec ’23
Vision Pro Dev Kit question
Hi guys, has any individual developer received a Vision Pro dev kit, or is it just aimed at big companies?

Basically, I would like to start with one or 2 of my apps that I have already removed from the store, just to get familiar with the visionOS platform and gain knowledge and skills on a small but real project. After that I would like to use the dev kit on another project. I work on contract for a multinational communication company on a pilot project in a small country, and extending that project to visionOS might be a very interesting introduction of this new platform and could excite users of their services. However, I cannot quite reveal the details to Apple for reasons of confidentiality. After completing that contract (or during it, if I manage) I would like to start working on a great idea I have for Vision Pro (as many of you do).

Is it worth applying for the dev kit as an individual dev? I have read posts where people were rejected. Is it better to start in the simulator and just wait for the actual hardware to show up in the store? I would prefer to just get the device, rather than work with a device that I may need to return in the middle of an unfinished project.

Any info on when pre-orders might be possible? Any idea what the Mac specs are for developing for visionOS, especially for 3D scenes? I just got a MacBook Pro M3 Max with 96GB RAM, and I'm wondering whether I should have maxed out the config. Is anybody using that config with a Vision Pro dev kit? Thanks.
Replies: 0 · Boosts: 0 · Views: 951 · Activity: Dec ’23
App rejection for 2.3.7 - Performance - Accurate Metadata
My app name, "Freepaystubnow", is one word and does not indicate any price. I uploaded this app under the same name with a different account 2 months ago and it was approved, but for some reason I deleted that app. Now I'm trying to upload the same app (with a changed bundle ID) from a new account, and App Review rejects it for the reason below:

Guideline 2.3.7 - Performance - Accurate Metadata

Your app name to be displayed on the App Store includes references to the price of your app or the service it provides, which is not considered a part of these metadata items.
Replies: 1 · Boosts: 0 · Views: 646 · Activity: Dec ’23
Ads on visionOS
Hi all, I'm new to the Apple Ads guides; my apologies if this topic has been discussed or I missed it in the policy documentation, please share a link if so. I would like to know more about visionOS and the in-app rules for it. Are there any limitations on integrating ad-network code, or sponsored content unrelated to the app, into applications there? What are the options for additional monetization of AR/MR/VR apps?
Replies: 1 · Boosts: 0 · Views: 774 · Activity: Dec ’23
Convert ARKit App (UIKit) to VisionOS RealityKit app
Is there a way of integrating RealityKitContent into an app created with Xcode 12 using UIKit? The non-AR parts are working OK on visionOS; the AR parts need to be rewritten in SwiftUI. In order to do so, I need to access the RealityKit content and work with it seamlessly in Reality Composer Pro, but I'm unsure how to integrate RealityKitContent into such a pre-SwiftUI/visionOS project. I am using Xcode 15. Thank you.
Replies: 0 · Boosts: 0 · Views: 363 · Activity: Dec ’23
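A minimal sketch of one way to bridge the two worlds for the question above, assuming the project has a visionOS target with the Reality Composer Pro package (RealityKitContent) added as a dependency, and that the package contains a scene named "Scene" (a placeholder name): keep the UIKit shell and host the SwiftUI RealityView through UIHostingController.

```swift
import UIKit
import SwiftUI
import RealityKit
import RealityKitContent

// The SwiftUI side: a RealityView that loads a scene from the
// Reality Composer Pro package. "Scene" is a hypothetical scene name.
struct ImmersiveContentView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}

// The UIKit side: embed the SwiftUI view in an existing view controller.
final class ARBridgeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let host = UIHostingController(rootView: ImmersiveContentView())
        addChild(host)
        host.view.frame = view.bounds
        host.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(host.view)
        host.didMove(toParent: self)
    }
}
```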
visionOS RealityView: gesture problems with entities placed inside a panoramic sphere
In Full immersion mode, I create a sphere with radius 10 and add a CollisionComponent and an InputTargetComponent to it. I then create a 0.2 cube and add the same two components to it, plus an attachment. The code is as follows:

```swift
RealityView { content, attachments in
    let meshgenerate = MeshResource.generateSphere(radius: 10)
    let collisionShape = ShapeResource.generateSphere(radius: 10)
    let sp = ModelEntity(mesh: meshgenerate)
    sp.components.set(CollisionComponent(shapes: [collisionShape]))
    sp.components.set(InputTargetComponent())
    sp.transform.scale *= .init(-1, 1, 1)
    sp.name = "sp"
    content.add(sp)

    let ont = ModelEntity(mesh: MeshResource.generateBox(size: 0.2))
    ont.components.set(CollisionComponent(shapes: [ShapeResource.generateBox(size: .init(x: 0.2, y: 0.2, z: 0.2))]))
    ont.components.set(InputTargetComponent())
    ont.name = "ont"
    ont.position = .init(x: 0, y: 0, z: -2)
    content.add(ont)

    if let stack = attachments.entity(for: "aid") {
        stack.name = "sssssss"
        stack.setPosition(.init(x: 0, y: 1.5, z: -1), relativeTo: nil)
        // stack.generateCollisionShapes(recursive: false)
        // stack.components.set(InputTargetComponent())
        content.add(stack)
    }
} attachments: {
    let rostion = Rotation3D(angle: Angle2D(degrees: 30), axis: .x)
    Attachment(id: "aid") {
        Button {
            print("sss", "Button")
        } label: {
            Text("New Color")
                .font(.extraLargeTitle)
                .padding(40)
        }
        .background(.yellow)
    }
}
.gesture(TapGesture().targetedToAnyEntity().onEnded { value in
    print("sss", "TapGesture", value.entity.name)
    // openwind(id: "main")
})
```

Only the sphere triggers the gesture; the other model entities and the attachment never do. I know the problem is that the other entities are placed inside the sphere, which itself has an InputTargetComponent. Without removing the InputTargetComponent from the sphere, how can I make the entities and the attachment inside it trigger gestures as well?
Replies: 0 · Boosts: 0 · Views: 353 · Activity: Dec ’23
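A hedged guess at the cause of the issue above: ShapeResource.generateSphere produces a solid convex volume, so a hit-test ray that starts inside the panorama sphere is already inside the sphere's collision shape, and the sphere claims every hit before the cube or the attachment can. One possible workaround, assuming ShapeResource.generateStaticMesh(from:) is available on the deployment target and behaves as a hollow shell, is to build the sphere's collision from its actual triangle mesh:

```swift
import RealityKit

// Sketch: give the panorama sphere a hollow collision shell instead of a
// solid volume, so rays cast from inside can reach closer entities first.
// Assumption: a static-mesh shape collides only at the surface triangles.
func makePanoramaSphere() async throws -> ModelEntity {
    let mesh = MeshResource.generateSphere(radius: 10)
    let sphere = ModelEntity(mesh: mesh)
    sphere.transform.scale *= .init(-1, 1, 1)   // render the inside faces

    let shell = try await ShapeResource.generateStaticMesh(from: mesh)
    sphere.components.set(CollisionComponent(shapes: [shell], isStatic: true))
    sphere.components.set(InputTargetComponent()) // sphere itself stays tappable
    return sphere
}
```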