I'm trying to add screenshots for my new app. I did it the same way I always have, but it seems there's an issue with the website: after hitting "Add for Review," it says the images are still processing. Is anyone else having this issue? I've been trying on 3 different Apple devices since 6 PM EST.
iPad and iOS apps on visionOS
Discussion about running existing iPad and iOS apps directly on Apple Vision Pro.
In the visionOS simulator, a ContactPicker for multiple-contact selection is shown without the Done button, and I could not get a list of contacts selected. Can I assume this will behave correctly on an actual Vision Pro?
On iOS, the Done button is shown correctly, as in the following code:
import SwiftUI
import UIKit
import Combine
import ContactsUI

struct ContactPickerView: View {
    @State private var pickedNumber: String?
    @StateObject private var coordinator = Coordinator()

    var body: some View {
        VStack {
            Button("Open Contact Picker") {
                openContactPicker()
            }
            .padding()
            Text(pickedNumber ?? "")
                .padding()
        }
        .onReceive(coordinator.$pickedNumber) { phoneNumber in
            self.pickedNumber = phoneNumber
        }
        .environmentObject(coordinator)
    }

    func openContactPicker() {
        let contactPicker = CNContactPickerViewController()
        contactPicker.delegate = coordinator
        // Present from the first connected scene's root view controller.
        let scenes = UIApplication.shared.connectedScenes
        let windowScene = scenes.first as? UIWindowScene
        let window = windowScene?.windows.first
        window?.rootViewController?.present(contactPicker, animated: true, completion: nil)
    }

    class Coordinator: NSObject, ObservableObject, CNContactPickerDelegate {
        @Published var pickedNumber: String?

        func contactPicker(_ picker: CNContactPickerViewController, didSelect contacts: [CNContact]) {
            // Log each phone number of the selected contacts.
            contacts.forEach { contact in
                for number in contact.phoneNumbers {
                    let phoneNumber = number.value
                    print("number is = \(phoneNumber)")
                }
            }
        }
    }
}
I want to run my custom application on my Vision Pro.
I have paired my Vision Pro with Xcode successfully, but when I run the app, Xcode tells me to enable Developer Mode on the Vision Pro.
When I followed the instructions and went to Settings -> Privacy & Security, there was no Developer Mode option visible anywhere.
Please let me know how I can enable Developer Mode on the Vision Pro.
Thanks
I'm taking my iOS/iPadOS app and converting it so it runs on visionOS, and I'm trying to build it for both visionOS and iOS. When I try to build for an iPhone or iPad simulator, I get the following error:
Building for 'iphonesimulator', but realitytool only supports [xros, xrsimulator]
I'm thinking I might need an #if conditional-compilation block so the iOS build doesn't try to compile visionOS-only code, but for this particular error I can't find which file or code needs the conditional compilation. Does anyone know how to get rid of this error?
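One common cause, if it matches your setup: realitytool processes Reality Composer Pro packages, so this error can mean a visionOS-only package is still linked into the iOS target; that package dependency has to be removed per platform in the target's settings, while #if os(visionOS) only guards source code. A minimal sketch of the source-level guard, assuming RealityKitContent (the default Reality Composer Pro package name, which may differ in your project):

    // Sketch: keep visionOS-only RealityKit content out of the iOS build.
    #if os(visionOS)
    import RealityKitContent
    #endif

    func loadImmersiveContent() {
        #if os(visionOS)
        // visionOS-only scene loading goes here.
        #else
        // iOS/iPadOS fallback path.
        #endif
    }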
I am developing an iPhone app, but I've been targeting the AVP as well. In fact, since I got the AVP, I've mainly been building and running my app on it. This morning, I upgraded to Xcode 15.4 (15F31d). Ever since, I have not been able to see my AVP as a run destination.
It does show up in the device list, although there are no provisioning profiles on it for some reason, but I can't target it for building. I've tried unpairing and turning Developer Mode off and on.
Has anyone else seen this problem after upgrading Xcode? Any help is appreciated.
I have a Unity scene which I created for Vision Pro, and I have also created a biometric-authentication application for visionOS using Xcode and Swift. What I want to do is call the Unity scene from Xcode after the authentication has taken place. I have seen a Medium post, but it only shows how to do that for iOS apps; I am not able to do it for Vision Pro.
I have followed this post: https://medium.com/mop-developers/launch-a-unity-game-from-a-swiftui-ios-app-11a5652ce476
I am doing all this because, as far as I know, Apple Vision Pro does not currently support Optic ID authentication with Unity's PolySpatial plugin.
Any help on this will be appreciated.
Thank you in advance.
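For reference, the approach in the linked post boils down to Unity-as-a-Library. A minimal sketch of that iOS embedding flow, assuming UnityFramework.framework is embedded in the app; whether PolySpatial builds expose the same entry points on visionOS is exactly the open question here:

    // Sketch of the Unity-as-a-Library launch used on iOS.
    // These calls come from Unity's UnityFramework embedding API.
    import UIKit

    func launchUnitySceneAfterAuth() {
        let path = Bundle.main.bundlePath + "/Frameworks/UnityFramework.framework"
        guard let bundle = Bundle(path: path) else { return }
        if !bundle.isLoaded { bundle.load() }
        guard let ufwClass = bundle.principalClass as? UnityFramework.Type,
              let unity = ufwClass.getInstance() else { return }
        unity.setDataBundleId("com.unity3d.framework")
        // Boot the embedded player and bring the Unity window forward.
        unity.runEmbedded(withArgc: CommandLine.argc,
                          argv: CommandLine.unsafeArgv,
                          appLaunchOpts: nil)
        unity.showUnityWindow()
    }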
Today I tried to add a second archive action for visionOS. I added a visionOS destination to my app target a while back, and I can build and archive my app for visionOS locally in Xcode 15.3 and also run it on the device.
Xcode Cloud is giving me the following errors in the Archive - visionOS action (Archive - iOS works):
Invalid Info.plist value. The value for the key 'DTPlatformName' in bundle MyApp.app is invalid.
Invalid sdk value. The value provided for the sdk portion of LC_BUILD_VERSION in MyApp.app/MyApp is 17.4 which is greater than the maximum allowed value of 1.2.
This bundle is invalid. The value provided for the key MinimumOSVersion '17.0' is not acceptable.
Type Mismatch. The value for the Info.plist key CFBundleIcons.CFBundlePrimaryIcon is not of the required type for that key. See the Information Property List Key Reference at https://developer.apple.com/library/ios/documentation/general/Reference/InfoPlistKeyReference/Introduction/Introduction.html#//apple_ref/doc/uid/TP40009248-SW1
All four errors are annotated with "Prepare Build for App Store Connect", and I get them for both the "TestFlight (Internal Testing Only)" and "TestFlight and App Store" deployment-preparation options.
I have tried removing the visionOS destination and adding it back, but this does not change the project at all.
Any ideas what I am missing?
First, everything used to be OK: I had connected my Apple Vision Pro to Xcode over the wireless network and built my app that way for the past several weeks. But since yesterday, I cannot connect the Apple Vision Pro to Xcode anymore; the device is not listed in the Devices and Simulators window. I have tried:
Updating my Xcode to 15.3
Rebooting my Mac and Apple Vision Pro
Resetting the Apple Vision Pro, including erasing all data
Other Macs on the same network also do not list any Vision Pro device.
I'm sure the Vision Pro and the Mac are on the same network, and it worked before. I go to Settings > General > Remote Devices and open Xcode's Devices and Simulators window, but still can't see any Apple Vision Pro device.
I am attempting to add visionOS support to my existing iOS app, which uses SwiftUI and CocoaPods. However, after adding visionOS as a supported platform and attempting to run the app, I encounter two errors:
"'jot/jot.h' file not found" at "/Users/xxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxxxx/xxxxxxxxx-Bridging-Header.h:17:9".
"Failed to emit precompiled header" at "/Users/xxxxxx/Library/Developer/Xcode/DerivedData/xxxxxxxxx-bnhvaxypgfhmvqgklzjdnxxbrdhu/Build/Intermediates.noindex/PrecompiledHeaders/xxxxxxxxx-Bridging-Header-swift_6TTOG1OAZB5F-clang_21TRHDW14EDOZ.pch" for bridging header "/Users/xxxxxxxx/Desktop/IOS_DEVELOPMENT/iOS/xxxxxxx/xxxxxxxx-Bridging-Header.h".
I'm seeking assistance with resolving these errors. Below is my Podfile configuration:
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '15.0'

target 'xxxxxxxxxx' do
  use_frameworks!

  pod 'RealmSwift'
  pod 'JGProgressHUD'
  pod 'BadgeLabel'
  pod 'jot'
  pod 'MaterialComponents/Chips'
  pod 'GoogleMaps'
  pod 'Firebase/Crashlytics'
  pod 'Firebase/Analytics' # Firebase pod for Google Analytics
  # Add pods for any other desired Firebase products
  # https://firebase.google.com/docs/ios/setup#available-pods
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '15.0'
    end
  end
end
Any assistance in resolving these errors would be greatly appreciated.
Is there a method for finding APIs that are compatible with both iOS and visionOS (e.g., hoverStyle)?
I'm encountering difficulties developing natively for visionOS, although I can successfully build for 'Apple Vision (Designed for iPad)'. Are there any methods for discovering APIs that support both platforms? I'm looking to enhance my application and would appreciate any guidance on where to find such APIs.
Additionally, I'm interested in changing the background to a glass style. However, it seems that this feature may not be supported outside the APIs designed for visionOS. Any suggestions or insights would be greatly appreciated.
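One approach: each API's documentation page lists its platform availability, and in code you can gate visionOS-only modifiers behind a platform check. A small sketch, assuming a SwiftUI view: hoverEffect is available on both iOS and visionOS, while glassBackgroundEffect() is visionOS-only, so it gets wrapped:

    import SwiftUI

    struct CrossPlatformCard: View {
        var body: some View {
            Text("Hello")
                .padding()
                // hoverEffect exists on both platforms (pointer on
                // iPadOS, gaze highlight on visionOS).
                .hoverEffect(.highlight)
                .modifier(GlassIfAvailable())
        }
    }

    // glassBackgroundEffect() is visionOS-only, so guard it with
    // conditional compilation and fall back to a material on iOS.
    struct GlassIfAvailable: ViewModifier {
        func body(content: Content) -> some View {
            #if os(visionOS)
            content.glassBackgroundEffect()
            #else
            content.background(.thinMaterial)
            #endif
        }
    }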
With iOS 17.4.1, we notice that the restriction 'Disallow AirPrint' is not working. It works on 17.3.1, and the restriction gets pushed down to both devices, but it does not take effect on iOS 17.4.1.
Looking for an iOS Vision Developer for Private Project
We are seeking a talented iOS Vision Developer to join our team for an exciting private project. The project involves the development of an interactive space modeling application.
Requirements:
Proficiency in iOS development with a strong focus on Vision framework.
Experience in developing interactive applications.
Ability to collaborate effectively within a team environment.
Strong problem-solving skills and attention to detail.
Responsibilities:
Designing and implementing features using the Vision framework.
Collaborating with the team to ensure the smooth integration of features.
Testing and debugging applications to ensure optimal performance.
If you are passionate about iOS development and have experience with the Vision framework, we would love to hear from you.
I want to get only spatial videos when opening the photo library in my app. How can I achieve this?
One more thing: if I select a video using the photo library, how can I identify whether the selected video is a spatial video or not?
self.presentPicker(filter: .videos)

/// - Tag: PresentPicker
private func presentPicker(filter: PHPickerFilter?) {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    // Set the filter type according to the user's selection.
    configuration.filter = filter
    // Set the mode to avoid transcoding, if possible, if your app supports arbitrary image/video encodings.
    configuration.preferredAssetRepresentationMode = .current
    // Set the selection behavior to respect the user's selection order.
    configuration.selection = .ordered
    // Set the selection limit to enable multiselection.
    configuration.selectionLimit = 1
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true)
}

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true) {
        // do something on dismiss
    }
    guard let provider = results.first?.itemProvider else { return }
    provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
        if let error = error {
            print(error)
            return
        }
        // Receiving the video's local URL / file path.
        guard let url = url else { return }
        // Create a new filename.
        let fileName = "\(Int(Date().timeIntervalSince1970)).\(url.pathExtension)"
        // Create a new URL in the temporary directory.
        let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
        print(newUrl)
        // Copy the item into app storage before the provider's URL becomes invalid.
        //try? FileManager.default.copyItem(at: url, to: newUrl)
        // self.parent.videoURL = newUrl.absoluteString
    }
}
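On the second question: I'm not aware of a PHPickerFilter case for spatial video, but once you have the file URL you can check whether the movie contains stereo multiview video, which is the media characteristic spatial videos carry. A minimal sketch using AVFoundation (the characteristics check is my assumption about how spatial video is tagged):

    import AVFoundation

    // Sketch: returns true if the movie at `url` has a video track
    // tagged with the stereo-multiview characteristic (iOS 17+).
    func isSpatialVideo(at url: URL) async throws -> Bool {
        let asset = AVURLAsset(url: url)
        let videoTracks = try await asset.loadTracks(withMediaType: .video)
        for track in videoTracks {
            let characteristics = try await track.load(.mediaCharacteristics)
            if characteristics.contains(.containsStereoMultiviewVideo) {
                return true
            }
        }
        return false
    }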
Can App A know whether the iPhone user is looking at a photo of one of their family members, and display some information related to that person?
Hi, I am a musician, and I tried the Vision Pro a few days ago. I have ideas for a related app to launch on Vision Pro. I would like to find a developer who can help me build the app :) Thank you all!
Hi. I am trying to implement a drag-and-drop system using RealityView, with a model that pins to a plane in the scene. For that, I want to raycast from the center of the object I am currently dragging to a plane in the scene. My idea was to convert the object's position to screen coordinates first and then raycast into the scene from the screen to determine whether the object I am dragging is above a certain plane.
The issue I am having is that I can't seem to find a way to convert the position of an object in the scene to screen space. Is there an API for that?
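A thought, in case it helps: RealityView on visionOS has no single screen to convert into, but you may not need screen space at all, since RealityKit can raycast directly in scene space from the dragged entity. A minimal sketch, assuming the plane entity has a CollisionComponent:

    import RealityKit

    // Sketch: cast a world-space ray straight down from the dragged
    // entity and check whether it hits the target plane.
    func isAbovePlane(draggedEntity: Entity,
                      planeEntity: Entity,
                      scene: RealityKit.Scene) -> Bool {
        let origin = draggedEntity.position(relativeTo: nil) // world position
        let hits = scene.raycast(origin: origin,
                                 direction: [0, -1, 0],
                                 length: 10,
                                 query: .nearest,
                                 mask: .all,
                                 relativeTo: nil)
        return hits.contains { $0.entity == planeEntity }
    }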
I made a model with an AssetBundle, built against the iOS platform. On Vision Pro, I wanted to load this model, but it failed to load. I need some help.
Hi!
Currently I'm showing the Apple Vision Pro to my clients. Sharing the screen is challenging with Guest Mode on the AVP: you can share the screen, but you need to tether the AVP and a MacBook Pro to an iPhone, and then you can AirPlay. A lot of big corporate Wi-Fi networks may not allow AirPlay, so tethering is the better option.
Does anyone know if Apple is making their AVP demo app available to developers?
This would be super helpful for showing off the AVP's capabilities.
Thanks!
JB
WindowGroups have ids, like this: WindowGroup(id: "window_id"), but DocumentGroups do not. So opening like this does not work: openWindow(id: "window_id").
The problem is that if the user dismisses the DocumentGroup window, there is no way to reopen it.
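A possible workaround, if my availability notes are right (macOS 13 / iOS 17 / visionOS 1.0): the openDocument environment action reopens a DocumentGroup scene by file URL rather than by window id. A sketch, where documentURL stands in for whatever file was last open:

    import SwiftUI

    struct ReopenDocumentButton: View {
        @Environment(\.openDocument) private var openDocument
        let documentURL: URL // the file the dismissed DocumentGroup was showing

        var body: some View {
            Button("Reopen Document") {
                Task {
                    // Hands the URL back to the DocumentGroup scene.
                    try? await openDocument(at: documentURL)
                }
            }
        }
    }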