Search results for "iPhone 16 pro"

78,870 results found

Post · Replies · Boosts · Views · Activity

ManipulationComponent creates parent/child crash
Hello,

If you add a ManipulationComponent to a RealityKit entity and then continue to add instructions, sooner or later you will encounter a crash with the following error message:

Attempting to move entity “%s” (%p) under “%s” (%p), but the new parent entity is currently being removed. Changing the parent/child entities of an entity in an event handler while that entity is already being reassigned is not supported.

CoreSimulator 1048 – Device: Apple Vision Pro 4K (B87DD32A-E862-4791-8B71-92E50CE6EC06) – Runtime: visionOS 26.0 (23M336) – Device Type: Apple Vision Pro

The problem occurs precisely with this code:

ManipulationComponent.configureEntity(object)

I adapted Apple's ObjectPlacementExample and made the changes available via GitHub. The desired behavior is that I can add entities to ManipulationComponent and RealityKit then runs stably and does not crash randomly.

GitHub Repo

Thanks
Andre
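A minimal sketch of one possible workaround, given that the error message points at parent/child changes made inside an event handler. The deferral via `Task` is an assumption on my part, not a confirmed fix; `object` and `parent` are hypothetical entities in a running visionOS 26 scene:

```swift
import RealityKit

// Configure manipulation first, then move the reparenting out of the
// current event-handling pass so it doesn't race the entity reassignment
// that the crash message complains about.
@MainActor
func configureAndPlace(_ object: Entity, under parent: Entity) {
    ManipulationComponent.configureEntity(object)

    Task { @MainActor in
        // Runs on a later main-actor turn, outside the event handler.
        if object.parent !== parent {
            parent.addChild(object)
        }
    }
}
```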
3
0
409
1w
Reply to Request File Access from Unity for Apple Vision Pro
Hi, I am trying to load files from the Apple Vision Pro's storage into a Unity App (using Apple visionOS XR Plugin and not PolySpatial package).

So my immediate question here is: what is your larger goal? visionOS generally uses the same file access model as iOS, which means apps get access to files through one of two broad mechanisms:

1. The files are added to one of the app's container directories. There are many different APIs that use this broad flow, but the simplest case is having your app appear in Files.app so that the user can directly add files. Basic access can be enabled by setting UIFileSharingEnabled and (possibly) LSSupportsOpeningDocumentsInPlace.

2. The app uses an API like UIDocumentPickerViewController to allow the user to give their app access to specific files or directories.

Finally, apps that are built around documents generally use the approach described in Building a document browser-based app, which actually provides a unified interface for both of the two approaches above.
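A minimal sketch of the second mechanism, the document picker. `FileAccessViewController` is a hypothetical name; the delegate callback receives security-scoped URLs the user granted access to:

```swift
import UIKit
import UniformTypeIdentifiers

final class FileAccessViewController: UIViewController, UIDocumentPickerDelegate {
    // Present the system picker so the user can grant access to specific files.
    func pickFiles() {
        let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.data],
                                                    asCopy: false)
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        for url in urls {
            // Access to a user-granted URL is security-scoped and must be
            // balanced with stopAccessingSecurityScopedResource().
            guard url.startAccessingSecurityScopedResource() else { continue }
            defer { url.stopAccessingSecurityScopedResource() }
            // Read the file contents here.
        }
    }
}
```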
Topic: Spatial Computing SubTopic: General
1w
Reply to app to restrict student phone use in schools
My understanding, AFAIK: if the iPhone belongs to the student, that should not be possible (it would mean any app could take control of your iPhone, with all the privacy and security issues that would raise). But if the iPhone is supervised, it should be possible, though that is likely too heavy a constraint on users (and the school). Get a first look here: https://support.apple.com/en-us/102291#:~:text=If%20your%20iPhone%20or%20iPad%20is%20supervised%2C%20the%20organization%20that,need%20to%20check%20your%20settings.
1w
app to restrict student phone use in schools
I work at a school in NYC and have a software idea that could better support the new NYC phone ban law than current market options (i.e. Yondr pouches). Right now at my school, students and staff scan a QR code upon entering the building to indicate that they are in the building. They scan again on the way out to indicate they've left the building. This is super helpful for attendance, particularly in emergency situations (fire drills, etc). Imagine if, when students scanned their QR code, it also activated an app similar to Opal or ScreenZen, but with an admin-preset whitelist of apps. The idea is that this app would deny access by default to all apps on students' phones except the admin-preset whitelisted ones such as Phone, Calculator, etc. Depending on the age/needs of the student, other apps like Spotify or medical apps could also be whitelisted. My question is: is this idea possible to create? We would need admin preset controls to create the preset whitelist. We can't have stud
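The closest public mechanism to this idea is Apple's Screen Time API (FamilyControls plus ManagedSettings). The sketch below is an assumption about how it could map onto the idea, not a confirmed design; school-wide enforcement would additionally require supervised devices under MDM, and the allowed-app tokens would have to come from an admin-curated FamilyActivitySelection:

```swift
import FamilyControls
import ManagedSettings

@MainActor
final class SchoolShield {
    private let store = ManagedSettingsStore()

    // The user (or, on a supervised device, the organization) must grant
    // Family Controls authorization before any shielding takes effect.
    func requestAuthorization() async throws {
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
    }

    // Shield every app category except an admin-chosen set of app tokens.
    func lockDown(except allowedApps: Set<ApplicationToken>) {
        store.shield.applicationCategories = .all(except: allowedApps)
    }

    // Called when the student scans out of the building.
    func unlock() {
        store.shield.applicationCategories = nil
    }
}
```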
4
0
529
1w
Reply to How can I locate a UVC camera for PTZ control by AVCaptureDevice.unique_id
> We parse the locationID out of the AVCaptureDevice.uniqueID and then find the IORegistry node with that locationID. Is there a document that declares how AVFoundation generates the unique_id for a USB camera, so I can assume this conversion will always work?

You can't.

> Or is there a way to send a PTZ control request to an AVCaptureDevice?

Not that I know of. As far as I know, the only way is what you're doing.

> It looks like the unique_id provided is (locationID<<32 | VendorID<<16 | ProductID) as a hex string, but I'm not sure if I can always assume this behavior won't change.

Correct - it has changed in the past, and it might change at any time in the future. Not all AVCaptureDevices are UVC, but they all have uniqueIDs. If you would like an API to clearly identify an AVCaptureDevice in the IORegistry, please file a bug. I already did (in 2019, FB6146541).
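A sketch of the parsing being discussed. The (locationID<<32 | VendorID<<16 | ProductID) layout is an observed implementation detail, not documented API, so this can break in any OS release; the "0x" prefix handling is an assumption about the string format:

```swift
import Foundation

// Extract the IOKit locationID from the top 32 bits of the observed
// uniqueID layout. Returns nil if the string isn't parseable as hex.
func locationID(fromUniqueID uniqueID: String) -> UInt32? {
    let hex = uniqueID.hasPrefix("0x") ? String(uniqueID.dropFirst(2)) : uniqueID
    guard let value = UInt64(hex, radix: 16) else { return nil }
    return UInt32(truncatingIfNeeded: value >> 32)
}
```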
1w
Simulator with iOS 26 ignores Mac keyboard language
When I start a Simulator (iPhone 13 mini) with iOS 26 and activate Use the Same Keyboard Language as macOS, it still sets the keyboard to US (my Mac keyboard is German). This makes the Mac keyboard unusable. It looks like a bug, because it clearly ignores the setting. When I type “@”, I get “¬”. Restarting the Simulator did nothing; changing the setting back and forth didn't help either. BTW: why does every single update of Xcode come with a bug nowadays? I always have to spend half a day after an update fixing a problem I didn't have before. Highly frustrating.
1
0
39
1w
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community,

I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection using Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current Implementation

- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Implementing custom sensitivity curves and smoothing algorithms
- Applying baseline correction and coordinate mapping
- Using quadratic regression for calibration point mapping

Issues I'm Facing

1. Calibration Mismatch
- Red dot position doesn't align with where I'm actually looking
- Significant offset between intended gaze point and actual cursor position
- Calibration seems to drift or become inaccurate over time

2. Extreme Eye Movement Requirements
- Need to make exaggerated eye movements to reach screen edges/corners
- Natural eye movements don't translate to proportional cursor movement
- Difficulty reaching certain
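For context, a minimal sketch of reading the gaze-related blend shapes listed above (left eye only; a real implementation would average both eyes and feed the result through the calibration and smoothing stages the post describes):

```swift
import ARKit

// Combine opposing blend-shape coefficients into rough x/y gaze offsets.
func gazeOffsets(from anchor: ARFaceAnchor) -> (x: Float, y: Float) {
    let shapes = anchor.blendShapes
    let up = shapes[.eyeLookUpLeft]?.floatValue ?? 0
    let down = shapes[.eyeLookDownLeft]?.floatValue ?? 0
    let inward = shapes[.eyeLookInLeft]?.floatValue ?? 0
    let outward = shapes[.eyeLookOutLeft]?.floatValue ?? 0
    // These are 0...1 activation coefficients, not angles, which is one
    // reason natural eye movements don't map proportionally to screen
    // coordinates without a calibration layer on top.
    return (x: outward - inward, y: up - down)
}
```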
0
0
632
1w
Zoom navigation causes the source view to disappear
After returning from the child view to the parent, the latter will simply disappear. This is the full view. See itemsContent where I perform the navigation. The last tapped rectangle in this example will simply disappear.

```swift
struct DashboardView: View {
    @State var viewModel: DashboardViewModel
    @Namespace private var namespace

    var body: some View {
        ScrollView(.vertical) {
            LazyVStack(spacing: 24) {
                ForEach(viewModel.sections) { section in
                    VStack(spacing: 16) {
                        Text(section.title)
                        itemsContent(for: section)
                    }
                }
            }
        }
    }

    func itemsContent(for section: DashboardSection) -> some View {
        ForEach(section.items) { item in
            NavigationLink {
                PatternLearningRouter().rootView
                    .id(item.id)
                    .navigationTransition(.zoom(sourceID: item.id, in: namespace))
            } label: {
                Rectangle()
                    .fill(Color.yellow)
                    .frame(width: 80, height: 80)
                    .overlay {
                        Text(item.title)
                    }
                    .matchedTransitionSource(id: item.id, in: namespace)
            }
        }
    }
}
```

Xcode 26.0.1 (17A400), iPhone 16 Pro, iOS 26.0.1

Note: Only reproduced when I swipe ba
0
0
49
1w
Different results depending on compiler options
The following C code shows different results when built in debug or release mode with Xcode 26.0.1.

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

static void print_fval(int16_t *src) {
    float fval = (float)((((int32_t)*src) << 16) - (1 << 31));
    printf("fval: %f\n", fval);
}

int main(int argc, const char * argv[]) {
    int16_t i16[1] = { 0 };
    print_fval(i16);
    return EXIT_SUCCESS;
}
```

In debug mode the output is: fval: -2147483648.000000. In release mode the output is: fval: 2147483648.000000. The 1 << 31 might be questionable, since shifting a 1 into the sign bit of a signed 32-bit int is undefined behavior in C, but I would expect the same outcome in debug and release mode. The difference comes from compiling with or without optimizations, compiler option -O0 vs any other optimization level. Would this be considered a compiler/optimizer bug? Older versions of Xcode (15.4 and earlier) do not have this problem.
0
0
51
1w
Request File Access from Unity for Apple Vision Pro
Hi, I am trying to load files from the Apple Vision Pro's storage into a Unity app (using the Apple visionOS XR Plugin and not the PolySpatial package). So far, I've tried using UnitySimpleFileBrowser and UnityStandaloneFileBrowser (both aren't made for the Vision Pro and don't work there), and then implemented my own naive file browser that at least allows me to view directories (those I can see from the App Sandbox). This is of course very limited: gray folders can't be accessed, and the only 3 available ones don't contain anything where a user would put files through the Files app. I know that an app can request access to these Files & Folders: So my question is: Is there a way to request this access for a Unity-built app at the moment? If yes, what do I need to do? I've looked into the generated Xcode project's Capabilities, but did not find anything related to file access. Any help is appreciated!
5
0
300
1w
How to get an anchored action sheet without the popover arrow on iOS 26?
I see in iPhone built-in apps that action sheets are presented as popovers without arrows over their originating views. Here is an example in the Messages and Shortcuts apps. In the WWDC 2025 session Build a UIKit app with the new design, the speaker explains that all you have to do is configure the popover like we do for iPad. Here is the relevant transcript:

14:33 ActionSheets on iPad are anchored to their source views. Starting in iOS 26, they behave the same on iPhone, appearing directly over the originating view.

14:46 On the alertController, make sure to set the sourceItem or the sourceView on popoverPresentationController, regardless of which device it’s displayed on. Assigning the source view automatically applies the new transitions to action sheets as well! Action sheets presented inline don’t have a cancel button because the cancel action is implicit by tapping anywhere else. If you don’t specify a source, the action sheet will be centered, and you will have a cancel button.

iOS 26 p
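A sketch of what the transcript describes, assuming iOS 26 and a `button` that triggers the sheet (names are illustrative):

```swift
import UIKit

func presentInlineActionSheet(from button: UIButton, in viewController: UIViewController) {
    let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)
    alert.addAction(UIAlertAction(title: "Delete", style: .destructive))
    // Anchoring to a source item makes the sheet appear directly over the
    // originating view with no cancel button; omitting the source centers
    // the sheet and adds one.
    alert.popoverPresentationController?.sourceItem = button
    viewController.present(alert, animated: true)
}
```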
Topic: UI Frameworks SubTopic: UIKit
2
0
180
1w
Xcode 26 doesn’t recognize ChatGPT Plus subscription and triggers free daily limit
Hi everyone,

Since updating to Xcode 26, I’ve been running into an issue with the new ChatGPT integration inside Xcode. Even though I have an active ChatGPT Plus subscription (GPT-4 or GPT-5) and I’m signed in to my OpenAI account, Xcode doesn’t seem to recognize my paid plan. After reaching the free usage limit, I get the following message:

“Over daily limit. ChatGPT in Xcode will be unavailable for up to 24 hours. For higher limits, sign in with a paid ChatGPT account or upgrade to ChatGPT Plus.”

The issue is that I am already logged in with my ChatGPT Plus account, and everything works fine on the ChatGPT web app, but inside Xcode it keeps treating me as a free user. Here’s what I’ve tried so far:

• Signing out and back into ChatGPT from within Xcode
• Restarting Xcode and macOS
• Verifying that I’m logged in to ChatGPT in Safari
• Updating both macOS and Xcode to the latest versions

None of these steps has resolved the issue. It seems like Xcode isn’t syncing properly with OpenAI authentication, and th
2
0
157
1w
iOS 26.1 isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera method keeps returning true when the camera is unavailable
Prerequisite: after the MDM app issues the command, the camera on the phone is no longer visible (unusable). After upgrading to iOS 26.1, the isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera method keeps returning true when the camera is unavailable. On iOS 26.0.1 the method behaves normally, returning false when the camera is unavailable and true when it is available.
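A sketch reproducing the check in question (shown here in Swift), plus an AVFoundation cross-check; the cross-check is an assumption on my part, not a confirmed workaround for the iOS 26.1 behavior:

```swift
import UIKit
import AVFoundation

// True only if both the picker API and device discovery agree the camera
// is usable; under the MDM restriction the second check may disagree.
func cameraAppearsUsable() -> Bool {
    let pickerSaysAvailable = UIImagePickerController.isSourceTypeAvailable(.camera)
    let hasCaptureDevice = AVCaptureDevice.default(for: .video) != nil
    return pickerSaysAvailable && hasCaptureDevice
}
```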
6
0
322
1w