If I trigger the Apple rating modal (the App Store review prompt) in an immersive space, it appears on the ground at (0, 0, 0). I need it to appear in front of the user, the way the push notification permission prompt and other permission requests do.
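For context, here is a minimal sketch of how the prompt is typically triggered from SwiftUI via StoreKit's requestReview environment value; the view name is hypothetical, and the placement of the resulting system UI is decided by the system, not by this call:

import SwiftUI
import StoreKit

struct RatingTriggerView: View {
    // System-provided action that asks StoreKit to show the review prompt.
    @Environment(\.requestReview) private var requestReview

    var body: some View {
        Button("Rate this app") {
            // The system decides where (and whether) the prompt appears;
            // in an immersive space it is not anchored to this view.
            requestReview()
        }
    }
}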
Search results for "iPhone 16 pro"
78,742 results found
Hi, thanks for mentioning the .exp file approach. It works well with the scenario I described above (a dynamic framework depending on static libs). However, when I tried it with a slightly different project configuration, I ran into linker errors. I replaced the static libs with dynamic libs, so now I have a dynamic framework with 3 target dependencies, each being a dynamic library. The project builds and runs fine on the iOS simulator. But, as in our other discussion here, when I test on a real iPhone it builds fine and then fails at runtime with a dynamic linker error in __abort_with_payload, and Xcode shows the following assembly at the crash site:

Thread 1: signal SIGABRT
dyld`:
    0x1aa0a15f8 <+0>:  mov    x16, #0x209
    0x1aa0a15fc <+4>:  svc    #0x80
->  0x1aa0a1600 <+8>:  b.lo   0x1aa0a1620   ; <+40>
    0x1aa0a1604 <+12>: pacibsp
    0x1aa0a1608 <+16>: stp    x29, x30, [sp, #-0x10]!
    0x1aa0a160c <+20>: mov    x29, sp
    0x1aa0a1610 <+24>: bl     0x
Topic:
Developer Tools & Services
SubTopic:
General
Tags:
Hi, we are implementing ID&V, and there is a requirement regarding the flow for Apple Pay. To clarify the case, here are the steps to reproduce:

1. Add a card to the iPhone Wallet app (yellow-path verification required). Do not complete the ID&V process.
2. Add a card to the Watch via the Wallet section of the iPhone Watch app (yellow-path verification required). As before, do not complete the ID&V.
3. Complete the ID&V process using the Issuer app, from either the iPhone or the Watch.
4. The Issuer app receives the application:openURL:options: callback on its AppDelegate. In the options dictionary, the UIApplicationOpenURLOptionsSourceApplicationKey is not populated (it is nil).

At this moment, for the card we are adding there are now two tokens, both to be verified via the ID&V process: one on the iPhone and one on the Apple Watch associated with the same iPhone. The url received at step 4 contains the serial number w
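A minimal sketch of the callback in question, showing where the source-application key is read; the logging is illustrative only:

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ app: UIApplication,
                     open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        // Swift equivalent of UIApplicationOpenURLOptionsSourceApplicationKey.
        // In the scenario above this comes back nil.
        let source = options[.sourceApplication] as? String
        print("openURL from source application: \(source ?? "nil")")
        return true
    }
}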
When our Bluetooth device is scanned and a connection is initiated through the app on the iPhone 17, the air log shows that the iPhone sends an LL_LENGTH_REQ to execute the Data Length Update Procedure. However, our peripheral does not support the Bluetooth LE Data Length Extension, so it responds with an LL_UNKNOWN_RSP PDU with the UnknownType field set to LL_LENGTH_REQ. After receiving the LL_UNKNOWN_RSP, the iPhone 17 does not proceed with the subsequent Bluetooth LE service discovery process. The connection is maintained until the peripheral actively disconnects. Once the peripheral disconnects and continues broadcasting Bluetooth signals, the iPhone 17 repeatedly tries to connect to the peripheral and executes the aforementioned process, even if the app has been terminated. According to the Bluetooth 4.2 core specification ([Vol. 6] Part B, Section 5.1.9), which can be found here: https://www.bluetooth.com/specifications/specs/core-specification-amended-4-2/, the iPhone
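For reference, a minimal CoreBluetooth sketch of the app-side connection flow described above; the class and delegate wiring are illustrative, and the LL_LENGTH_REQ exchange happens below this API at the link layer:

import CoreBluetooth

final class ConnectionManager: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // A real app would filter by service UUID instead of scanning broadly.
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        self.peripheral = peripheral   // keep a strong reference
        central.stopScan()
        // The Data Length Update Procedure (LL_LENGTH_REQ) is initiated by the
        // OS after this call; the app never observes it.
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        // Per the report, on iPhone 17 this discovery never proceeds after the
        // peripheral answers the LL_LENGTH_REQ with LL_UNKNOWN_RSP.
        peripheral.discoverServices(nil)
    }
}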
Hi @377632523@qq.com, I'm unable to reproduce the issue, as reported, in my own Xcode project. Because the provided example above will not build successfully due to missing types, I replaced your implementation with the following:

import SwiftUI

struct PhotosMainView: View {
    @State private var searchText: String = ""

    var body: some View {
        TabView {
            Tab("Library", systemImage: "photo.on.rectangle") {
                NavigationStack {
                    scrollView(colors: [.red, .orange, .yellow])
                        .navigationTitle("Library")
                }
            }
            Tab("Albums", systemImage: "square.grid.2x2") {
                NavigationStack {
                    scrollView(colors: [.yellow, .green, .teal])
                        .navigationTitle("Albums")
                }
            }
            Tab("Search", systemImage: "magnifyingglass", role: .search) {
                NavigationStack {
                    scrollView(colors: [.blue, .purple, .pink])
                        .navigationTitle("Search")
                        .searchable(text: $searchText)
                }
            }
        }
        .tabBarMinimizeBehavior(.onScrollUp)
        .tabViewStyle(.sidebarAdaptable)
        .tabViewBottomAccessory {
            TimelineAccessoryView()
        }
    }

    // Trivial helper view to visualize scrolling.
    @ViewBuilder
    func scrollView(colors: [Color]
Topic:
UI Frameworks
SubTopic:
SwiftUI
Tags:
I am constantly running out of storage on my iPhone 16 Pro. I keep having to move my photos and videos to my laptop and delete them from my phone, and I constantly need to offload apps and manually clear caches in some apps to free up storage. I finally got sick of repeating this cycle every two weeks, so I looked into it more closely. I'm finding that iOS consumes 32 GB, and another "system reserve" category consumes an additional 23 GB. That means system-reserved files are consuming half of the storage on this phone, effectively making it a 64 GB model. I understand the system needs some capacity for itself and that iOS is getting larger, but nearly 50% of the phone's capacity is insane. Looking closer at the categories, I'm seeing that iOS has also taken it upon itself to permanently provision 10% of the storage capacity as reserve update space. Already another instance of "why am I having to lose so much of my functional capacit
If you can ask for a user action, there may be some possibilities: https://support.apple.com/en-gb/guide/iphone/iph3ff83f3b1/ios However, I cannot tell whether allow lists (whitelists) are possible, or only block lists.
Topic:
Business & Education
SubTopic:
Device Management
Tags:
Hello, if you add a ManipulationComponent to a RealityKit entity and then continue configuring further entities, sooner or later you will encounter a crash with the following error message:

Attempting to move entity "%s" (%p) under "%s" (%p), but the new parent entity is currently being removed. Changing the parent/child entities of an entity in an event handler while that entity is already being reassigned is not supported.

CoreSimulator 1048 – Device: Apple Vision Pro 4K (B87DD32A-E862-4791-8B71-92E50CE6EC06) – Runtime: visionOS 26.0 (23M336) – Device Type: Apple Vision Pro

The problem occurs precisely with this call: ManipulationComponent.configureEntity(object). I adapted Apple's ObjectPlacementExample and made the changes available via GitHub. The desired behavior is that I can add entities to ManipulationComponent and RealityKit then runs stably, without random crashes. GitHub Repo. Thanks, Andre
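A minimal sketch of the call site in question, assuming a visionOS 26 RealityView; the placeholder entity is illustrative (the original post loads objects from Apple's ObjectPlacementExample instead):

import SwiftUI
import RealityKit

struct PlacementView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical placeholder entity standing in for a placed object.
            let object = ModelEntity(mesh: .generateBox(size: 0.1))
            content.add(object)

            // The reported crash is triggered by repeated calls like this
            // while entities are being re-parented.
            ManipulationComponent.configureEntity(object)
        }
    }
}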
Hi, "I am trying to load files from the Apple Vision Pro's storage into a Unity App (using Apple visionOS XR Plugin and not PolySpatial package)." My immediate question here is: what is your larger goal, actually? visionOS generally uses the same file-access model as iOS, which means apps get access to files through one of two broad mechanisms:

1. The files are added to one of the app's container directories. Many different APIs use this broad flow, but the simplest case is having your app appear in the Files app so that the user can directly add files. Basic access can be enabled by setting UIFileSharingEnabled and (possibly) LSSupportsOpeningDocumentsInPlace.

2. The app uses an API like UIDocumentPickerViewController to allow the user to give the app access to specific files or directories.

Finally, apps that are built around documents generally use the approach described in Building a document browser-based app, which actually provides a unified interface for both of the two approaches a
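A minimal sketch of the second mechanism, assuming a UIKit view controller; the content types and delegate handling are illustrative:

import UIKit
import UniformTypeIdentifiers

class FilePickerViewController: UIViewController, UIDocumentPickerDelegate {
    func presentPicker() {
        // Let the user grant access to specific files (here: any data file).
        let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.data])
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        for url in urls {
            // Security-scoped access is required for files outside the app container.
            guard url.startAccessingSecurityScopedResource() else { continue }
            defer { url.stopAccessingSecurityScopedResource() }
            // Read or copy the file here.
            print("Picked: \(url.lastPathComponent)")
        }
    }
}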
Topic:
Spatial Computing
SubTopic:
General
Tags:
My understanding, AFAIK: if the iPhone belongs to the student, that should not be possible (it would mean any app could take control of your iPhone, with all the privacy and security issues that would raise). But if the iPhone is supervised, it should be possible, though that is likely too heavy a constraint on users (and the school). Get a first look here: https://support.apple.com/en-us/102291#:~:text=If%20your%20iPhone%20or%20iPad%20is%20supervised%2C%20the%20organization%20that,need%20to%20check%20your%20settings.
Topic:
Business & Education
SubTopic:
Device Management
Tags:
I work at a school in NYC and have a software idea that could better support the new NYC phone-ban law than current market options (i.e., Yondr pouches). Right now at my school, students and staff scan a QR code upon entering the building to indicate that they are in the building, and they scan again on the way out to indicate they've left. This is super helpful for attendance, particularly in emergency situations (fire drills, etc.). Imagine if, when students scanned their QR code, it also activated an app similar to Opal or ScreenZen, but with an admin-preset whitelist of apps. The idea is that this app would deny access by default to all apps on students' phones except the admin-whitelisted ones, such as Phone and Calculator. Depending on the age/needs of the student, other apps like Spotify or medical apps could also be whitelisted. My question is: is this idea possible to create? We would need admin preset controls to create the preset whitelist. We can't have stud
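For what it's worth, the closest public building blocks are the Family Controls and Managed Settings frameworks. A minimal sketch of shielding everything except a chosen set of apps might look like this; the class name and token set are illustrative, and these APIs have real limits on who can authorize them:

import ManagedSettings

final class ShieldController {
    private let store = ManagedSettingsStore()

    // Must be called only after obtaining Family Controls authorization,
    // e.g. via FamilyControls' AuthorizationCenter.shared.requestAuthorization(for: .child).
    func lockDown(allowing allowedApps: Set<ApplicationToken>) {
        // Shield every app category except the explicitly allowed tokens.
        store.shield.applicationCategories = .all(except: allowedApps)
    }

    func unlock() {
        // Remove the shield when the student scans out.
        store.shield.applicationCategories = nil
    }
}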
Topic:
Business & Education
SubTopic:
Device Management
Tags:
Community Management
Bundle ID
Device Management
Family Controls
Hi! Per https://developer.apple.com/support/xcode/, both Xcode 16.x and Xcode 26 run on macOS Sequoia and support device debugging for watchOS 10.6. Could you add the error message or a screenshot of the error you're seeing?
Topic:
Developer Tools & Services
SubTopic:
Xcode
Tags:
"We parse the locationID out of the AVCaptureDevice.uniqueID and then find the IORegistry node with that locationID. Is there a document that declares how AVFoundation generates the unique_id for a USB camera, so I can assume this conversion will always work?"

You can't.

"Or is there a way to send a PTZ control request to an AVCaptureDevice?"

Not that I know of. As far as I know, the only way is what you're doing.

"It looks like the unique_id provided is (locationID << 32 | VendorID << 16 | ProductID) as a hex string, but I'm not sure if I can always assume this behavior won't change."

Correct: it has changed in the past, and it might change at any time in the future. Not all AVCaptureDevices are UVC, but they all have uniqueIDs. If you would like an API to clearly identify an AVCaptureDevice in the IORegistry, please file a bug. I already did (in 2019, FB6146541).
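To make the observed (and explicitly unstable) layout concrete, here is a hedged sketch of decoding such a uniqueID; this format is an observed behavior, not a documented contract, so it can break at any time:

import AVFoundation

// Observed layout: locationID << 32 | vendorID << 16 | productID, as a hex string.
// This has changed before and may change again; treat a nil result as "unknown format".
func decodeObservedUniqueID(_ uniqueID: String) -> (locationID: UInt32, vendorID: UInt16, productID: UInt16)? {
    guard let raw = UInt64(uniqueID, radix: 16) else { return nil }
    let locationID = UInt32(truncatingIfNeeded: raw >> 32)
    let vendorID = UInt16(truncatingIfNeeded: raw >> 16)
    let productID = UInt16(truncatingIfNeeded: raw)
    return (locationID, vendorID, productID)
}

// Example usage with a capture device:
// if let id = AVCaptureDevice.default(for: .video)?.uniqueID,
//    let fields = decodeObservedUniqueID(id) {
//     print(String(format: "location 0x%08x vendor 0x%04x product 0x%04x",
//                  fields.locationID, fields.vendorID, fields.productID))
// }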
Topic:
Media Technologies
SubTopic:
Photos & Camera
Tags:
When I start a Simulator (iPhone 13 mini) with iOS 26 and activate "Use the Same Keyboard Language as macOS", it still sets the keyboard to US (my Mac keyboard is German). This makes the Mac keyboard unusable in the Simulator. It looks like a bug, because it clearly ignores the setting: when I type "@", I get "¬". Restarting the Simulator did nothing; changing the setting back and forth did nothing either. BTW: why does every single Xcode update come with a bug nowadays? I always have to spend half a day after an update fixing a problem I didn't have before. Highly frustrating.
Hi Apple Developer Community,

I'm developing an eye-tracking application in Xcode using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current Implementation
- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Implementing custom sensitivity curves and smoothing algorithms
- Applying baseline correction and coordinate mapping
- Using quadratic regression for calibration-point mapping

Issues I'm Facing

1. Calibration Mismatch
- The red dot position doesn't align with where I'm actually looking
- Significant offset between the intended gaze point and the actual cursor position
- Calibration seems to drift or become inaccurate over time

2. Extreme Eye Movement Requirements
- Need to make exaggerated eye movements to reach screen edges/corners
- Natural eye movements don't translate to proportional cursor movement
- Difficulty reaching certain
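For reference, a minimal sketch of reading the gaze-related blend shapes from a face anchor; the delegate wiring is illustrative, and the mapping to screen coordinates (the hard part under discussion) is deliberately omitted:

import ARKit

final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes
        // Each coefficient is 0.0...1.0; these four describe the left eye's direction.
        let up = shapes[.eyeLookUpLeft]?.floatValue ?? 0
        let down = shapes[.eyeLookDownLeft]?.floatValue ?? 0
        let inward = shapes[.eyeLookInLeft]?.floatValue ?? 0
        let outward = shapes[.eyeLookOutLeft]?.floatValue ?? 0
        // A naive signed gaze estimate; real calibration maps this to screen space.
        let gaze = SIMD2<Float>(outward - inward, up - down)
        print("left-eye gaze estimate: \(gaze)")
    }
}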