iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts


iOS App Crashes after install but not when running from Xcode
I have an odd issue I'm trying to troubleshoot. I have an app that is deployed to our enterprise and works on almost all of our devices. Lately, a very small number of devices install the app from our MDM and it crashes on launch; it never gets past the launch screen. If I remove and reinstall the app from the MDM, it still crashes. I put the devices in developer mode, and if I run the app from Xcode on the device it works fine. If I stop it and launch it right from the device, it also works. But if I do a final build of the app and install it using "Devices and Simulators", the app crashes on launch. Using "Devices and Simulators" I check for a crash log for the app, and there is no file. I am stumped as to what's going on.
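One hedged idea for getting crash data off devices like these when "Devices and Simulators" shows nothing: register a MetricKit subscriber so the app collects its own crash diagnostics, which the system delivers on a later successful launch (for example, the working Xcode run). This is a sketch, not a confirmed fix; the log destination is an assumption — write it wherever your MDM or backend can retrieve it.

import MetricKit
import Foundation

// Collects crash diagnostics that the system delivers on the launch
// after a crash. Works on any device, no Xcode connection required.
final class CrashDiagnosticCollector: NSObject, MXMetricManagerSubscriber {
    func start() {
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXDiagnosticPayload]) {
        for payload in payloads {
            guard let crashes = payload.crashDiagnostics else { continue }
            for crash in crashes {
                // jsonRepresentation() contains the call-stack tree;
                // persist it somewhere your MDM or backend can read.
                try? crash.jsonRepresentation().write(to: logFileURL())
            }
        }
    }

    private func logFileURL() -> URL {
        // Hypothetical destination inside the app's Documents directory.
        FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("crash-\(Date().timeIntervalSince1970).json")
    }
}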
Replies: 7 · Boosts: 0 · Views: 216 · Activity: Oct ’25
Unable to write to file system when building for My Mac (Designed for iPad)
Our app is unable to write to its own sandbox container on macOS when run via “My Mac (Designed for iPad)”. This is not an issue when the app runs on iPhone or iPad. It seems to affect all attempts to write to the file system, including:

- UserDefaults
- Core Data (SQLite)
- Firebase (Analytics, Crashlytics, Sessions)
- File creation (PDFs, temp files, etc.)

We're seeing the following errors in the console:

- Operation not permitted / NSCocoaErrorDomain Code=513: permissions error when writing to disk.
- CFPrefsPlistSource: Path not accessible: failure to write to UserDefaults.
- Cannot synchronize user defaults to disk: UserDefaults write blocked.
- CoreData: No permissions to create file: the Core Data SQLite store can't be created.
- Firebase: Failed to open database: Firebase can't initialize local storage.
- CGDataConsumerCreateWithFilename: failed to open ... for writing: PDF generation fails due to temp-directory access issues.

I created a test project to try to reproduce the issue, but was unable to, even after setting all the build settings to match the project that has the problem.
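A minimal probe (my sketch, not from the original post) that can help narrow down whether the whole container or only specific directories are blocked:

import Foundation

// Attempts a small write in each standard sandbox location and prints
// the outcome, so you can see exactly where writes fail.
func probeSandboxWritability() {
    let candidates: [(String, URL)] = [
        ("Documents", FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]),
        ("Caches", FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]),
        ("Temporary", FileManager.default.temporaryDirectory),
    ]
    for (name, dir) in candidates {
        let url = dir.appendingPathComponent("write-probe.txt")
        do {
            try "probe".write(to: url, atomically: true, encoding: .utf8)
            try FileManager.default.removeItem(at: url)
            print("\(name): writable (\(dir.path))")
        } catch {
            print("\(name): FAILED - \(error)")
        }
    }
}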
Replies: 2 · Boosts: 0 · Views: 256 · Activity: Oct ’25
Navigation Bar Occupies Too Much Space in iOS 26 Landscape Orientation
I’m really frustrated with iOS 26. It was supposed to make better use of screen space, but when you combine the navigation bar, tab bar, and search bar, they eat up way too much room. Apple actually did a great job with the new tab bar — it’s smaller, smooth, and looks great when expanding or collapsing while scrolling. The way the search bar appears above the keyboard is also really nice. But why did they keep the navigation bar the same height in both portrait and landscape? In landscape it takes up too much space and just looks bad. It was way better in iOS 18.
Replies: 2 · Boosts: 0 · Views: 216 · Activity: Oct ’25
Focusable doesn't work on iPad with external keyboard
I have a custom input view in my app which is .focusable(). It behaves like a TextField in that it must be focused in order to be used. This works fine on all platforms including iPad, except when an external keyboard is connected (Magic Keyboard), in which case it can't be focused anymore and becomes unusable. Is there a solution to this, or a workaround? My view is very complex, so simple solutions like replacing it with a native view aren't possible, and I must be able to programmatically force it to focus. Here's a very basic example replicating my issue. None of the functionality works when a keyboard is connected:

struct FocusableTestView: View {
    @FocusState private var isRectFocused: Bool

    var body: some View {
        VStack {
            // This text field should focus the custom input when pressing return:
            TextField("Enter text", text: .constant(""))
                .textFieldStyle(.roundedBorder)
                .onSubmit { isRectFocused = true }
                .onKeyPress(.return) {
                    isRectFocused = true
                    return .handled
                }

            // This custom "input" should focus itself when tapped:
            Rectangle()
                .fill(isRectFocused ? Color.accentColor : Color.gray.opacity(0.3))
                .frame(width: 100, height: 100)
                .overlay(
                    Text(isRectFocused ? "Focused" : "Tap me")
                )
                .focusable(true, interactions: .edit)
                .focused($isRectFocused)
                .onTapGesture {
                    isRectFocused = true
                    print("Focused rectangle")
                }

            // The focus should be able to be controlled externally:
            Button("Toggle Focus") {
                isRectFocused.toggle()
            }
            .buttonStyle(.bordered)
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .center)
    }
}
Replies: 1 · Boosts: 0 · Views: 193 · Activity: Oct ’25
[iOS 26] CLHeading's magneticHeading and trueHeading return travel direction instead of device orientation when user is in motion
Problem Description

I am using CLLocationManager to obtain the device's compass heading, and I have encountered abnormal behavior.

When the user is stationary: after calling startUpdatingHeading(), the CLHeading object returned in the locationManager(_:didUpdateHeading:) callback correctly reflects the device's actual physical orientation (i.e., the direction the top of the device is pointing) relative to magnetic north / true north, via the magneticHeading and trueHeading properties. When I rotate the device, the heading values change accordingly; this is the expected behavior.

But when the user is in motion (e.g., driving a car): even if I rotate the device, the values of magneticHeading and trueHeading no longer reflect the device's actual orientation. Instead, they consistently return what appears to be the user's or vehicle's travel direction (forward direction). In other words, the compass behaves as if it is reporting the direction of motion rather than the direction the device is actually facing. Only after the user has completely stopped moving does rotating the device again result in magneticHeading and trueHeading reflecting the actual device orientation as expected. On another device running iOS 16 (iPhone XR), this behavior does not occur; everything works normally.

Expected Behavior

Regardless of whether the user is moving, the CLHeading values returned by CLLocationManager should always represent the physical orientation of the device itself (i.e., which direction the top of the device is pointing), as a standard compass would.

Actual Behavior

- User is stationary, rotating the device: magneticHeading / trueHeading change properly according to the device's actual orientation.
- User is in motion (e.g., driving): magneticHeading / trueHeading remain fixed to the direction of motion (travel direction) and do not change when the device is rotated.
- User stops moving, then rotates the device: the compass behaves normally again, reflecting the actual device orientation.

Environment Information

- iOS Version: iOS 26.0.1
- Device Models: iPhone 15 Pro / iPhone 17 Pro
- Xcode Version: Xcode 26.0.1
- Language: Objective-C

Questions

- Is this a known issue in iOS? Are there any related radars or official documentation about it?
- Have other developers encountered similar issues, especially where CLHeading behaves incorrectly while the user is in motion?
- Do I need to set any specific parameters on CLLocationManager (such as headingOrientation) to resolve or work around this issue?

🙏 Thank you for your help; any insights, experiences, or official feedback regarding this issue would be greatly appreciated!
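For reference, a minimal heading setup — a sketch in Swift rather than the poster's Objective-C. The delegate wiring is the standard CoreLocation pattern, and headingOrientation is the parameter the post asks about; whether any value of it avoids the travel-direction substitution seen while moving is exactly the open question.

import CoreLocation

final class HeadingReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // Tells CoreLocation which device edge counts as "up" for heading math.
        manager.headingOrientation = .portrait
        manager.headingFilter = 1 // degrees of change before a new callback
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // Expected: the direction the top of the device is pointing.
        print("magnetic: \(newHeading.magneticHeading), true: \(newHeading.trueHeading)")
    }
}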
Replies: 0 · Boosts: 0 · Views: 383 · Activity: Oct ’25
Metal recommendedMaxWorkingSetSize vs actual RAM on iPhone (LLM load fails)
Context: I’m deploying large language models on iPhone using llama.cpp. A new iPhone Air (12 GB RAM) reports a Metal MTLDevice.recommendedMaxWorkingSetSize of 8,192 MB, and my attempt to load Llama-2-13B Q4_K (~7.32 GB of weights) fails during model initialization.

Environment:
- Device: iPhone Air (12 GB RAM)
- iOS: 26
- Xcode: 26.0.1
- Build: llama.cpp with the Metal backend enabled
- App runs on device (not Simulator)

What I’m seeing:
- MTLCreateSystemDefaultDevice().recommendedMaxWorkingSetSize == 8192 MiB
- Loading Llama-2-13B Q4_K (7.32 GB) fails to complete. Logs indicate memory pressure / allocation issues consistent with the 8 GB working-set guidance.
- Smaller models (e.g., 7B/8B with similar quantization) load and run; 8B Q4_K decodes at around 9 tokens/second.

Questions:
- Is 8,192 MB an expected recommendedMaxWorkingSetSize on a 12 GB iPhone?
- What values should I expect on other 2025 devices, including iPhone 17 (8 GB RAM) and iPhone 17 Pro (12 GB RAM)?
- Is the limit strictly enforced for Metal allocations (heaps/buffers), or advisory for best performance/eviction behavior?
- Can a process practically exceed it for long-lived buffers without immediate Jetsam risk?
- Any guidance for LLM scenarios near the limit?
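A hedged sketch of the pre-load check implied by the question: query the advertised working-set budget and gate the model load on it. The 80% headroom factor is an arbitrary safety margin of mine, not Apple guidance, and the 13B size is just this thread's example.

import Metal

// Checks the advertised GPU working-set budget before attempting a load.
func canFitModel(ofSize modelBytes: UInt64) -> Bool {
    guard let device = MTLCreateSystemDefaultDevice() else { return false }
    let budget = device.recommendedMaxWorkingSetSize // in bytes
    print("recommendedMaxWorkingSetSize: \(budget / 1_048_576) MiB")
    // Leave headroom for the KV cache, compute buffers, and the rest
    // of the process.
    return modelBytes < UInt64(Double(budget) * 0.8)
}

let thirteenB: UInt64 = 7_320_000_000 // ~7.32 GB of Q4_K weights
print(canFitModel(ofSize: thirteenB) ? "within budget" : "likely to fail")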
Replies: 0 · Boosts: 0 · Views: 612 · Activity: Oct ’25
Could not launch “App” (Xcode 26.0 + iPhone 6, iOS 12)
Problem details: Could not launch “App”. Reproduction route: install Xcode 26.0 > connect an iPhone 6 (iOS 12) > run the app. We tried the following solution, but it didn't work: to make Xcode 26 recognize and run apps on an iOS 12 physical device, you can manually add the missing device support files by going to /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/ on your Mac, where you’ll see folders like 17.0 or 18.0; download the matching iOS 12 folder (for example, 12.4) from the community-maintained repository.
Replies: 1 · Boosts: 0 · Views: 184 · Activity: Oct ’25
iOS App Exits after launch
Hello, my iOS apps are exiting right after launch on a few of our iOS devices. I tried a couple of my apps that are deployed to our fleet, and they do the same thing. If I run the app(s) in the Simulator they work fine, and if I run the app(s) from Xcode on the offending devices they work fine as well. But once I stop the run in Xcode, the app on the device will not launch. I'm thinking something is missing, like a certificate; I'm just not sure. Any ideas on how to troubleshoot this? I would really like to get this fixed.
Replies: 3 · Boosts: 0 · Views: 411 · Activity: Oct ’25
Bluetooth connection unexpectedly timing out with macOS Sequoia
After the macOS Sequoia update, my app seems to have an issue with Bluetooth communication between macOS and iOS. It uses CoreBluetooth for Central-Peripheral communication.

Setup: The iPhone (in my case an iPhone 14 Pro with iOS 18.0 (22A3354)) acts as the Central, and the Mac (in my case a 14" MacBook Pro 2023 with macOS 15.0 (24A335)) as the Peripheral. I've implemented a mechanism where the Central (iPhone) sends a message to the Peripheral (Mac) every 15 seconds to keep the connection alive (because it needs to wait for notify characteristic updates). I never noticed this kind of issue before, but with macOS Sequoia it happens constantly.

Issue: The connection drops unexpectedly after a period of time (sometimes 20 seconds, sometimes a few minutes) with CBErrorDomain code 6: "The connection has timed out unexpectedly."

Sample code:
- Peripheral (Mac): ContentView (Peripheral).txt, ContentViewModel (Peripheral).txt
- Central (iPhone): ContentView (Central).txt, ContentViewModel (Central).txt

Reproduce: I attached sample code including the Central sample (for iPhone) and the Peripheral sample (for Mac).
- Run the Peripheral sample (after granting Bluetooth permissions).
- Run the Central sample and select the Mac device in the list.
- After selecting, it should connect, discover the service and characteristic, and start writing messages to it.
- After some time, centralManager(_:didDisconnectPeripheral:error:) should get called with the "timed out unexpectedly" error.

Could anyone please look into this issue and advise on whether there's a known bug or any workaround? Any guidance would be greatly appreciated, as this impacts the stability of Bluetooth communication between the devices. Thanks in advance.

Logs: I also ran Console.app during this issue, which captured these errors (if helpful): console_logs.txt
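Since the attached sample files aren't visible here, this is a generic reconstruction of the keep-alive the poster describes, under my own assumptions: the Central writes a small payload to a writable characteristic every 15 seconds, where the peripheral and characteristic come from your own discovery code.

import CoreBluetooth

final class KeepAlive {
    private var timer: Timer?

    func start(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        timer = Timer.scheduledTimer(withTimeInterval: 15, repeats: true) { _ in
            let ping = Data("ping".utf8)
            // .withResponse gives you a didWriteValueFor callback, and an
            // error if the link has silently died.
            peripheral.writeValue(ping, for: characteristic, type: .withResponse)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}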
Replies: 6 · Boosts: 4 · Views: 3.4k · Activity: Oct ’25
UIViewController memory leak with modal presentedViewController
Hi everyone, I'm encountering unexpected behavior with modal presentations in UIKit. Here's what happens:

- I have UIViewControllerA (let's call it the "orange" VC) pushed onto a UINavigationController stack.
- I present UIViewControllerB (the "red" VC, inside its own UINavigationController as a .formSheet) modally over UIViewControllerA.
- After a short delay, I pop UIViewControllerA from the navigation stack.

Issue: After popping UIViewControllerA, the modal UIViewControllerB remains visible on screen and in memory. I expected that popping the presenting view controller would also dismiss the modal, but it stays.

Expected behavior: When UIViewControllerA (orange) is popped, the modal UIViewControllerB (red) should be dismissed as well.

Actual behavior: The modal UIViewControllerB remains on screen and is not dismissed, even though its presenting view controller has been removed from the navigation stack.

Video example: https://youtube.com/shorts/sttbd6p_r_c

Question: Is this the expected behavior? If so, what is the recommended way to ensure that the modal is dismissed when its presenting view controller is removed from the navigation stack?

Code snippet:

class MainVC: UIViewController {
    private weak var orangeVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .blue
        let dq = DispatchQueue.main
        dq.asyncAfter(deadline: .now() + 1) { [weak self] in
            let vc1 = UIViewController()
            vc1.view.backgroundColor = .orange
            vc1.modalPresentationStyle = .overCurrentContext
            self?.navigationController?.pushViewController(vc1, animated: true)
            self?.orangeVC = vc1

            dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                let vc2 = UIViewController()
                vc2.view.backgroundColor = .red
                vc2.modalPresentationStyle = .formSheet
                vc2.isModalInPresentation = true
                let nav = UINavigationController(rootViewController: vc2)
                if let sheet = nav.sheetPresentationController {
                    sheet.detents = [.medium()]
                }
                self?.orangeVC?.present(nav, animated: true)

                dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                    self?.navigationController?.popViewController(animated: true)
                }
            }
        }
    }
}

Thank you for your help!
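If the goal is simply to guarantee the sheet goes away, one hedged workaround sketch (my suggestion, not documented UIKit behavior): have the orange controller dismiss its own presented sheet when it detects that it is being popped.

import UIKit

// Hypothetical subclass for the "orange" controller: when it's popped
// off the navigation stack, it takes its presented sheet down with it.
class OrangeViewController: UIViewController {
    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        // isMovingFromParent is true when the controller is being popped,
        // as opposed to merely being covered by another screen.
        if isMovingFromParent, presentedViewController != nil {
            dismiss(animated: false)
        }
    }
}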
Replies: 0 · Boosts: 0 · Views: 146 · Activity: Oct ’25
application(_:didFinishLaunchingWithOptions:) launchOptions is nil when app supports scenes
Our app supports silent push, and we use the code below to determine whether the app was launched by a silent push:

if let remoteNotification = launchOptions?[UIApplication.LaunchOptionsKey.remoteNotification] as? [AnyHashable: Any],
   let aps = remoteNotification["aps"] as? [AnyHashable: Any],
   let contentAvailable = aps["content-available"] as? Int,
   contentAvailable == 1 {
    isSilentNotification = true
}

This runs when the app launches and calls application(_:didFinishLaunchingWithOptions:). But after migrating to UIScene, launchOptions is always nil, and we cannot tell whether the app was launched by a silent push. The developer documentation says: "If the app supports scenes, this is nil. Continue using UIApplicationDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) to process silent remote notifications after scene connection." But application(_:didReceiveRemoteNotification:fetchCompletionHandler:) is called too late. So, apart from that method, is there any other way to obtain the silent-push flag at launch, before the didReceiveRemoteNotification callback has fired?
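For reference, a sketch of the path the documentation quoted above points to. The flag name is the poster's; this does not solve the timing problem, it just shows where the scene-based equivalent of the launchOptions check lives.

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    var isSilentNotification = false

    // With scenes, silent pushes arrive here instead of in launchOptions.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        if let aps = userInfo["aps"] as? [AnyHashable: Any],
           let contentAvailable = aps["content-available"] as? Int,
           contentAvailable == 1 {
            isSilentNotification = true
        }
        completionHandler(.newData)
    }
}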
Replies: 1 · Boosts: 0 · Views: 144 · Activity: Oct ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

It particularly affects:
- Third-party speech synthesis voices (CereProc, Grammatek, etc.)
- Apps relying on automatic voice selection based on user preferences

Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
- AVSpeechSynthesisVoice(language:)
- AVSpeechUtterance auto-selection
- Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB20271264. Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
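A partial-workaround sketch: it can't read the user's Accessibility selection (the missing piece), but it does let an app enumerate installed voices, including third-party ones, and offer its own picker instead of relying on the broken default lookup.

import AVFoundation

// Lists every installed voice for a language.
func voices(for language: String) -> [AVSpeechSynthesisVoice] {
    AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.language == language }
}

let enGB = voices(for: "en-GB")
for v in enGB {
    print(v.name, v.identifier, v.quality.rawValue)
}

// Speak with an explicitly chosen voice once the user picks one in-app:
if let chosen = enGB.first {
    let utterance = AVSpeechUtterance(string: "Hello")
    utterance.voice = chosen
    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(utterance)
}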
Replies: 4 · Boosts: 1 · Views: 564 · Activity: Oct ’25
In iOS 26, UITableView can't display cells, but cells can be selected!
We have received several reports that our app cannot display UITableView cells in iOS 26, but users say they can still select cells with a single tap, and the table view's didSelectRowAt delegate method responds. I have reported a feedback but got no response. Does anyone see the same bug? The page is blank, and a video a user sent me proves that he can select a cell with a tap gesture. We cannot reproduce the bug and don't know how to fix it. We think this is a bug in iOS 26, so we're here for help. This bug blocks the distribution of our new version (supporting iOS 26). This is the feedback: https://feedbackassistant.apple.com/feedback/20677046
Replies: 1 · Boosts: 0 · Views: 199 · Activity: Oct ’25
How to hide the supplementary column alone in a three-column split view
We are using a three-column split view as the root of our app and want to hide the supplementary column alone in some cases, so it behaves like a two-column split view. With the existing APIs we are unable to achieve this, since they hide the primary column as well and don't give the expected results:

.hide(.supplementary)
setViewController(nil, for: .supplementary)

But we've seen this behavior in the native Notes app when switching between the View as List and View as Gallery options. Is there any way to achieve this while keeping the three-column split view itself?
Replies: 1 · Boosts: 0 · Views: 181 · Activity: Oct ’25
Displaying and working with Favorites in iOS app
New to iOS development, and I've been trying to make heads or tails of the documentation. I know there is a difference between the data fields returned for songs from the user library and from the catalog, but whenever I search the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, but it seems I'll have to cross-query the catalog entry using song.catalogID or song.isrc; when I try to use them, I can't find a cross-reference between the two. I'm totally turned around. I'm also trying to determine whether a song in my library has been favorited. isFavorited (or something similar) doesn't seem to be a thing. I'm using the code below and trying to find a way to display a solid star if the song has been favorited, or an empty one if it hasn't. It seems like a basic request, but I can't find anything on how to do it. I've searched the docs, googled, and experimented. Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is? I know isFavorited isn't a thing; I'm just using it here so you can see my intention:

HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
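A hedged sketch of the cross-query idea described above, using MusicKit's catalog filter on ISRC. It assumes the library Song exposes an isrc value; it addresses the releaseDate question only, and the favorites question remains open.

import MusicKit

// Given a library song, fetch its catalog counterpart by ISRC so that
// catalog-only fields such as releaseDate become available.
func catalogReleaseDate(for librarySong: Song) async throws -> Date? {
    guard let isrc = librarySong.isrc else { return nil }
    var request = MusicCatalogResourceRequest<Song>(matching: \.isrc, equalTo: isrc)
    request.limit = 1
    let response = try await request.response()
    return response.items.first?.releaseDate
}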
Replies: 1 · Boosts: 0 · Views: 228 · Activity: Oct ’25
iOS Swift: run screen recording programmatically
Is it possible to start screen recording (as in Control Center) without a user prompt? I mean: ask for user permission the first time, and after that start and stop recording programmatically only? I need to record the screen only for specific events.
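As far as I know, there is no public API to start the system-wide Control Center recording; in-app capture via ReplayKit is the closest sanctioned route. A hedged sketch: this records only the app's own UI, and the system may still show a consent alert when recording starts, so "no prompt ever" may not be achievable.

import ReplayKit

final class EventRecorder {
    private let recorder = RPScreenRecorder.shared()

    func start() {
        guard recorder.isAvailable, !recorder.isRecording else { return }
        recorder.startRecording { error in
            if let error { print("start failed: \(error)") }
        }
    }

    func stop(completion: @escaping (RPPreviewViewController?) -> Void) {
        recorder.stopRecording { previewController, error in
            if let error { print("stop failed: \(error)") }
            // Present previewController to let the user save or share.
            completion(previewController)
        }
    }
}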
Replies: 5 · Boosts: 1 · Views: 5.9k · Activity: Oct ’25
Can't submit the test version
When I try to add the external test group to the current version, there is an error processing the request.
Replies: 3 · Boosts: 0 · Views: 289 · Activity: Oct ’25
The keyboard does not update as the device rotates
In our project we are now using UIScene. When we tap a text input to bring up the keyboard and then rotate the device, the keyboard does not update its layout. How can we fix this issue?
Topic: UI Frameworks · SubTopic: UIKit
Replies: 1 · Boosts: 0 · Views: 126 · Activity: Oct ’25
Clarification Needed: Using action=write-review outside of the app
Hello, is it allowed to include the action=write-review URL parameter in customer support emails to direct users to the App Store review page? Example:

https://apps.apple.com/app/id[APP_ID]?action=write-review

I want to make it easy for customers to leave feedback after positive support interactions, but only if it's compliant.
Replies: 0 · Boosts: 0 · Views: 132 · Activity: Oct ’25
iOS 26.1 Beta 4 MDM Camera Restriction Bypassed by Third-Party Apps
In iOS 26.1 beta 4, under MDM restrictions that disable the camera via a configuration profile, the Camera and FaceTime apps are hidden as expected. However, other third-party apps can still access and use the camera function normally. This is unreasonable.
Replies: 1 · Boosts: 1 · Views: 796 · Activity: Oct ’25