iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts

Post | Replies | Boosts | Views | Activity

Why are system reserved files consuming half of my storage?
I am constantly running out of storage on my iPhone 16 Pro. I keep having to move my photos and videos to my laptop and delete them from my phone, and I'm constantly needing to offload apps and manually clear caches in some apps to free up storage. I finally got sick of repeating this cycle every two weeks, so I looked into it more closely. I'm finding that iOS consumes 32 GB, and another "system reserve" category is consuming an additional 23 GB, meaning the system reserved files are consuming half of the storage on this phone and effectively making it a 64 GB model. I understand the system needs to consume some capacity for itself and that iOS is getting larger, but nearly 50% of the capacity of the phone is insane.

Looking closer into the categories, I see that iOS has taken it upon itself to permanently provision 10% of the storage capacity as reserved update space. That is already another instance of "why am I losing so much of my functional capacity to an occasional process?", but I could understand the utility of it, if I didn't still have to offload basically all my apps every single time I run a software update, because I'm still short by some not-insignificant amount. I seem to recall needing between 6 and 20 GB across the different updates I've had to do since iOS 26 rolled around. To be clear, preprovisioning storage space for updates isn't a bad idea; just give us an off switch if we'd rather take a few hundred more photos, keep another few apps, etc., than have the space sit mostly unused.

The biggest culprit is the "System Data" category, which is somehow consuming as much space as the entire operating system and its extensions. There's no clear way to ask iOS to clear this down if some of it is temporary data, and we should have a button for that even if Apple thinks it should "just work." Windows usually trims down its temp files, but on the occasion you go looking and find 67 GB of temporary files, being able to manually run the disk cleanup tool is very helpful. I'm hesitant to try any third-party app because I shouldn't need to, and knowing Apple, it wouldn't have access to anything it would actually need to touch anyway. Which is neither here nor there, but give us a button to clear caches, or maybe run the cleanup when the phone reboots?

I am running the developer beta right now, so maybe that's part of it. I'm not sure, though: I switched to the mainline release for a while when it shipped, and it didn't seem any different in storage consumption or battery drain. I jumped back to the beta to see some of the new features and am waiting for another mainline release to switch back to, as the recent betas have been much more unstable and buggy than the entire prerelease beta period. Just wondering if anyone has any input on this storage issue in particular, as it hasn't been talked about as much as the battery drain issue from what I can see.
5
0
311
Oct ’25
iOS App Crashes after install but not when running from Xcode
I have an odd issue I'm trying to troubleshoot. I have an app that is deployed to our enterprise and works on almost all of our devices. Lately, I have a very small number of devices where the app installs from our MDM but crashes on launch; it does not get past the launch screen. If I remove the app and reinstall it from the MDM, it still crashes. I put the devices in developer mode, and if I run the app from Xcode on the device it works fine. If I stop it and then launch it directly from the device, it also works. But if I do a final build of the app and install it using "Devices and Simulators", the app crashes on launch. Using "Devices and Simulators" I check for a crash log for the app, and there is no file. I am stumped as to what's going on.
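Since Devices and Simulators shows no crash report, one low-effort diagnostic (a sketch only, assuming the failure is an uncaught NSException thrown early in launch; the function name is illustrative, and this will not catch signal crashes or watchdog kills) is to persist uncaught exceptions from inside the app itself:

```swift
import Foundation

// Diagnostic sketch only: persist uncaught exceptions so a crash that leaves no
// report in Devices and Simulators can still be inspected after relaunch.
// This catches NSException-based crashes only, not signals or watchdog kills.
func installUncaughtExceptionLogger() {
    NSSetUncaughtExceptionHandler { exception in
        let report = """
        \(exception.name.rawValue): \(exception.reason ?? "no reason")
        \(exception.callStackSymbols.joined(separator: "\n"))
        """
        let url = FileManager.default
            .urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("last_crash.txt")
        try? report.write(to: url, atomically: true, encoding: .utf8)
    }
}
```

Calling this at the very start of application(_:didFinishLaunchingWithOptions:) means that, after the next crash, last_crash.txt can be pulled from the app's Documents container and inspected.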
7
0
165
Oct ’25
Unable to write to file system when building for My Mac (Designed for iPad)
Our app is unable to write to its own sandbox container on macOS when run via "My Mac (Designed for iPad)". This is not an issue when the app runs on iPhone or on iPad. It seems to affect all attempts to write to the file system, including:

- UserDefaults
- Core Data (SQLite)
- Firebase (Analytics, Crashlytics, Sessions)
- File creation (PDFs, temp files, etc.)

We're seeing the following errors in the console:

- Operation not permitted / NSCocoaErrorDomain Code=513: permissions error when writing to disk.
- CFPrefsPlistSource: Path not accessible: failure to write to UserDefaults.
- Cannot synchronize user defaults to disk: UserDefaults write blocked.
- CoreData: No permissions to create file: Core Data SQLite store can't be created.
- Firebase: Failed to open database: Firebase can't initialize local storage.
- CGDataConsumerCreateWithFilename: failed to open ... for writing: PDF generation fails due to temp directory access issues.

We created a test project to try to reproduce the issue, but were unable to do so, even after setting all the build settings to match the project that has the problem.
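One way to narrow this down might be to probe the standard sandbox locations at launch and log which ones actually reject writes. This is only a diagnostic sketch (the helper name and probe file are invented for illustration, not taken from the project above):

```swift
import Foundation

// Diagnostic sketch: log the sandbox container paths and attempt a trivial write
// to each, to see which locations reject writes under "My Mac (Designed for iPad)".
func probeSandboxWrites() {
    let fm = FileManager.default
    let candidates: [(String, URL)] = [
        ("documents", fm.urls(for: .documentDirectory, in: .userDomainMask)[0]),
        ("appSupport", fm.urls(for: .applicationSupportDirectory, in: .userDomainMask)[0]),
        ("caches", fm.urls(for: .cachesDirectory, in: .userDomainMask)[0]),
        ("temp", fm.temporaryDirectory),
    ]
    for (name, dir) in candidates {
        let file = dir.appendingPathComponent("write_probe.txt")
        do {
            try fm.createDirectory(at: dir, withIntermediateDirectories: true)
            try "probe".write(to: file, atomically: true, encoding: .utf8)
            print("\(name): write OK at \(dir.path)")
        } catch {
            print("\(name): write FAILED at \(dir.path) - \(error)")
        }
    }
}
```

Comparing the logged container paths between an iPad run and a My Mac (Designed for iPad) run may also reveal whether the container being resolved is the expected one.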
2
0
152
Oct ’25
Navigation Bar Occupies Too Much Space in iOS 26 Landscape Orientation
I’m really frustrated with iOS 26. It was supposed to make better use of screen space, but when you combine the navigation bar, tab bar, and search bar, they eat up way too much room. Apple actually did a great job with the new tab bar — it’s smaller, smooth, and looks great when expanding or collapsing while scrolling. The way the search bar appears above the keyboard is also really nice. But why did they keep the navigation bar the same height in both portrait and landscape? In landscape it takes up too much space and just looks bad. It was way better in iOS 18.
2
0
178
Oct ’25
Focusable doesn't work on iPad with external keyboard
I have a custom input view in my app which is .focusable(). It behaves similarly to a TextField, in that it must be focused in order to be used. This works fine on all platforms including iPad, except when an external keyboard is connected (Magic Keyboard), in which case it can't be focused anymore and becomes unusable. Is there a solution to this, or a workaround? My view is very complex, so simple solutions like replacing it with a native view aren't possible, and I must be able to programmatically force it to focus. Here's a very basic example replicating my issue. None of the functionality works when a keyboard is connected:

```swift
import SwiftUI

struct FocusableTestView: View {
    @FocusState private var isRectFocused: Bool

    var body: some View {
        VStack {
            // This text field should focus the custom input when pressing return:
            TextField("Enter text", text: .constant(""))
                .textFieldStyle(.roundedBorder)
                .onSubmit {
                    isRectFocused = true
                }
                .onKeyPress(.return) {
                    isRectFocused = true
                    return .handled
                }

            // This custom "input" should focus itself when tapped:
            Rectangle()
                .fill(isRectFocused ? Color.accentColor : Color.gray.opacity(0.3))
                .frame(width: 100, height: 100)
                .overlay(
                    Text(isRectFocused ? "Focused" : "Tap me")
                )
                .focusable(true, interactions: .edit)
                .focused($isRectFocused)
                .onTapGesture {
                    isRectFocused = true
                    print("Focused rectangle")
                }

            // The focus should be able to be controlled externally:
            Button("Toggle Focus") {
                isRectFocused.toggle()
            }
            .buttonStyle(.bordered)
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .center)
    }
}
```
1
0
151
Oct ’25
[iOS 26] CLHeading's magneticHeading and trueHeading return travel direction instead of device orientation when user is in motion
Problem Description

I am using CLLocationManager to obtain the device's compass heading, and I have encountered abnormal behavior.

When the user is stationary: after calling startUpdatingHeading(), the CLHeading object returned in the locationManager(_:didUpdateHeading:) callback correctly reflects the device's actual physical orientation (i.e., the direction the top of the device is pointing) relative to magnetic north / true north, via the magneticHeading and trueHeading properties. When I rotate the device, the heading values change accordingly; this is the expected behavior.

But when the user is in motion (e.g., driving a car): even if I rotate the device, the values of magneticHeading and trueHeading no longer reflect the device's actual orientation. Instead, they consistently return what appears to be the user's or vehicle's travel direction (forward direction). In other words, the compass behaves as if it is reporting the direction of motion rather than the device's actual facing direction. Only after the user has completely stopped moving does rotating the device again result in magneticHeading and trueHeading reflecting the actual device orientation as expected. On another device running iOS 16 (iPhone XR), this behavior does not occur; everything works normally.

Expected Behavior

I expect that regardless of whether the user is moving, the CLHeading values returned by CLLocationManager should always represent the physical orientation of the device itself (i.e., which direction the top of the device is pointing), as a standard compass should.

Actual Behavior

- User is stationary, rotating the device: magneticHeading / trueHeading change properly according to the device's actual orientation.
- User is in motion (e.g., driving): magneticHeading / trueHeading remain fixed to the direction of motion (travel direction) and do not change when the device is rotated.
- User stops moving, then rotates the device: the compass behaves normally again, reflecting the actual device orientation.

Environment Information

- iOS Version: iOS 26.0.1
- Device Models: iPhone 15 Pro / iPhone 17 Pro
- Xcode Version: Xcode 26.0.1
- Language: Objective-C

Questions

- Is this a known issue in iOS? Are there any related radars or official documentation about it?
- Have other developers encountered similar issues, especially where CLHeading behaves incorrectly when the user is in motion?
- Do I need to set any specific parameters in CLLocationManager (such as headingOrientation) to resolve or work around this issue?

🙏 Thank you for your help; any insights, experiences, or official feedback regarding this issue would be greatly appreciated!
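For reference, a minimal Swift sketch of the setup being described (the actual project is Objective-C, and the class name here is illustrative only); the expectation is that didUpdateHeading keeps reporting device orientation rather than course over ground while driving:

```swift
import CoreLocation

// Minimal sketch of the heading setup described above.
final class HeadingObserver: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        // headingOrientation only tells Core Location which device edge counts as "up";
        // it is not expected to switch between device heading and travel direction.
        manager.headingOrientation = .portrait
        manager.startUpdatingLocation()   // location updates are needed for trueHeading to be valid
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // Expected: the direction the top of the device is pointing, even while driving.
        print("magnetic: \(newHeading.magneticHeading), true: \(newHeading.trueHeading), accuracy: \(newHeading.headingAccuracy)")
    }
}
```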
0
0
290
Oct ’25
Metal recommendedMaxWorkingSetSize vs actual RAM on iPhone (LLM load fails)
Context

I'm deploying large language models on iPhone using llama.cpp. A new iPhone Air (12 GB RAM) reports a Metal MTLDevice.recommendedMaxWorkingSetSize of 8,192 MB, and my attempt to load Llama-2-13B Q4_K (~7.32 GB of weights) fails during model initialization.

Environment

- Device: iPhone Air (12 GB RAM)
- iOS: 26
- Xcode: 26.0.1
- Build: llama.cpp with the Metal backend enabled
- App runs on device (not the Simulator)

What I'm seeing

- MTLCreateSystemDefaultDevice().recommendedMaxWorkingSetSize == 8192 MiB
- Loading Llama-2-13B Q4_K (7.32 GB) fails to complete. Logs indicate memory pressure / allocation issues consistent with the 8 GB working-set guidance.
- Smaller models (e.g., 7B/8B with similar quantization) load and run (8B Q4_K gives around 9 tokens/second decoding speed).

Questions

- Is 8,192 MB an expected recommendedMaxWorkingSetSize on a 12 GB iPhone? What values should I expect on other 2025 devices, including iPhone 17 (8 GB RAM) and iPhone 17 Pro (12 GB RAM)?
- Is it strictly enforced for Metal allocations (heaps/buffers), or advisory for best performance/eviction behavior?
- Can a process practically exceed it for long-lived buffers without immediate Jetsam risk?
- Any guidance for LLM scenarios near the limit?
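For reference, a small sketch of the budget check implied above, under the assumption that recommendedMaxWorkingSetSize is an advisory figure to compare against rather than a documented hard cap (which is exactly the open question):

```swift
import Metal

// Sketch under stated assumptions: treat recommendedMaxWorkingSetSize as an
// advisory budget and check a planned allocation against it before loading.
func weightsLikelyFit(_ weightBytes: UInt64) -> Bool {
    guard let device = MTLCreateSystemDefaultDevice() else { return false }
    let budget = device.recommendedMaxWorkingSetSize      // 8_192 MiB on the iPhone Air above
    let allocated = UInt64(device.currentAllocatedSize)   // Metal memory this process already holds
    print("budget: \(budget / 1_048_576) MiB, allocated: \(allocated / 1_048_576) MiB")
    return allocated + weightBytes < budget
}
```

Whether allocations beyond the returned value are rejected, evicted, or simply raise Jetsam risk is what the questions above are asking.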
0
0
418
Oct ’25
Could not launch "App" (Xcode 26.0 + iPhone 6 (iOS 12))
Problem details: Could not launch "App".

Reproduction route: install Xcode 26.0 > connect an iPhone 6 (iOS 12) > run the app.

We tried the following solution, but it didn't work: to make Xcode 26 recognize and run apps on an iOS 12 physical device, you can manually add the missing device support files by going to /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/ on your Mac, where you'll see folders like 17.0 or 18.0, and adding the matching iOS 12 folder (for example, 12.4) downloaded from the community-maintained repository.
1
0
129
Oct ’25
iOS App Exits after launch
Hello, my iOS apps are exiting right after launch on a few of our iOS devices. I tried a couple of my apps that are deployed to our fleet, and they do the same thing. If I run the app(s) in the Simulator they work fine, and if I run the app(s) from Xcode on the offending devices they work fine as well. But once I stop the run in Xcode, the app on the device will not launch. I'm thinking something is missing, like a certificate; I'm just not sure. Any ideas on how to troubleshoot this? I would really like to get this fixed.
3
0
381
Oct ’25
Bluetooth connection unexpectedly timing out with macOS Sequoia
After the macOS Sequoia update, my app seems to have an issue with Bluetooth communication between macOS and iOS that uses CoreBluetooth for Central-Peripheral communication.

Setup: the iPhone (in my case an iPhone 14 Pro with iOS 18.0 (22A3354)) acts as the Central, and the Mac (in my case a 14" MacBook Pro 2023 with macOS 15.0 (24A335)) acts as the Peripheral. I've implemented a mechanism where the Central (iPhone) sends a message to the Peripheral (Mac) every 15 seconds to keep the connection alive (because it needs to wait for notify characteristic updates). I never noticed this kind of issue before, but with macOS Sequoia I get it constantly.

Issue: the connection drops unexpectedly after a period of time (sometimes 20 seconds, sometimes a few minutes) with CBErrorDomain code 6: "The connection has timed out unexpectedly."

Sample code:
- Peripheral (Mac): ContentView (Peripheral).txt, ContentViewModel (Peripheral).txt
- Central (iPhone): ContentView (Central).txt, ContentViewModel (Central).txt

Reproduce: I attached sample code including the Central sample (for iPhone) and the Peripheral sample (for Mac). Run the Peripheral sample (after granting Bluetooth permissions), then run the Central sample and select the Mac device in the list. After selecting, it should connect, discover the service and characteristic, and start writing messages to it. After some time, func centralManager(_ central: CBCentralManager, didDisconnectPeripheral peripheral: CBPeripheral, error: (any Error)?) should get called with the "timed out unexpectedly" error.

Could anyone please look into this issue and advise whether there's a known bug or any workaround? Any guidance would be greatly appreciated, as this impacts the stability of Bluetooth communication between the devices. Thanks in advance.

Logs: I also ran Console.app during this issue and captured these errors (if helpful): console_logs.txt
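For reference, a rough sketch of the keep-alive write described above; the 15-second interval comes from the post, while the class name, payload, and write type are placeholders rather than the poster's actual sample code:

```swift
import CoreBluetooth

// Rough sketch of the keep-alive mechanism described above; the class name,
// payload, and write type are placeholders, not the poster's sample code.
final class KeepAliveWriter {
    private var timer: Timer?

    func start(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        // One write every 15 seconds to hold the link open while waiting for
        // notify updates from the Mac peripheral.
        timer = Timer.scheduledTimer(withTimeInterval: 15, repeats: true) { _ in
            peripheral.writeValue(Data([0x01]), for: characteristic, type: .withResponse)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```

Using .withResponse means each write produces a peripheral(_:didWriteValueFor:error:) callback, which makes it easier to see how close to the drop the link was still acknowledging writes.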
6
4
3.0k
Oct ’25
UIViewController memory leak with modal presentedViewController
Hi everyone, I'm encountering an unexpected behavior with modal presentations in UIKit. Here's what happens:

1. I have UIViewControllerA (let's call it the "orange" VC) pushed onto a UINavigationController stack.
2. I present UIViewControllerB (the "red" VC, inside its own UINavigationController as a .formSheet) modally over UIViewControllerA.
3. After a short delay, I pop UIViewControllerA from the navigation stack.

Issue: after popping UIViewControllerA, the modal UIViewControllerB remains visible on screen and in memory. I expected that popping the presenting view controller would also dismiss the modal, but it stays.

Expected behavior: when UIViewControllerA (orange) is popped, I expect the modal UIViewControllerB (red) to be dismissed as well.

Actual behavior: the modal UIViewControllerB remains on screen and is not dismissed, even though its presenting view controller has been removed from the navigation stack.

Video example: https://youtube.com/shorts/sttbd6p_r_c

Question: is this the expected behavior? If so, what is the recommended way to ensure that the modal is dismissed when its presenting view controller is removed from the navigation stack?

Code snippet:

```swift
import UIKit

class MainVC: UIViewController {
    private weak var orangeVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .blue

        let dq = DispatchQueue.main
        dq.asyncAfter(deadline: .now() + 1) { [weak self] in
            let vc1 = UIViewController()
            vc1.view.backgroundColor = .orange
            vc1.modalPresentationStyle = .overCurrentContext
            self?.navigationController?.pushViewController(vc1, animated: true)
            self?.orangeVC = vc1

            dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                let vc2 = UIViewController()
                vc2.view.backgroundColor = .red
                vc2.modalPresentationStyle = .formSheet
                vc2.isModalInPresentation = true

                let nav = UINavigationController(rootViewController: vc2)
                if let sheet = nav.sheetPresentationController {
                    sheet.detents = [.medium()]
                }
                self?.orangeVC?.present(nav, animated: true)

                dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                    self?.navigationController?.popViewController(animated: true)
                }
            }
        }
    }
}
```

Thank you for your help!
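One possible workaround, sketched under the assumption that explicitly tearing down the presentation is acceptable (OrangeViewController is a hypothetical stand-in for the pushed orange controller, not code from the post):

```swift
import UIKit

// Sketch: explicitly dismiss whatever this controller presented when it is
// being popped off the navigation stack.
class OrangeViewController: UIViewController {
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        if isMovingFromParent, let presented = presentedViewController {
            presented.dismiss(animated: animated)
        }
    }
}
```

Whether this is the recommended pattern is still the open question above; it simply makes the dismissal explicit instead of relying on the pop to cascade.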
0
0
105
Oct ’25
application(_:didFinishLaunchingWithOptions:) launchOptions is nil when app supports scenes
Our app supports silent push, and we use the code below to determine whether the app was launched by a silent push:

```swift
if let remoteNotification = launchOptions?[UIApplication.LaunchOptionsKey.remoteNotification] as? [AnyHashable: Any],
   let aps = remoteNotification["aps"] as? [AnyHashable: Any],
   let contentAvailable = aps["content-available"] as? Int,
   contentAvailable == 1 {
    isSilentNotification = true
}
```

This runs when the app launches and application(_:didFinishLaunchingWithOptions:) is called. But after migrating to UIScene, launchOptions is always nil, and we cannot tell whether the app was launched by a silent push. The developer documentation says: "If the app supports scenes, this is nil. Continue using UIApplicationDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) to process silent remote notifications after scene connection." But application(_:didReceiveRemoteNotification:fetchCompletionHandler:) is called too late for us. So, other than reading the flag in application(_:didReceiveRemoteNotification:fetchCompletionHandler:), is there any other way to obtain the silent-push flag at launch, before that callback has fired?
1
0
111
Oct ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: when users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
- Third-party speech synthesis voices (CereProc, Grammatek, etc.)
- Apps relying on automatic voice selection based on user preferences

Example:

```swift
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)
```

Interesting observation: the new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
- AVSpeechSynthesisVoice(language:)
- AVSpeechUtterance auto-selection
- Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB[20271264]

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
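Not a fix for the regression, but one possible interim approach (a sketch, assuming an explicit in-app voice choice is acceptable): enumerate the installed voices for the language and create the voice by identifier instead of relying on AVSpeechSynthesisVoice(language:) to honor the user's Accessibility selection.

```swift
import AVFoundation

// Interim workaround sketch: list the installed voices for a language so the app
// can pick one explicitly by identifier instead of relying on the language default.
func installedVoices(for language: String) -> [AVSpeechSynthesisVoice] {
    AVSpeechSynthesisVoice.speechVoices().filter { $0.language == language }
}

func logAndPickVoice() -> AVSpeechSynthesisVoice? {
    let candidates = installedVoices(for: "en-GB")
    for voice in candidates {
        print(voice.identifier, voice.name, voice.quality.rawValue)
    }
    // Persist the identifier the user picks and recreate the voice with it,
    // falling back to the language default if the identifier disappears.
    return candidates.first.flatMap { AVSpeechSynthesisVoice(identifier: $0.identifier) }
        ?? AVSpeechSynthesisVoice(language: "en-GB")
}
```

The chosen identifier can be passed to AVSpeechSynthesisVoice(identifier:) on later launches; it does not, however, read the user's actual Accessibility preference.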
4
1
371
Oct ’25
In iOS 26, UITableView can't display cells, but cells can be selected!
We have received several reports that our app cannot display UITableView cells on iOS 26, but users say they can still select cells with a single tap, and the table view's didSelectRow delegate method still responds. I have filed a Feedback report but received no response. Has anyone else hit the same bug? The page appears blank, and a user sent me a video proving that he can still select cells by tapping. We cannot reproduce the bug and don't know how to fix it; we think this is a bug in iOS 26, so we're here for help. This bug is blocking distribution of our new version (which supports iOS 26). This is the feedback: https://feedbackassistant.apple.com/feedback/20677046
1
0
137
Oct ’25
How to hide supplementary column alone in three column split view
We are using a three-column split view as the root of our app, and we want to hide only the supplementary column in some cases so it behaves like a two-column split view. With the existing APIs we are unable to achieve this, since they hide the primary column as well and don't give the expected results:

- .hide(.supplementary)
- setViewController(nil, for: .supplementary)

But we have seen this behavior in the native Notes app when using the View as List and View as Gallery options. Is there any way to achieve this while keeping the three-column split view itself?
1
0
138
Oct ’25