iOS is the operating system for iPhone.

Posts under iOS topic

Focusable doesn't work on iPad with external keyboard
I have a custom input view in my app which is .focusable(). It behaves similarly to a TextField, in that it must be focused in order to be used. This works fine on all platforms including iPad, except when an external keyboard is connected (Magic Keyboard), in which case it can no longer be focused and becomes unusable. Is there a solution to this, or a workaround? My view is very complex, so simple solutions like replacing it with a native view aren't possible, and I must be able to programmatically force it to focus. Here's a very basic example replicating my issue. None of the functionality works when a keyboard is connected:

struct FocusableTestView: View {
    @FocusState private var isRectFocused: Bool

    var body: some View {
        VStack {
            // This text field should focus the custom input when pressing return:
            TextField("Enter text", text: .constant(""))
                .textFieldStyle(.roundedBorder)
                .onSubmit { isRectFocused = true }
                .onKeyPress(.return) {
                    isRectFocused = true
                    return .handled
                }

            // This custom "input" should focus itself when tapped:
            Rectangle()
                .fill(isRectFocused ? Color.accentColor : Color.gray.opacity(0.3))
                .frame(width: 100, height: 100)
                .overlay(
                    Text(isRectFocused ? "Focused" : "Tap me")
                )
                .focusable(true, interactions: .edit)
                .focused($isRectFocused)
                .onTapGesture {
                    isRectFocused = true
                    print("Focused rectangle")
                }

            // The focus should be able to be controlled externally:
            Button("Toggle Focus") {
                isRectFocused.toggle()
            }
            .buttonStyle(.bordered)
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .center)
    }
}
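Editor's note, a hedged sketch rather than a confirmed fix: one variation sometimes worth trying is to declare the view as allowing both edit and activate focus interactions, since keyboard focus navigation on iPad targets views that allow activation focus. The snippet below only shows the Rectangle's focus modifiers changed accordingly.

// Hypothetical variation of the Rectangle's focus modifiers from the example above.
Rectangle()
    .fill(isRectFocused ? Color.accentColor : Color.gray.opacity(0.3))
    .frame(width: 100, height: 100)
    .focusable(true, interactions: [.edit, .activate])  // allow both kinds of focus interaction
    .focused($isRectFocused)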
Replies: 1 · Boosts: 0 · Views: 175 · Created: Oct ’25
Unable to write to file system when building for My Mac (Designed for iPad)
Our app is unable to write to its own sandbox container on macOS when run via “My Mac (Designed for iPad)”. This is not an issue when the app runs on iPhone or iPad. It seems to affect all attempts to write to the file system, including:
- UserDefaults
- Core Data (SQLite)
- Firebase (Analytics, Crashlytics, Sessions)
- File creation (PDFs, temp files, etc.)
We're seeing the following errors in the console:
- Operation not permitted / NSCocoaErrorDomain Code=513: permissions error when writing to disk.
- CFPrefsPlistSource: Path not accessible: failure to write to UserDefaults.
- Cannot synchronize user defaults to disk: UserDefaults writes blocked.
- CoreData: No permissions to create file: the Core Data SQLite store can't be created.
- Firebase: Failed to open database: Firebase can't initialize local storage.
- CGDataConsumerCreateWithFilename: failed to open ... for writing: PDF generation fails due to temp-directory access issues.
We created a test project to try to reproduce the issue, but were unable to, even when setting all the build settings the same as in the project having issues.
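Editor's note, a minimal diagnostic sketch (not from the post, names are illustrative): log which container paths the process actually resolves under “My Mac (Designed for iPad)” and attempt a trivial write, so the failing paths can be compared against what the same build reports on iPhone or iPad.

import Foundation

// Log the sandbox paths this process resolves and try a small write,
// which should surface the same NSCocoaErrorDomain Code=513 error if present.
func logSandboxPaths() {
    print("Home:", NSHomeDirectory())
    print("Temp:", NSTemporaryDirectory())
    guard let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first else { return }
    print("Documents:", docs.path)
    do {
        try Data("test".utf8).write(to: docs.appendingPathComponent("write-test.txt"))
        print("Write succeeded")
    } catch {
        print("Write failed:", error)
    }
}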
Replies: 2 · Boosts: 0 · Views: 196 · Created: Oct ’25
Could not launch “App” (Xcode 26.0 + iPhone 6, iOS 12)
Problem details: Could not launch “App”. Reproduction steps: install Xcode 26.0 > connect an iPhone 6 (iOS 12) > run the app. We tried the following solution, but it didn't work: to make Xcode 26 recognize and run apps on an iOS 12 physical device, manually add the missing device support files by going to /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/ on your Mac, where you'll see folders like 17.0 or 18.0, and download the matching iOS 12 folder (for example, 12.4) from the community-maintained repository.
Replies: 1 · Boosts: 0 · Views: 164 · Created: Oct ’25
UIViewController memory leak with modal presentedViewController
Hi everyone, I'm encountering unexpected behavior with modal presentations in UIKit. Here’s what happens:
- I have UIViewControllerA (let’s call it the "orange" VC) pushed onto a UINavigationController stack.
- I present UIViewControllerB (the "red" VC, inside its own UINavigationController as a .formSheet) modally over UIViewControllerA.
- After a short delay, I pop UIViewControllerA from the navigation stack.
Issue: After popping UIViewControllerA, the modal UIViewControllerB remains visible on screen and in memory. I expected that popping the presenting view controller would also dismiss the modal, but it stays.
Expected behavior: When UIViewControllerA (orange) is popped, the modal UIViewControllerB (red) should be dismissed as well.
Actual behavior: The modal UIViewControllerB remains on screen and is not dismissed, even though its presenting view controller has been removed from the navigation stack.
Video example: https://youtube.com/shorts/sttbd6p_r_c
Question: Is this the expected behavior? If so, what is the recommended way to ensure that the modal is dismissed when its presenting view controller is removed from the navigation stack?
Code snippet:

class MainVC: UIViewController {
    private weak var orangeVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .blue
        let dq = DispatchQueue.main
        dq.asyncAfter(deadline: .now() + 1) { [weak self] in
            let vc1 = UIViewController()
            vc1.view.backgroundColor = .orange
            vc1.modalPresentationStyle = .overCurrentContext
            self?.navigationController?.pushViewController(vc1, animated: true)
            self?.orangeVC = vc1

            dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                let vc2 = UIViewController()
                vc2.view.backgroundColor = .red
                vc2.modalPresentationStyle = .formSheet
                vc2.isModalInPresentation = true
                let nav = UINavigationController(rootViewController: vc2)
                if let sheet = nav.sheetPresentationController {
                    sheet.detents = [.medium()]
                }
                self?.orangeVC?.present(nav, animated: true)

                dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                    self?.navigationController?.popViewController(animated: true)
                }
            }
        }
    }
}

Thank you for your help!
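Editor's note, a minimal sketch of one possible workaround (not from the original post): explicitly dismiss whatever the orange controller is presenting before popping it, so the form sheet cannot outlive its presenter. This would replace the innermost delayed pop in the snippet above.

// Hypothetical replacement for the final asyncAfter block in MainVC above.
dq.asyncAfter(deadline: .now() + 1) { [weak self] in
    if let presented = self?.orangeVC?.presentedViewController {
        // Dismiss the form sheet first, then pop the orange controller.
        presented.dismiss(animated: true) {
            self?.navigationController?.popViewController(animated: true)
        }
    } else {
        self?.navigationController?.popViewController(animated: true)
    }
}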
Replies: 0 · Boosts: 0 · Views: 120 · Created: Oct ’25
application(_:didFinishLaunchingWithOptions:) launchOptions is nil when app supports scenes
Our app supports silent push notifications, and we use the code below to determine whether the app was launched by a silent push:

if let remoteNotification = launchOptions?[UIApplication.LaunchOptionsKey.remoteNotification] as? [AnyHashable: Any],
   let aps = remoteNotification["aps"] as? [AnyHashable: Any],
   let contentAvailable = aps["content-available"] as? Int,
   contentAvailable == 1 {
    isSilentNotification = true
}

This runs when the app launches and application(_:didFinishLaunchingWithOptions:) is called. After migrating to UIScene, however, launchOptions is always nil, so we can no longer tell whether the app was launched by a silent push. The developer documentation says: "If the app supports scenes, this is nil. Continue using UIApplicationDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) to process silent remote notifications after scene connection." But application(_:didReceiveRemoteNotification:fetchCompletionHandler:) is called too late for our purposes. Apart from handling it in that callback, are there any other ways to obtain the silent-push flag at launch, before didReceiveRemoteNotification has been called?
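Editor's note: for reference, a minimal sketch of the path the quoted documentation describes, setting the same isSilentNotification flag from the post inside the delegate callback. It does not address the timing concern raised above, which is the crux of the question.

// Hedged sketch of the scene-based path: with scenes, silent pushes arrive here
// even when they caused the launch, so the flag is recorded in this callback.
func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    if let aps = userInfo["aps"] as? [AnyHashable: Any],
       let contentAvailable = aps["content-available"] as? Int,
       contentAvailable == 1 {
        isSilentNotification = true
    }
    completionHandler(.noData)
}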
Replies: 1 · Boosts: 0 · Views: 128 · Created: Oct ’25
iOS App Exits after launch
Hello, my iOS apps are exiting right after launch on a few of our iOS devices. I tried a couple of my apps that are deployed to our fleet and they do the same thing. The apps run fine in the Simulator, and they also run fine on the offending devices while launched from Xcode. But once I stop the run in Xcode, the app on the device will not launch. I'm thinking something is missing, like a certificate. Just not sure. Any ideas on how to troubleshoot this? I would really like to get this fixed.
Replies: 3 · Boosts: 0 · Views: 399 · Created: Oct ’25
Displaying and working with Favorites in iOS app
New to iOS development, and I've been trying to make heads or tails of the documentation. I know there is a difference between the data fields returned for songs from the user library and from the catalog, but whenever I search the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, but it seems I'll have to cross-query the catalog entry using song.catalogID or song.isrc; when I try to use them, though, I can't find a cross-reference between the two. I'm totally turned around. I'm also trying to determine whether a song in my library has been favorited or not; isFavorited (or something similar) doesn't seem to be a thing. I'm using the code below and trying to find a way to display a solid star if the song has been favorited, or an empty one if it's not. It seems like a basic request, but I can't find anything on how to do it. I've searched the docs, googled, and experimented. Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is? I know isFavorited isn't a thing; I'm just using it here so you can see what my intention is:

HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
Replies: 1 · Boosts: 0 · Views: 198 · Created: Oct ’25
How to keep API requests running in background using URLSession in Swift?
I'm developing an iOS application in Swift that performs API calls using URLSession.shared. The requests work correctly when the app is in the foreground. However, when the app transitions to the background (for example, when the user switches to another app), the ongoing API calls are either paused or do not complete as expected.
What I’ve tried:
- Using URLSession.shared.dataTask(with:) to initiate the API requests
- Observing application lifecycle events like applicationDidEnterBackground, but I haven't found a reliable way to let requests complete when backgrounded
Goal: I want certain API requests to continue running, or be allowed to complete, even if the app enters the background.
Question: What is the correct approach to allow API calls to continue running or complete when the app moves to the background? Should I be using a background URLSessionConfiguration instead of URLSession.shared? If so, how should it be properly configured and used in this scenario?
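Editor's note, a minimal sketch of the background-session approach the question asks about; the class name and session identifier are illustrative, not from the post.

import Foundation

final class BackgroundAPIClient: NSObject, URLSessionDownloadDelegate {
    // A background session hands its tasks to the system, so they can finish
    // even after the app is suspended.
    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.api.background") // hypothetical identifier
        config.sessionSendsLaunchEvents = true   // relaunch the app in the background when tasks finish
        config.isDiscretionary = false
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func send(_ request: URLRequest) {
        // Background sessions only support upload and download tasks, not plain data tasks.
        session.downloadTask(with: request).resume()
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Read the response body from the temporary file before this method returns.
        let body = try? Data(contentsOf: location)
        print("Received \(body?.count ?? 0) bytes")
    }
}

For requests that only need a short grace period after backgrounding rather than true background transfer, UIApplication.beginBackgroundTask(expirationHandler:) is the other commonly used option.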
Replies: 1 · Boosts: 0 · Views: 131 · Created: Oct ’25
Incorrect padding in TextField with .rightToLeft layout direction
When a TextField is set to a rightToLeft layout direction, it gets strange, unnecessary padding on the left side, which pushes the text away from the edge. This issue doesn't occur in the leftToRight layout, where the text aligns correctly. Does anyone know how to get rid of this extra padding?
Environment:
- Xcode version: 26.0.1
- Device: iPhone 13
- iOS version: 26.0
Code:

struct ContentView: View {
    @State var textInput: String = ""

    var body: some View {
        VStack {
            Text("rightToLeft")
            TextField("placeholder", text: $textInput)
                .background(Color.red)
                .environment(\.layoutDirection, .rightToLeft)

            Text("leftToRight")
            TextField("placeholder", text: $textInput)
                .background(Color.red)
                .environment(\.layoutDirection, .leftToRight)
        }
        .padding()
    }
}
Replies: 1 · Boosts: 1 · Views: 230 · Created: Oct ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.
Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.
Particularly affects:
- Third-party speech synthesis voices (CereProc, Grammatek, etc.)
- Apps relying on automatic voice selection based on user preferences
Example:

// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.
Tested methods:
- AVSpeechSynthesisVoice(language:)
- AVSpeechUtterance auto-selection
- Reflection for new APIs
All return the system default voice, not the user's preference.
Filed: FB[20271264]
Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
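Editor's note, not a fix for the regression: a minimal sketch of one partial workaround sometimes used, enumerating the installed voices for the locale so the app can offer its own in-app voice picker. It does not read the user's Accessibility selection.

import AVFoundation

// List every installed voice for en-GB, including third-party voices.
let gbVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-GB" }
for voice in gbVoices {
    print(voice.name, voice.identifier, voice.quality.rawValue)
}

// Once the user picks a voice in-app, assign it explicitly instead of relying
// on AVSpeechSynthesisVoice(language:). Keep a strong reference to the synthesizer.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = gbVoices.first
synthesizer.speak(utterance)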
Replies: 4 · Boosts: 1 · Views: 446 · Created: Oct ’25
How to protect endpoints used by Message Filtering Extension?
Hi, I am just wondering whether there is any option to protect the endpoints that will be used by my Message Filter Extension. According to the documentation, our API has two endpoints:
- /.well-known/apple-app-site-association
- the endpoint set up in the ILMessageFilterExtensionNetworkURL value of the Info.plist file, which deferQueryRequestToNetwork will request for every message
Since all requests to these two endpoints are made by iOS itself (deferQueryRequestToNetwork), I don't understand how I can protect them on my side, for example with an API key or perhaps mTLS. The only way I've found is a whitelist for the Apple IP range. Are there other methods for this?
Replies: 1 · Boosts: 0 · Views: 147 · Created: Oct ’25
iOS 26 Keyboard Toolbar with Menu Buttons: Menu appears above hidden toolbar (instead of taking up its space)
Hi Community! I've been wrangling with this for a few days now and can't find a way to fix it, but I see it working in other apps. I have a regular UIToolbar with buttons, assigned as the inputAccessoryView of a UITextView, so the toolbar appears above the keyboard (I can reproduce this in SwiftUI as well). The toolbar contains regular UIBarButtonItems, except one doesn't have an action; it has a UIMenu (using .menu = UIMenu...). In the Apple Notes app, there is also one button that shows a menu like that. When you tap that button, the toolbar disappears and the menu shows directly above the keyboard. In my code, however, it shows above the hidden toolbar, which means there is a huge gap between the keyboard and the menu. It works if I attach the toolbar (in SwiftUI) to the navigation bar at the top or bottom; only with the keyboard does it behave differently.
Here is some sample code for the toolbar:

private lazy var accessoryToolbar: UIToolbar = {
    let toolbar = UIToolbar()
    let bold = UIBarButtonItem(image: UIImage(systemName: "bold"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let italic = UIBarButtonItem(image: UIImage(systemName: "italic"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let underline = UIBarButtonItem(image: UIImage(systemName: "underline"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let todo = UIBarButtonItem(image: UIImage(systemName: "checklist"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))

    // Make the Bullet button open a menu of list options
    let bullet = UIBarButtonItem(image: UIImage(systemName: "list.bullet"), style: .plain, target: self, action: nil)
    let bulletMenu = UIMenu(title: "Insert List", children: [
        UIAction(title: "Bulleted List", image: UIImage(systemName: "list.bullet")) { [weak self] _ in
            self?.handleMenuSelection("Bulleted List")
        },
        UIAction(title: "Numbered List", image: UIImage(systemName: "list.number")) { [weak self] _ in
            self?.handleMenuSelection("Numbered List")
        },
    ])
    bullet.menu = bulletMenu

    toolbar.items = [bold, italic, underline, todo, bullet]
    toolbar.sizeToFit()
    return toolbar
}()

Somewhere else in the code I assign it to the UITextView:

// Attach the toolbar above the keyboard
textView.inputAccessoryView = accessoryToolbar

Toolbar: (screenshot)
Menu open: (screenshot)
In Apple Notes: (screenshot)
I feel I'm missing something. In Apple Notes I can also see a Liquid Glass style gradient between the toolbar and keyboard, dimming the content in this space.
Replies: 1 · Boosts: 0 · Views: 269 · Created: Oct ’25
SwiftUI safe area stays offset after keyboard dismissal with “Reduce Motion” + “Prefer Cross-Fade” enabled (iOS 26)
I’m seeing a layout issue in SwiftUI on iOS 26 that only reproduces with specific Accessibility Motion settings.
Steps to reproduce:
1. Open Settings → Accessibility → Motion.
2. Enable Reduce Motion and Prefer Cross-Fade Transitions.
3. Launch an app with a SwiftUI TextField.
4. Tap the field to show the keyboard.
5. Dismiss the keyboard (tap outside, swipe down, etc.).
Expected: After the keyboard is dismissed, the view’s bottom safe area / layout should return to normal.
Actual: The view continues to reserve space equal to the keyboard height, as if the keyboard were still visible. UI anchored to the safe area remains shifted upward until the view is reloaded.
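Editor's note, not confirmed as a fix for this particular bug: a minimal sketch of a mitigation sometimes tried when stale keyboard insets linger, assuming the view can manage its own keyboard avoidance. The view name is illustrative.

import SwiftUI

struct ReproView: View {
    @State private var text = ""

    var body: some View {
        VStack {
            TextField("Type here", text: $text)
                .textFieldStyle(.roundedBorder)
            Spacer()
            Text("Anchored to the bottom")
        }
        .padding()
        // Opt this view out of automatic keyboard avoidance so a stale
        // keyboard inset cannot keep the layout shifted after dismissal.
        .ignoresSafeArea(.keyboard, edges: .bottom)
    }
}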
Replies: 4 · Boosts: 2 · Views: 992 · Created: Oct ’25