iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts

Post

Replies

Boosts

Views

Activity

application(_:didFinishLaunchingWithOptions:) launchOptions is nil when app supports scenes
Our app supports silent push, and we use the code below to determine whether the app was launched by a silent push:

if let remoteNotification = launchOptions?[UIApplication.LaunchOptionsKey.remoteNotification] as? [AnyHashable: Any],
   let aps = remoteNotification["aps"] as? [AnyHashable: Any],
   let contentAvailable = aps["content-available"] as? Int,
   contentAvailable == 1 {
    isSilentNotification = true
}

This runs when the app launches and application(_:didFinishLaunchingWithOptions:) is called. But after migrating to UIScene, launchOptions is always nil, so we cannot tell whether the app was launched by a silent push. The developer documentation says: "If the app supports scenes, this is nil. Continue using UIApplicationDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) to process silent remote notifications after scene connection." However, application(_:didReceiveRemoteNotification:fetchCompletionHandler:) is called too late for us. Other than checking inside application(_:didReceiveRemoteNotification:fetchCompletionHandler:), is there any other way to obtain the silent-push flag at launch, before that callback arrives?
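For reference, here is a minimal sketch (the helper name is illustrative, not an existing API) that factors the content-available check from the post into one function, so the same logic can be applied either to launchOptions in a non-scene app or to the userInfo delivered to application(_:didReceiveRemoteNotification:fetchCompletionHandler:) once it arrives in a scene-based app:

// Hypothetical helper: returns true if the payload is a silent (content-available) push.
func isSilentPush(_ userInfo: [AnyHashable: Any]?) -> Bool {
    // Mirrors the check above: a silent push carries "content-available": 1 in its aps payload.
    guard let aps = userInfo?["aps"] as? [AnyHashable: Any],
          let contentAvailable = aps["content-available"] as? Int else { return false }
    return contentAvailable == 1
}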
1
0
122
Oct ’25
AVSpeechSynthesisVoice ignores user-selected voices in iOS 26 (Regression)
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.

Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.

Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences

Example:
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)

Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.

Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs

All return the system default voice, not the user's preference.

Filed: FB[20271264]

Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
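As a stopgap rather than a fix (a hedged sketch; the helper name is illustrative), an app can enumerate the voices installed for a language with AVSpeechSynthesisVoice.speechVoices() and let the user pick one explicitly, then assign that voice to the utterance instead of relying on AVSpeechSynthesisVoice(language:):

import AVFoundation

// List the installed voices for a language so the app can offer its own picker.
func voices(for language: String) -> [AVSpeechSynthesisVoice] {
    AVSpeechSynthesisVoice.speechVoices().filter { $0.language == language }
}

// Assign an explicit voice rather than relying on AVSpeechSynthesisVoice(language:).
let utterance = AVSpeechUtterance(string: "Hello")
utterance.voice = voices(for: "en-GB").first
// Or, if the app stores the user's choice, recreate it with AVSpeechSynthesisVoice(identifier:).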
4
1
395
Oct ’25
In iOS 26, UITableView doesn't display cells, but cells can still be selected!
We have received several reports that our app cannot display UITableView cells on iOS 26, but users say they can still select cells with a single tap, and the table view's didSelectRowAt delegate method responds. I have filed a feedback report but received no response. Does anyone else see the same bug? You can see that the page is blank; a user sent me a video proving he can select a cell with a gesture. We cannot reproduce the bug and don't know how to fix it. We think this is a bug in iOS 26, so we're asking here for help. This bug is blocking distribution of our new version (which supports iOS 26). This is the feedback: https://feedbackassistant.apple.com/feedback/20677046
1
0
148
Oct ’25
How to hide the supplementary column alone in a three-column split view
We are using a three-column split view as the root of our app and want to hide only the supplementary column in some cases, so it behaves like a two-column split view. With the existing APIs we are unable to achieve this, since they hide the primary column as well and don't give the expected results:

.hide(.supplementary)
setViewController(nil, for: .supplementary)

But we've seen this behavior in the native Notes app when switching between the View as List and Gallery options. Is there any way to achieve this while keeping the three-column split view itself?
1
0
152
Oct ’25
Displaying and working with Favorites in iOS app
I'm new to iOS development and I've been trying to make heads or tails of the documentation. I know there is a difference between the data fields returned for songs from the user's library and from the catalog, but whenever I search on the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, but it seems I'll have to cross-query the catalog entry using song.catalogID or song.isrc, and when I try to use them I can't find a cross-reference between the two. I'm totally turned around. I'm also trying to determine whether a song in my library has been favorited. isFavorited (or something similar) doesn't seem to be a thing. I'm using the code below and trying to find a way to display a solid star if the song has been favorited, or an empty one if it hasn't. It seems like a basic request, but I can't find anything on how to do it. I've searched the docs, googled, and experimented. Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is? I know isFavorited isn't a thing; I'm just using it here so you can see what my intention is:

HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
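On the catalog cross-query part of the question, here is a hedged MusicKit sketch (assuming MusicKit authorization; the function name is illustrative, and it does not address the favorites question, for which there is no obvious public property): fetch the catalog counterpart of a library song by its ISRC and read catalog-only fields such as releaseDate from that result.

import MusicKit

// Look up the catalog version of a library song by ISRC and return its release date.
func catalogReleaseDate(for librarySong: Song) async throws -> Date? {
    guard let isrc = librarySong.isrc else { return nil }
    let request = MusicCatalogResourceRequest<Song>(matching: \.isrc, equalTo: isrc)
    let response = try await request.response()
    return response.items.first?.releaseDate
}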
1
0
179
Oct ’25
Request for clarification of Developer mode
Hi guys, I want to help my clients enable Developer Mode, but they won't accept connecting their devices to anything else (a Mac with Xcode) to enable it. There are nearly 10 people who need Developer Mode enabled. As far as I know, on some devices Developer Mode can't be enabled without a Mac. So I need clarification about iOS versions: we just want to know which iOS versions do not show the Developer Mode option by default. Please list those iOS versions, like below:

Developer Mode option available by default: iOS 17.4.1
Developer Mode option not available by default: iOS 17.5.1
3
0
892
Oct ’25
Incorrect keyboard frame on iOS 26 when using Secure Text with Autofill
Area: Software Update
Type of Feedback: Application Bug

Description
Device: iPhone 13 Pro running iOS 26
Build environment: Xcode 16.4

Problem description: When a text field has secureTextEntry = YES and Password Autofill / Passkeys is active, the autofill panel is not included in the rect reported by the keyboard notifications (UIKeyboardFrameEndUserInfoKey or others). As a result, when calculating the offset to move the screen up and reveal the hidden input field, the field is not displayed correctly because the reported keyboard height is smaller than the actual visible height.

Observed behavior: This only occurs on devices running iOS 26 built with Xcode 16.4. On previous versions of iOS, with the same settings (secureTextEntry and Autofill active), the rect correctly includes the autofill panel height, and the UI works as expected. I tested with both UIKeyboardDidShowNotification and UIKeyboardWillChangeFrameNotification, and in both cases the behavior is the same: the height is incorrect (smaller than expected with the autofill panel).

What I expect / questions:
That UIKeyboardFrameEndUserInfoKey (or the related notification) correctly reports the total area covered by the keyboard, including any password autofill panel, when secureTextEntry is active.
That the new behavior in iOS 26 be documented if this omission is intentional, or otherwise considered a bug if it is not.
Whether there is any official workaround suggested by Apple for developers affected by this issue while a fix is provided.

Thank you for your support.
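For reference, here is a minimal UIKit sketch of the measurement described above (the view controller class is illustrative): it reads the keyboard's end frame from the notification, converts it into the view's coordinate space, and computes the overlap used to offset the content. On an affected iOS 26 device, this overlap would reportedly come out smaller than the area actually covered when the autofill panel is visible.

import UIKit

final class KeyboardObservingViewController: UIViewController {
    private var keyboardObserver: NSObjectProtocol?

    override func viewDidLoad() {
        super.viewDidLoad()
        keyboardObserver = NotificationCenter.default.addObserver(
            forName: UIResponder.keyboardWillChangeFrameNotification,
            object: nil,
            queue: .main
        ) { [weak self] note in
            guard let self,
                  let endValue = note.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue else { return }
            // Convert the reported frame into this view's coordinates and measure the overlap.
            let keyboardFrame = self.view.convert(endValue.cgRectValue, from: nil)
            let overlap = max(0, self.view.bounds.maxY - keyboardFrame.minY)
            // With secureTextEntry + Autofill on iOS 26, this overlap allegedly excludes the autofill bar.
            print("keyboard overlap:", overlap)
        }
    }
}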
2
2
526
Oct ’25
How to keep API requests running in background using URLSession in Swift?
I'm developing an iOS application in Swift that performs API calls using URLSession.shared. The requests work correctly when the app is in the foreground. However, when the app transitions to the background (for example, when the user switches to another app), the ongoing API calls are either paused or do not complete as expected.

What I've tried:
Using URLSession.shared.dataTask(with:) to initiate the API requests
Observing application lifecycle events like applicationDidEnterBackground, but haven't found a reliable solution to allow requests to complete when backgrounded

Goal: I want certain API requests to continue running or be allowed to complete even if the app enters the background.

Question: What is the correct approach to allow API calls to continue running or complete when the app moves to the background? Should I be using a background URLSessionConfiguration instead of URLSession.shared? If so, how should it be properly configured and used in this scenario?
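Addressing the last question with a hedged sketch (the identifier and class name are illustrative): a background URLSessionConfiguration lets transfers continue after the app is suspended, but it only supports upload and download tasks and delivers results through a delegate, so a plain dataTask on URLSession.shared cannot simply be swapped in.

import UIKit

// Minimal sketch of a background session wrapper; results arrive via the delegate,
// even if the app was suspended or relaunched in the meantime.
final class BackgroundAPIClient: NSObject, URLSessionDownloadDelegate {
    static let shared = BackgroundAPIClient()

    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.app.background")
        config.sessionSendsLaunchEvents = true   // allow the system to relaunch the app for callbacks
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func start(_ request: URLRequest) {
        // Background sessions support download/upload tasks, not data tasks.
        session.downloadTask(with: request).resume()
    }

    func urlSession(_ session: URLSession,
                    downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Read the response body from `location` before this method returns.
    }
}

If the goal is only to let an in-flight request finish shortly after backgrounding (rather than survive suspension entirely), wrapping the existing data task in UIApplication.beginBackgroundTask(expirationHandler:) / endBackgroundTask(_:) is the other common approach.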
1
0
127
Oct ’25
Incorrect padding in TextField with .rightToLeft layout direction
When a TextField is set to a rightToLeft layout, it gets strange and unnecessary padding on the left side. This pushes the text away from the edge. This issue doesn't occur in the leftToRight layout, where the text aligns correctly. Does anyone know how to get rid of this extra padding?

Environment:
Xcode version: 26.0.1
Device: iPhone 13
iOS version: 26.0

Code:
struct ContentView: View {
    @State var textInput: String = ""

    var body: some View {
        VStack {
            Text("rightToLeft")
            TextField("placeholder", text: $textInput)
                .background(Color.red)
                .environment(\.layoutDirection, .rightToLeft)

            Text("leftToRight")
            TextField("placeholder", text: $textInput)
                .background(Color.red)
                .environment(\.layoutDirection, .leftToRight)
        }
        .padding()
    }
}
1
1
214
Oct ’25
How to protect endpoints used by Message Filtering Extension?
Hi, I am wondering whether there is any option to protect the endpoints that will be used by my Message Filtering Extension. According to the documentation, our API has 2 endpoints:

/.well-known/apple-app-site-association
/[endpoint set up in the ILMessageFilterExtensionNetworkURL value of the Info.plist file], which deferQueryRequestToNetwork will request for every message

Since all requests to these 2 endpoints are made by iOS itself (deferQueryRequestToNetwork), I don't understand how I can protect them on my side, for example with an API key or mTLS. The only approach I have found is a whitelist of the Apple IP range. Are there other methods for this?
1
0
135
Oct ’25
Playback Issues for DRM content when sending CMCD
Since iOS and tvOS 18, CMCD can now be automatically sent by AVPlayer (https://developer.apple.com/streaming/Whats-new-HLS.pdf). However, after enabling CMCD, our streams occasionally fail with the following error:

CoreMediaErrorDomain Error -17383

This issue appears to affect only DRM-protected (FairPlay) streams so far. We activate CMCD via the resource loader of an AVURLAsset, before assigning the item to an AVPlayer. Unfortunately, we haven't found a reliable way to reproduce the issue, and we've been unable to gather any useful diagnostic information. Has anyone else observed this behavior when enabling CMCD on FairPlay streams?
3
0
382
Oct ’25
iOS 26 AttributedString TextEditor bold and italic
Is there a way to constrain the AttributedTextFormattingDefinition of a TextEditor to only bold/italic/underline/strikethrough, without showing the full Font UI?

struct CustomFormattingDefinition: AttributedTextFormattingDefinition {
    struct Scope: AttributeScope {
        let font: AttributeScopes.SwiftUIAttributes.FontAttribute
        let underlineStyle: AttributeScopes.SwiftUIAttributes.UnderlineStyleAttribute
        let strikethroughStyle: AttributeScopes.SwiftUIAttributes.StrikethroughStyleAttribute
    }

    var body: some AttributedTextFormattingDefinition<Scope> {
        ValueConstraint(for: \.font, values: [MyStyle.defaultFont, MyStyle.boldFont, MyStyle.italicFont, MyStyle.boldItalicFont], default: MyStyle.defaultFont)
        ValueConstraint(for: \.underlineStyle, values: [nil, .single], default: .single)
        ValueConstraint(for: \.strikethroughStyle, values: [nil, .single], default: .single)
    }
}

If I remove the Font attribute from the scope, the Bold and Italic buttons disappear. And if the Font attribute is defined, even if it's restricted to one typeface in 4 different styles, the full typography UI is displayed.
1
0
102
Oct ’25
Can't run app on iPhone after registering UDID
The device UDID was registered to the developer account 40 hours ago. The STATUS column showed "Processing" for the first 24 hours, then turned empty. But I still can't run my app (built with the "Development" distribution method): when I try to run it after downloading it through my OTA URL, it prompts "the app cannot be installed because its integrity could not be verified". Everything runs fine on an iPhone that was registered a month ago. What should I do now? Keep waiting?
3
1
785
Oct ’25
iOS 26 Keyboard Toolbar with Menu Buttons: Menu appears above hidden toolbar (instead of taking up its space)
Hi Community! I've been wrangling with this for a few days now and can't find a way to fix it, but I see it working in other apps. I have a regular UIToolbar with buttons, assigned as the inputAccessoryView of a UITextView, so the toolbar appears above the keyboard (I can reproduce this in SwiftUI as well). The toolbar contains regular UIBarButtonItems, except one doesn't have an action; it has a UIMenu (set via .menu = UIMenu...). In the Apple Notes app, there is also one button that shows a menu like that. When you tap the button, the toolbar disappears and the menu shows directly above the keyboard. However, in my code it shows above the hidden toolbar, meaning there is a huge gap between the keyboard and the menu. It works if I attach the toolbar (in SwiftUI) to the navigation bar at the top or the bottom; it only behaves differently with the keyboard. Here is some sample code for the toolbar:

private lazy var accessoryToolbar: UIToolbar = {
    let toolbar = UIToolbar()

    let bold = UIBarButtonItem(image: UIImage(systemName: "bold"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let italic = UIBarButtonItem(image: UIImage(systemName: "italic"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let underline = UIBarButtonItem(image: UIImage(systemName: "underline"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))
    let todo = UIBarButtonItem(image: UIImage(systemName: "checklist"), style: .plain, target: self, action: #selector(didTapToolbarButton(_:)))

    // Make the Bullet button open a menu of list options
    let bullet = UIBarButtonItem(image: UIImage(systemName: "list.bullet"), style: .plain, target: self, action: nil)
    let bulletMenu = UIMenu(title: "Insert List", children: [
        UIAction(title: "Bulleted List", image: UIImage(systemName: "list.bullet")) { [weak self] _ in
            self?.handleMenuSelection("Bulleted List")
        },
        UIAction(title: "Numbered List", image: UIImage(systemName: "list.number")) { [weak self] _ in
            self?.handleMenuSelection("Numbered List")
        },
    ])
    bullet.menu = bulletMenu

    toolbar.items = [bold, italic, underline, todo, bullet]
    toolbar.sizeToFit()
    return toolbar
}()

Somewhere else in the code it is assigned to the UITextView:

// Attach the toolbar above the keyboard
textView.inputAccessoryView = accessoryToolbar

Screenshots: Toolbar / Menu open / In Apple Notes. I feel I'm missing something. In Apple Notes I can also see a Liquid Glass style gradient between the toolbar and the keyboard, dimming the content in this space.
1
0
242
Oct ’25
NavigationSplitView: Make button clickable in detail-section safe area
Hi, I've tried to find a solution for this problem for weeks now, but it seems no one knows how to solve it and Apple doesn't seem to care. I have a NavigationSplitView with two columns. In the detail column I have a button, or any other clickable control, placed at the very top where the safe area usually resides. The button is NOT clickable while it is in the safe area, and I have NO idea why. I know I can place buttons in the safe areas of other views and they are clickable. Please have a look at the code:

struct NavTestView: View {
    var body: some View {
        GeometryReader { p in
            VStack(spacing: 0) {
                NavigationSplitView {
                    List(names) {
                        Text($0.name).frame(width: p.size.width)
                            .background(Color.green)
                    }
                    .listRowSpacing(p.size.height * 0.15 / 100)
                    .toolbar(.hidden, for: .navigationBar)
                } detail: {
                    TestView().ignoresSafeArea()
                }
                .frame(width: p.size.width, height: p.size.height, alignment: .topLeading)
                .background(Color.yellow)
            }
        }
    }
}

struct TestView: View {
    var body: some View {
        GeometryReader { p in
            let plusButton = IconButton(imageName: "plus.circle.fill",
                                        color: Color(uiColor: ThemeColor.SeaFoam.color),
                                        imageWidth: p.size.width * 5 / 100,
                                        buttonWidth: p.size.width * 5 / 100)
            let regularAddButton = Button(action: {
                log.info("| Regular Add Button pressed")
            }) {
                plusButton
            }
            VStack {
                regularAddButton
            }
            .frame(width: p.size.width, height: p.size.height, alignment: .top)
            .background(Color.yellow)
        }
    }
}

This code produces the screen shown in the attached screenshot. Any help would be really greatly appreciated! Thank you! Frank
1
0
679
Oct ’25
Profile doesn't include the com.apple.application-identifier entitlement.
I have tried everything and I'm still getting this. Just for a test, I created a new app (Master-Detail template, Xcode 11.5). I created an entry in iTunes Connect to receive the app upon archiving and uploading. I regenerated all new certificates for iOS Development and Distribution. I created all new provisioning profiles. The Dev profile builds, deploys, and runs on my device. The Dist profile builds, but when I select the distribution profile I get the "Profile doesn't include the com.apple.application-identifier entitlement." error. When I download the profile within Xcode, everything looks good for the distribution profile:

App ID: matches correctly
Certificates: 1 included, including the new signing certificate "iPhone Distribution...."
Capabilities: 3 included, including Game Center, In-App Purchase, and Keychain Sharing
Entitlements: 5 included, including application-identifier, keychain-access-groups, beta-reports-active, get-task-allow, and com.apple.developer.team-identifier

I'm not sure what is going on. This is a standard process I have performed for quite a while. As a matter of fact, I just submitted 3 applications last Sunday. Thank you for any suggestions.
22
1
14k
Oct ’25
Scroll TextEditor to cursor position
Hello. Is there a good SwiftUI approach on getting the TextEditor cursor position? I have a TextEditor and sometimes when we have a longer text inside it, the cursor is not seen because the keyboard is above covering the bottom of the TextEditor. I would like to somehow detect the position of the cursor, and if it's on the last line of the TextEditor, scroll to the bottom. I've already checked a bit and didn't find any good method of doing this in SwiftUI. If you have any ideas on how to do this, or even a different method any help would be highly appreciated. Thank you!
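Since SwiftUI's TextEditor doesn't expose the caret, one common workaround (shown here as a hedged sketch; the type names are illustrative) is to wrap UITextView in a UIViewRepresentable and call scrollRangeToVisible(_:) with the current selection whenever the text changes, so the caret stays visible above the keyboard:

import SwiftUI
import UIKit

// Illustrative wrapper: keeps the caret visible by scrolling to the current selection on each edit.
struct CursorTrackingTextEditor: UIViewRepresentable {
    @Binding var text: String

    func makeUIView(context: Context) -> UITextView {
        let textView = UITextView()
        textView.font = .preferredFont(forTextStyle: .body)
        textView.delegate = context.coordinator
        return textView
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        if uiView.text != text { uiView.text = text }
    }

    func makeCoordinator() -> Coordinator { Coordinator(text: $text) }

    final class Coordinator: NSObject, UITextViewDelegate {
        var text: Binding<String>
        init(text: Binding<String>) { self.text = text }

        func textViewDidChange(_ textView: UITextView) {
            text.wrappedValue = textView.text
            // Scroll so the caret (the current selection) stays visible above the keyboard.
            textView.scrollRangeToVisible(textView.selectedRange)
        }
    }
}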
1
3
814
Oct ’25