iOS is the operating system for iPhone.

Posts under iOS tag

200 Posts
Sort by:

Post

Replies

Boosts

Views

Activity

Setting Required Capabilities for Foundation Models
Cross-posting this from https://developer.apple.com/forums/thread/795707 at the request of a DTS Engineer:

Is there any way to ensure that iOS apps we develop using Foundation Models can only be purchased/downloaded on the App Store by people with capable devices? I would have thought there would be a Required Capabilities entry that the App Store would hook into for Apple Intelligence-capable devices, but I don't see one in the documentation here: https://developer.apple.com/documentation/bundleresources/information-property-list/uirequireddevicecapabilities

The closest seems to be iphone-performance-gaming-tier, which appears to target all M1-and-above chips on iPhone and iPad. There is an ipad-minimum-performance-m1 that would more reliably ensure Foundation Models is available, but that doesn't help with iPhone.

So far, the only path seems to be setting the Minimum Deployment to iOS 26 and adding iphone-performance-gaming-tier as a required capability, but I'm worried that capability might diverge in the future from what is Foundation Models / Apple Intelligence capable, since what we really want is the set of devices with an Apple Neural Engine sufficient for the Apple Intelligence features in the SDKs (Foundation Models, Image Playground, audio transcription, etc.).

I understand that the majority of apps will want to add Apple Intelligence features selectively so they remain usable on devices that don't support them, but the app experience I'm building doesn't make sense without Foundation Models, and I'd rather not have a large number of users download the app only to be told, "Sorry, your device is not capable of Apple Intelligence, so it can't use this app."

I've created a Feedback Assistant ticket tracking this question: FB19366221
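Since no Apple Intelligence capability key is documented, a runtime check is the commonly suggested fallback. A minimal sketch, assuming the SystemLanguageModel availability API from the FoundationModels framework; the print messages are illustrative only:

import FoundationModels

// SystemLanguageModel.default reflects whether the on-device model
// is usable on this device right now.
func foundationModelsAvailable() -> Bool {
    switch SystemLanguageModel.default.availability {
    case .available:
        return true
    case .unavailable(let reason):
        // e.g. device not eligible, Apple Intelligence disabled, model not downloaded
        print("Foundation Models unavailable: \(String(describing: reason))")
        return false
    }
}

This doesn't solve the App Store gating problem the post is about, but it lets the app degrade gracefully in the meantime.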
0
0
58
4w
BGContinuedProcessingTask code pauses when device is locked
I have been experimenting with the BGContinuedProcessingTask API recently (and published sample code for it: https://github.com/infinitepower18/BGContinuedProcessingTaskDemo).

I have noticed that if I lock the phone, the code that runs as part of the task stops executing. My sample code simply updates the progress each second until it reaches 100, so it should complete in 1 minute 40 seconds. However, after locking the phone and checking the lock screen a few seconds later, the progress indicator was in the same position as before I locked it. If I leave the phone locked for several minutes and then check the lock screen, the Live Activity says "Task failed".

I haven't seen anything in the documentation about task execution while the phone is locked, so I'm unsure whether I've encountered an iOS bug here.
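For context, the register/submit pattern under discussion looks roughly like this. A sketch only, assuming the API shape shown in the WWDC 2025 session; the identifier, title, and one-second workload are placeholders standing in for the linked sample:

import BackgroundTasks

let taskID = "com.example.demo.processing" // placeholder identifier

// Register a launch handler before submitting the request.
BGTaskScheduler.shared.register(forTaskWithIdentifier: taskID, using: nil) { task in
    guard let task = task as? BGContinuedProcessingTask else { return }
    var expired = false
    task.expirationHandler = { expired = true }

    task.progress.totalUnitCount = 100
    for step in 1...100 {
        guard !expired else { task.setTaskCompleted(success: false); return }
        Thread.sleep(forTimeInterval: 1) // stand-in for real work
        task.progress.completedUnitCount = Int64(step)
    }
    task.setTaskCompleted(success: true)
}

// Submit when the user starts the work.
let request = BGContinuedProcessingTaskRequest(identifier: taskID,
                                               title: "Demo task",
                                               subtitle: "Counting to 100")
try? BGTaskScheduler.shared.submit(request)

Whether this workload is expected to keep running while the device is locked is exactly the open question in the post.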
4
0
126
1w
Issue with beta Declared Age Range
I'm trying to work with the beta version of the Declared Age Range framework, based on an article's tutorial, but I'm getting the following error:

[C:1-3] Error received: Invalidated by remote connection.

and AgeRangeService.Error.notAvailable is being thrown on the call to requestAgeRange. I'm using Xcode 26 beta 5 and my simulator is running the 26.0 beta. The iCloud account I have signed into the simulator has a date of birth set as well. This is the full ContentView where I'm trying to accomplish this:

struct ContentView: View {
    @Environment(\.requestAgeRange) var requestAgeRange
    @State var advancedFeaturesEnabled = false

    var body: some View {
        VStack {
            Button("Advanced Features") {}
                .disabled(!advancedFeaturesEnabled)
        }
        .task {
            await requestAgeRangeHelper()
        }
    }

    func requestAgeRangeHelper() async {
        do {
            let ageRangeResponse = try await requestAgeRange(ageGates: 16)
            switch ageRangeResponse {
            case let .sharing(range):
                if let lowerBound = range.lowerBound, lowerBound >= 16 {
                    advancedFeaturesEnabled = true
                }
            case .declinedSharing:
                break // Handle declined sharing
            default:
                break
            }
        } catch AgeRangeService.Error.invalidRequest {
            print("Invalid request") // Handle invalid request (e.g., age range < 2 years)
        } catch AgeRangeService.Error.notAvailable {
            print("Not available") // Handle device configuration issues
        } catch {
            print("Other")
        }
    }
}
1
0
84
4w
SpeechTranscriber extremely slow (14+ seconds) despite proper locale allocation and optimization
Using the official SwiftTranscriptionSampleApp from WWDC 2025, speech transcription takes 14+ seconds from audio input to first result, making it unusable for real-time applications.

Environment:
iOS: 26.0 beta
Xcode: beta 5
Device: iPhone 16 Pro
Sample app: official Apple SwiftTranscriptionSampleApp from WWDC 2025

Configuration tested:
Locales: en-US (properly allocated with AssetInventory.allocate(locale:)) and es-ES
Setup: all optimizations applied (preheating, high priority, model retention)

I started testing in my own app, intending to replace the older SFSpeech API and add speech detection, but after long fights with the documentation (this part is quite terrible, to be honest) I tested the example (https://developer.apple.com/documentation/speech/bringing-advanced-speech-to-text-capabilities-to-your-app) and saw the same results. I added some logs to measure the timing:

🎙️ [20:30:41.532] ✅ Analyzer started successfully - ready to receive audio!
🎙️ [20:30:41.532] Listening for transcription results...
🎙️ [20:30:56.342] 🚀 FIRST TRANSCRIPTION RESULT after 14.810s: 'Hello' (isFinal: false)

Questions:
Is this expected performance for the iOS 26 beta, given that the old SFSpeech API is far faster?
Are there additional optimization steps for SpeechTranscriber?
Should we expect significant performance improvements in later betas?
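For reference, the pre-flight steps the post says were already applied look roughly like this. A sketch only, assuming the beta API names used in Apple's sample app (SpeechTranscriber, AssetInventory, and the volatile-results option); exact signatures may differ across betas:

import Speech

func prepareTranscriber(locale: Locale) async throws -> SpeechTranscriber {
    // Volatile results are the fast, non-final hypotheses; without them,
    // the first output waits for finalized text.
    let transcriber = SpeechTranscriber(locale: locale,
                                        transcriptionOptions: [],
                                        reportingOptions: [.volatileResults],
                                        attributeOptions: [])

    // Reserve the locale and install the on-device model before audio starts.
    try await AssetInventory.allocate(locale: locale)
    if let installer = try await AssetInventory.assetInstallationRequest(supporting: [transcriber]) {
        try await installer.downloadAndInstall()
    }
    return transcriber
}

Since the thread reports the delay even with these steps in place, the 14-second gap appears to come after setup, not from missing allocation.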
1
0
129
3w
Console warning when accessing NSUserDefaults in willFinishLaunchingWithOptions, although all key values load correctly
The app initializes an App Group-based NSUserDefaults within the willFinishLaunchingWithOptions method, and the same reference is used throughout the app life cycle:

+ (NSUserDefaults *)appGroupUserDefaults {
    if (_appGroupUserDefaults == nil) {
        NSString *appGroupIdentifier = [NSString stringWithFormat:@"group.%@", [[NSBundle mainBundle] bundleIdentifier]];
        NSUserDefaults *groupDefaults = [[NSUserDefaults standardUserDefaults] initWithSuiteName:appGroupIdentifier];
        if (groupDefaults != nil) {
            NSLog(@"[DB_ENCRYPTION] appGroupUserDefaults initialised");
            _appGroupUserDefaults = groupDefaults;
        } else {
            NSLog(@"<ALA_ERROR>: [CCF-OS] [DB_ENCRYPTION] %s Unable to create NSUserDefaults with groupIdentifier", __func__);
            _appGroupUserDefaults = [NSUserDefaults standardUserDefaults];
        }
    }
    return _appGroupUserDefaults;
}

There are no issues accessing the values, but I see the console error below:

Couldn't read values in CFPrefsPlistSource<0x103eedb00> (Domain: group.com.kodiak.InstaPoC, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd

Does this require any action?
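For comparison, a minimal sketch of app-group defaults built from a fresh instance rather than by re-initializing the shared one. The suite name here is a placeholder: it must exactly match an App Group listed in the target's entitlements, which is not automatically "group." plus the bundle identifier; a suite name with no matching entitlement is a common source of the cfprefsd warning quoted above.

import Foundation

// Placeholder: must match an App Group in Signing & Capabilities.
let suiteName = "group.com.example.shared"

if let groupDefaults = UserDefaults(suiteName: suiteName) {
    groupDefaults.set(true, forKey: "didMigrate")
} else {
    // UserDefaults(suiteName:) returns nil only for invalid suite names
    // (e.g. the app's own main bundle identifier).
    print("Could not open app-group defaults")
}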
2
0
51
4w
App Icon issue in Wallet app
Hi, upon reviewing our app, we received feedback that our app icon within the Wallet app is not behaving as expected when the home screen is set to "light mode" only. In that case, the app icon on the home screen retains its default color (e.g., red) regardless of the device's appearance setting (light or dark), which is expected. However, in the Apple Wallet, e.g., under the "From Apps from your device" section, app icons change color (red in light mode, black in dark mode) when the iOS appearance changes, and this was reported as an app issue. I've noticed that all apps in that section change color, not just ours, so it seems to me like either a bug in iOS or behavior that was not clearly defined in the App Store guidelines. If there is an API we must use to cover this case, which one would that be? Is this a bug that Apple should resolve, or is this the intended behavior?
0
2
361
4w
Errors reading not-yet-synced iCloud files get cached
I have an app which uses ubiquitous containers and the files in them to share data between devices. It's a bit unusual in that it indexes files in directories the user grants access to, which may or may not exist on a second device; those files are identified by SHA-1 hash. A second device scanning before iCloud data has fully synced can therefore create duplicate references, which leads to an unpleasant user experience.

To solve this, I store a small binary index in the root of the ubiquitous file container, containing all of the known hashes, and as the user proceeds through onboarding, a background thread attempts to "prime" the ubiquitous container by calling FileManager.default.startDownloadingUbiquitousItem(at:) for each expected folder and file in a sane order. This likely creates a situation not anticipated by the iOS/iCloud integration's design, as it means my app has a sort of precognition of files it should not yet know about.

In the common case it works, but there is a corner case where iCloud sync has just begun and very, very little metadata is available (the common case in the Simulator, however), in which two issues come up:

I/O may hang indefinitely, trying to read a file as it is arriving. This one I can work around by running the I/O on a thread created with POSIX pthread_create and using pthread_cancel to kill it after a timeout.

Calls to FileManager.default.startDownloadingUbiquitousItem(at:) fail with Error Domain=NSCocoaErrorDomain Code=257 "The file couldn't be opened because you don't have permission to view it." The permissions aspect is nonsense, but I can believe there's no applicable "sort of exists, sort of doesn't" error code to use and someone punted. The problem is that this same error is thrown on every subsequent attempt to access that file for the life of the application; a restart is required to make it usable. Clearly the error, or the hallucinated permission failure, is cached somewhere in the bowels of iOS's FileManager.

I was hoping startAccessingSecurityScopedResource() would allow me to bypass such a cache, as it does with URL.resourceValues() returning stale file sizes and last-modified times. But it does not. Is there some way to clear this state without popping up a UI with an Exit button (not exactly the desired iOS user experience)?
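A minimal sketch of the priming pattern under discussion, with a metadata check before any blocking read; the URL is whatever item the index says should exist. The resource-value keys are documented API, but whether checking status first avoids the cached error 257 is exactly what the post is asking:

import Foundation

func prime(_ url: URL) {
    do {
        // Ask iCloud to begin materializing the item; returns immediately.
        try FileManager.default.startDownloadingUbiquitousItem(at: url)
    } catch {
        print("Could not start download for \(url.lastPathComponent): \(error)")
    }
}

func isMaterialized(_ url: URL) -> Bool {
    // Check download status from metadata instead of reading,
    // to avoid blocking on a file that is still arriving.
    let values = try? url.resourceValues(forKeys: [.ubiquitousItemDownloadingStatusKey])
    return values?.ubiquitousItemDownloadingStatus == .current
}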
1
0
53
Aug ’25
Setting Required Capabilities for Foundation Models
Is there any way to ensure that iOS apps we develop using Foundation Models can only be purchased/downloaded on the App Store by people with capable devices? I would have thought there would be a Required Capabilities entry that the App Store would hook into, but I don't see one in the documentation here: https://developer.apple.com/documentation/bundleresources/information-property-list/uirequireddevicecapabilities

The closest seems to be iphone-performance-gaming-tier, which appears to target all M1-and-above chips on iPhone and iPad. There is an ipad-minimum-performance-m1 that would more reliably ensure Foundation Models is available, but that doesn't help with iPhone.

So far, the only path seems to be setting the Minimum Deployment to iOS 26 and adding iphone-performance-gaming-tier as a required capability, but I'm worried that capability might diverge in the future from what is Foundation Models / Apple Intelligence capable.

I understand that the majority of apps will want to add Apple Intelligence features selectively so they remain usable on devices that don't support them, but the app experience I'm building doesn't make sense without Foundation Models, and I'd rather not have a large number of users download the app only to be told, "Sorry, you're not Apple Intelligence capable."
2
2
188
4w
iOS 26 AttributedString TextEditor bold and italic
Is there a way to constrain the AttributedTextFormattingDefinition of a TextEditor to only bold/italic/underline/strikethrough, without showing the full font UI?

struct CustomFormattingDefinition: AttributedTextFormattingDefinition {
    struct Scope: AttributeScope {
        let font: AttributeScopes.SwiftUIAttributes.FontAttribute
        let underlineStyle: AttributeScopes.SwiftUIAttributes.UnderlineStyleAttribute
        let strikethroughStyle: AttributeScopes.SwiftUIAttributes.StrikethroughStyleAttribute
    }

    var body: some AttributedTextFormattingDefinition<Scope> {
        ValueConstraint(for: \.font,
                        values: [MyStyle.defaultFont, MyStyle.boldFont, MyStyle.italicFont, MyStyle.boldItalicFont],
                        default: MyStyle.defaultFont)
        ValueConstraint(for: \.underlineStyle, values: [nil, .single], default: .single)
        ValueConstraint(for: \.strikethroughStyle, values: [nil, .single], default: .single)
    }
}

If I remove the font attribute from the scope, the Bold and Italic buttons disappear. And if the font attribute is defined, even when it's restricted to one typeface in four different styles, the full typography UI is displayed.
0
0
51
Aug ’25
Inconsistent VoIP Push Behavior Post Network Restoration
We are observing unexpected behavior in Apple Push Notification Service (APNS) delivery and would appreciate clarification and guidance. Below is a detailed breakdown of the scenario and related questions.

Abbreviations:
APNP - Apple Push Notification Provider
APNS - Apple Push Notification Service

Scenario:
User1 is registered on iOS device1.
Flight mode is enabled on device1.
User2 initiates a call to User1 (t = 0 s).
User2 cancels the outgoing call after 5 seconds (t = 5 s).
Flight mode is disabled on device1 20 seconds later (t = 25 s).

Observation: device1 displays an incoming-call notification (CallKit UI) after flight mode is turned off, despite the call having been cancelled by User2. This notification disappears automatically after approximately 8-10 seconds.

Logic flow: at t = 0, our APNP sends a priority VoIP push to APNS for the incoming call. Since device1 is in flight mode, APNS cannot deliver it. At t = 25 s, after flight mode is turned off, APNS delivers the cached VoIP push to device1. The app takes ~5 seconds to initialize (CSDK setup, SIP registration, etc.) and eventually receives a SIP NOTIFY with state="full" and empty dialog info (indicating no active call). Consequently, the CallKit incoming call is removed after ~8 seconds.

Questions:

→ We set the apns-expiration header to 0, expecting that the VoIP push would not be delivered if the device was unreachable when the push was sent. However, APNS still delivers the push 20-30 seconds later, once the device is back online.
Q. Why is the apns-expiration header not respected in this case?

→ Upon receiving the VoIP push, we require ~10-12 seconds to determine whether a visible CallKit notification is still relevant (e.g., by completing SIP registration and checking for active dialogs).
Q. Is it acceptable, per Apple guidelines, to intentionally delay showing the CallKit incoming-call UI for 10-15 seconds after receiving the VoIP push?

→ Apple documentation states that the priority VoIP push channel should be used only for notifying incoming calls, while regular (non-VoIP) pushes should be used for other updates, including call cancellations.
Q. What is the rationale behind discouraging use of the priority VoIP push channel for call-cancellation events? In some cases, an immediate cancellation notification is as critical as the initial incoming call. Would Apple consider it acceptable to occasionally use the priority VoIP channel for rare call-cancellation scenarios without risking throttling or suspension?

→ In our implementation, we send an incoming-call notification via the priority VoIP channel. Shortly after, we send a call-cancellation notification on the regular push channel, marked with "content-available": 1. We expect this regular push to wake the app (triggering application:didReceiveRemoteNotification:fetchCompletionHandler:), but in practice the app never wakes, and our debug logs inside that delegate method never appear.
Q. Under what exact conditions does a "content-available": 1 regular push fail to wake the app when it follows a VoIP push? Are there additional requirements (e.g., background modes, rate limits, power optimizations) that could prevent the delegate from being called?

→ According to Apple documentation: "APNs stores only one notification per bundle ID. When multiple notifications are sent to the same device for the same bundle ID, APNs keeps only the latest one." However, in our tests, if a device is offline when APNs receives both (a) a priority VoIP push for an incoming call and (b) a regular push for call cancellation (same bundle ID), then upon the device reconnecting APNs still delivers the earlier VoIP push, instead of discarding it and delivering only the most recent (cancellation) notification.
Q. Why doesn't APNs replace the queued VoIP push with the newer regular push when both share the same bundle ID? Is this expected behavior due to channel-type differences (VoIP vs. regular), or is there a way to ensure that the latest notification (even if regular) supersedes the earlier VoIP push?

We'd appreciate your input or recommendations on handling such delayed pushes, and any best practices for VoIP push expiration handling and call-UI timing.
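For reference on the expiration question, a hedged sketch of the provider-side request, assuming an HTTP/2 connection to APNs with token-based authentication; the device token, JWT, bundle ID, and payload are placeholders. Per the header documentation, apns-expiration set to 0 means APNs attempts delivery once and does not store the notification, which is why the observed 20-30-second redelivery is surprising:

import Foundation

let deviceToken = "hexDeviceToken"    // placeholder
let providerJWT = "providerAuthJWT"   // placeholder ES256 provider token
let bundleID = "com.example.voipapp"  // placeholder

var request = URLRequest(url: URL(string: "https://api.push.apple.com/3/device/\(deviceToken)")!)
request.httpMethod = "POST"
request.setValue("bearer \(providerJWT)", forHTTPHeaderField: "authorization")
request.setValue("voip", forHTTPHeaderField: "apns-push-type")          // VoIP channel
request.setValue("\(bundleID).voip", forHTTPHeaderField: "apns-topic")  // VoIP topic is bundle ID + ".voip"
request.setValue("0", forHTTPHeaderField: "apns-expiration")            // deliver now or discard
request.setValue("10", forHTTPHeaderField: "apns-priority")             // immediate delivery
request.httpBody = try? JSONSerialization.data(withJSONObject: ["caller": "User2"])

// URLSession negotiates HTTP/2, which the APNs provider API requires.
URLSession.shared.dataTask(with: request) { _, response, _ in
    print(response ?? "no response")
}.resume()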
0
1
58
Aug ’25
New app taking over the Bundle ID of an old app
Hi everyone,

We have brought iOS development in-house from a consulting firm and have developed a new app that will replace the old one. To minimize disruption for users of the old app during this upgrade, we would like to release the new app as an update to the old one, using the old app's Bundle ID. It is important to note that the old app has already been released through the App Store. The two apps share no code and are incompatible on almost every point.

Is it possible to change the new app's Bundle ID to the old one, and to delete the old app's UserDefaults during the transition? We look forward to your input!
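On the UserDefaults half of the question, a minimal sketch of wiping the old app's standard defaults on the first launch of the replacement app; the marker key name is made up for illustration, and this only covers the standard suite, not app-group or keychain data:

import Foundation

let firstLaunchKey = "didResetLegacyDefaults" // hypothetical marker key

let defaults = UserDefaults.standard
if !defaults.bool(forKey: firstLaunchKey) {
    // Removes every value the old app stored in the standard suite.
    if let bundleID = Bundle.main.bundleIdentifier {
        defaults.removePersistentDomain(forName: bundleID)
    }
    defaults.set(true, forKey: firstLaunchKey)
}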
4
0
64
4w
How to keep a bottom toolbar pinned when the sheet's detent is changed in code
I am trying to add a pinned (always visible, never moving) toolbar to the bottom of a UISheetPresentationController / presentationDetents sheet. This works while the user is dragging to resize the sheet, but not when the detent is changed in code. For example, when changing from medium to large, the view controller is first resized to large and then moved up into position; from the user's point of view, anything at the bottom disappears for a few frames and then reappears.

From my testing, when using UISheetPresentationController the view controller only knows the medium and large states' frames and bounds, so setting it up manually won't work. Is there a way to keep views at the bottom always visible in this case? The sample code below is in SwiftUI for easy recreation; responses can be in UIKit.

Sample code:

struct ContentView: View {
    @State var showSheet: Bool = true
    @State var detent: PresentationDetent = .medium

    var body: some View {
        EmptyView()
            .sheet(isPresented: $showSheet) {
                ScrollView {
                }
                .safeAreaInset(edge: .bottom) {
                    Button {
                        detent = (detent == .medium) ? .large : .medium
                    } label: {
                        Text("Change Size")
                    }
                    .buttonStyle(.borderedProminent)
                    .presentationDetents([.medium, .large], selection: $detent)
                    .presentationBackgroundInteraction(.enabled)
                }
            }
    }
}
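In UIKit, the programmatic detent change the post describes would typically go through the sheet's animateChanges block. A sketch, assuming a presented controller already configured with medium and large detents; whether the frame jump described above also occurs on this path is the open question:

import UIKit

func toggleDetent(of presented: UIViewController) {
    guard let sheet = presented.sheetPresentationController else { return }
    // animateChanges animates the detent switch alongside other sheet changes.
    sheet.animateChanges {
        sheet.selectedDetentIdentifier =
            (sheet.selectedDetentIdentifier == .medium) ? .large : .medium
    }
}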
1
0
78
Aug ’25
Delay in Microphone Input When Talking While Receiving Audio in PTT Framework (Full Duplex Mode)
Context: I am currently developing an app using the Push-to-Talk (PTT) framework. I have reviewed both the PTT framework documentation and the CallKit demo project to better understand how to properly manage audio session activation and AVAudioEngine setup.

I am not activating the audio session manually. The audio session configuration is handled in the incomingPushResult or didBeginTransmitting callbacks from the PTChannelManagerDelegate. I am using a single AVAudioEngine instance for both input and playback; the engine is started in the didActivate callback from the PTChannelManagerDelegate. When I receive a push in full-duplex mode, I set the active participant to the user who is speaking.

Issue: when I attempt to talk while the other participant is already speaking, my tap on the input node takes a few seconds to return valid PCM audio data; initially it returns empty PCM buffers.

Details:
The audio session is already active and configured with .playAndRecord.
The input tap is already installed when the engine is started.
When I talk from a neutral state (no one is speaking), the system plays the standard "microphone activation" tone, which covers this initial delay. However, this does not happen when I am already receiving audio.

Assumptions / current setup: because the audio session is active in play-and-record, I assumed microphone input would be available immediately, even while receiving audio. However, there seems to be a delay before valid input is delivered to the tap, occurring only when switching from a receiving state to simultaneously talking.

Questions:
Is this expected behavior when using the PTT framework in full-duplex mode with a shared AVAudioEngine?
Should I restart or reconfigure the engine or audio session when beginning to talk while receiving audio?
Is there a recommended pattern for managing microphone readiness in this scenario, to avoid the initial empty PCM buffers?
Would using separate engines for input and output improve responsiveness?

I would like to confirm the correct approach to handling simultaneous talk and receive in full-duplex mode using the PTT framework and AVAudioEngine; specifically, I need guidance on ensuring the microphone is ready to capture audio immediately, without the delay seen in my current implementation.

Relevant code snippets:

Engine setup:

func setup() {
    let input = audioEngine.inputNode
    do {
        try input.setVoiceProcessingEnabled(true)
    } catch {
        print("Could not enable voice processing \(error)")
        return
    }
    input.isVoiceProcessingAGCEnabled = false

    let output = audioEngine.outputNode
    let mainMixer = audioEngine.mainMixerNode

    audioEngine.connect(pttPlayerNode, to: mainMixer, format: outputFormat)
    audioEngine.connect(beepNode, to: mainMixer, format: outputFormat)
    audioEngine.connect(mainMixer, to: output, format: outputFormat)

    // Initialize converters
    converter = AVAudioConverter(from: inputFormat, to: outputFormat)!
    f32ToInt16Converter = AVAudioConverter(from: outputFormat, to: inputFormat)!

    audioEngine.prepare()
}

Input tap installation:

func installTap() {
    guard AudioHandler.shared.checkMicrophonePermission() else {
        print("Microphone not granted for recording")
        return
    }
    guard !isInputTapped else {
        print("[AudioEngine] Input is already tapped!")
        return
    }

    let input = audioEngine.inputNode
    let microphoneFormat = input.inputFormat(forBus: 0)
    let microphoneDownsampler = AVAudioConverter(from: microphoneFormat, to: outputFormat)!
    let desiredFormat = outputFormat
    let inputFramesNeeded = AVAudioFrameCount((Double(OpusCodec.DECODED_PACKET_NUM_SAMPLES) * microphoneFormat.sampleRate) / desiredFormat.sampleRate)

    input.installTap(onBus: 0, bufferSize: inputFramesNeeded, format: input.inputFormat(forBus: 0)) { [weak self] buffer, when in
        guard let self = self else { return }

        // Output buffer: 1920 frames at 16kHz
        guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: desiredFormat,
                                                  frameCapacity: AVAudioFrameCount(OpusCodec.DECODED_PACKET_NUM_SAMPLES)) else { return }
        outputBuffer.frameLength = outputBuffer.frameCapacity

        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            outStatus.pointee = .haveData
            return buffer
        }

        var error: NSError?
        let converterResult = microphoneDownsampler.convert(to: outputBuffer, error: &error, withInputFrom: inputBlock)
        if converterResult != .haveData {
            DebugLogger.shared.print("Downsample error \(converterResult)")
        } else {
            self.handleDownsampledBuffer(outputBuffer)
        }
    }
    isInputTapped = true
}
4
0
274
3w
When removing a ToolbarItem from the navigation bar, how do I make the remaining ToolbarItems resize correctly?
Because .searchable does not allow customizing the buttons in the search bar, I've had to manually recreate the search bar as shown below. However, when removing one of the items in the search bar, the TextField does not resize correctly and effectively gains padding on the leading edge. When the TextField is focused, it resizes and fills the entire space. If the "Compose" button was already hidden when the search bar is presented, it lays out correctly. How do I make the TextField resize automatically after removing the "Compose" button?

Thanks, jjp

struct ContentView: View {
    @State var isSearchBarVisible = false
    @State var isComposingMessage = false
    @State var searchText = ""

    let items: [String] = ["hey", "there", "how", "are", "you"]

    var searchItems: [String] {
        items.filter { item in
            item.lowercased().contains(searchText.lowercased())
        }
    }

    var body: some View {
        NavigationStack {
            VStack {
                List {
                    if !searchText.isEmpty {
                        ForEach(searchItems, id: \.self) { item in
                            Text(item)
                        }
                    } else {
                        ForEach(items, id: \.self) { item in
                            Text(item)
                        }
                    }
                }
            }
            .toolbar {
                if isSearchBarVisible {
                    ToolbarItem(placement: .principal) {
                        TextField("Search", text: $searchText)
                            .padding(8)
                            .background(Color.gray.opacity(0.2))
                    }
                    ToolbarItem(placement: .topBarTrailing) {
                        Button(action: {
                            isSearchBarVisible = false
                        }, label: {
                            Text("Cancel")
                        })
                    }
                    if !isComposingMessage {
                        ToolbarItem(placement: .topBarTrailing) {
                            Button(action: {
                                isComposingMessage.toggle()
                            }, label: {
                                Text("Compose")
                            })
                        }
                    }
                } else {
                    ToolbarItem(placement: .topBarLeading) {
                        Button(action: {
                            isSearchBarVisible = true
                        }, label: {
                            Text("Search")
                        })
                    }
                    ToolbarItem(placement: .principal) {
                        Text("Title")
                    }
                    ToolbarItem(placement: .topBarTrailing) {
                        Button(action: {
                            isComposingMessage.toggle()
                        }, label: {
                            Text("Compose")
                        })
                    }
                }
            }
        }
    }
}
1
0
82
Aug ’25
Problem with login access and security on iOS 26, iPhone 13 Pro Max
Hi, I have an iPhone 13 Pro Max running iOS 26.0 (beta). After installing the update, I get a message stating that I need to update my Apple Account settings. When I log in, it freezes and nothing can be done. I also can't access the login and security settings, as that also freezes and won't progress. In addition, the option to install beta versions has disappeared, and I don't know why. Could it be that since I have the latest beta, the option disappears until a new update is available? Please help!
0
0
183
Aug ’25
Sandbox restriction Error 159: Tap to Pay integration in iOS app
Hi all, I hope someone can assist me with the error below, if you have run into it during Tap to Pay integration on iOS/iPadOS devices.

Precondition: I have received the required entitlements for the Tap to Pay integration and created a sandbox test account to validate my development work, and I am using an updated development profile with the new capabilities required for the integration.

When I try to test my flow, I keep receiving this error, with multiple sandbox accounts in the developer portal:

Error (refreshContext): proxy error handler [ Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.merchantd.transaction was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.merchantd.transaction was invalidated: failed at lookup with error 159 - Sandbox restriction.} ]

I would greatly appreciate it if anyone who has faced this issue, or has any knowledge of how to address it, could share their experience. Thanks in advance.

Best regards, Govardhan.
0
0
114
Jul ’25