I have an iOS app, and I am trying to add a companion watchOS app. My iOS app depends on two libraries:
GoogleMobileAds
FirebaseAnalyticsWithoutAdIdSupport
When I add a new watchOS target, the preview build starts to fail. I am not adding any libraries to the watch target; the Google Mobile Ads and Firebase Analytics libraries are only under the iOS target.
I am unable to run the preview: building the watch scheme fails with an error every time. I've included the error log below.
Here are the steps I've tried so far:
Delete folders inside Derived Data
Run a clean build (Cmd + Option + Shift + K)
Delete scheme and create a new one
Reset Package Cache
Restart Xcode
Restart MacBook
But it just does not work. I do not understand why the watchOS target errors on "GoogleUserMessagingPlatform" and "GoogleMobileAdsTarget" when those packages are not linked to or used by the watchOS target.
SchemeBuildError: Failed to build the scheme “timerWatch Watch App”
While building for watchOS Simulator, no library for this platform was found in '/Users/k/Library/Developer/Xcode/DerivedData/timer-dhkdhvfcqtfgskfdxpmupujswtuh/SourcePackages/artifacts/swift-package-manager-google-user-messaging-platform/UserMessagingPlatform/UserMessagingPlatform.xcframework'. (in target 'UserMessagingPlatformTarget' from project 'GoogleUserMessagingPlatform')
Build target UserMessagingPlatformTarget:
/Users/k/Library/Developer/Xcode/DerivedData/timer-dhkdhvfcqtfgskfdxpmupujswtuh/SourcePackages/artifacts/swift-package-manager-google-user-messaging-platform/UserMessagingPlatform/UserMessagingPlatform.xcframework:1:1: error: While building for watchOS Simulator, no library for this platform was found in '/Users/k/Library/Developer/Xcode/DerivedData/timer-dhkdhvfcqtfgskfdxpmupujswtuh/SourcePackages/artifacts/swift-package-manager-google-user-messaging-platform/UserMessagingPlatform/UserMessagingPlatform.xcframework'. (in target 'UserMessagingPlatformTarget' from project 'GoogleUserMessagingPlatform')
Build target GoogleMobileAdsTarget:
/Users/k/Library/Developer/Xcode/DerivedData/timer-dhkdhvfcqtfgskfdxpmupujswtuh/SourcePackages/artifacts/swift-package-manager-google-mobile-ads/GoogleMobileAds/GoogleMobileAds.xcframework:1:1: error: While building for watchOS Simulator, no library for this platform was found in '/Users/k/Library/Developer/Xcode/DerivedData/timer-dhkdhvfcqtfgskfdxpmupujswtuh/SourcePackages/artifacts/swift-package-manager-google-mobile-ads/GoogleMobileAds/GoogleMobileAds.xcframework'. (in target 'GoogleMobileAdsTarget' from project 'GoogleMobileAds')
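For what it's worth, the SDKs are only referenced from iOS-only source files, guarded along these lines (a minimal sketch with a hypothetical AdsManager type), so nothing in the watch target should ever touch them:

#if os(iOS)
import GoogleMobileAds

// Hypothetical iOS-only wrapper around the ads SDK; this file is a member
// of the iOS target only, and the guard keeps it out of any other platform.
final class AdsManager {
    static let shared = AdsManager()

    func start() {
        GADMobileAds.sharedInstance().start(completionHandler: nil)
    }
}
#endif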
Howdy,
I'm trying to figure out how to replicate the following behavior for our app:
The system is able to ascertain that the Mac equivalent of some iOS app is installed locally, and it prevents notifications from being mirrored. However, I am unable to determine how this association is inferred. When I check our iOS app under this prefpane, the switch remains enabled and toggleable; we'd like to behave like Slack here.
My initial assumption is that an app group containing both the Mac and iOS apps can be used to create the association; however, I would like to confirm that this is indeed the case before doing so. I'm not terribly confident about this.
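If the app-group theory is right, I'd expect a shared-container check along these lines to work from both apps (a minimal sketch; the group identifier is hypothetical):

import Foundation

// Hypothetical app group; both the Mac and iOS targets would need the
// com.apple.security.application-groups entitlement containing this value.
let groupID = "group.com.company.app.shared"

if let shared = UserDefaults(suiteName: groupID) {
    // Write from one app, read from the other, to confirm the container
    // is actually shared across platforms.
    shared.set(Date(), forKey: "lastSeen")
}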
Details:
The bundle identifiers of both apps do not match. This also applies to Slack; its iOS app is com.tinyspeck.chatlyio while its Mac app is com.tinyspeck.slackmacgap.
In our case, the iOS app's identifier is like com.company.app while the Mac app's identifier is com.company.app.desktop.
Both apps are signed with certificates that have matching team identifiers. The com.apple.developer.team-identifier entitlement is present on the Mac app.
The Mac app shares a keychain access group with the iOS app.
The Mac app is not sandboxed.
The Mac app is an Electron app.
The Mac app does not use APNs. It sends notifications "locally".
I currently only have the iOS app installed on my iPhone via TestFlight, if that matters.
Notification mirroring does work, but we'd like to forcibly disable this by associating the apps together.
To my knowledge, the iOS app makes use of both a UNNotificationServiceExtension and a UNNotificationContentExtension.
The iOS app currently doesn't have an assigned category (at least in Xcode). The Mac app is currently miscategorized as a developer tool (LSApplicationCategoryType = "public.app-category.developer-tools";), but that should be fixed.
(Redacted) bundle information for the Mac app:
CFBundleDisplayName = App;
CFBundleExecutable = "App Desktop";
CFBundleName = App;
Note that our CFBundleExecutable differs from the bundle's display name/name because we're currently migrating our users to a new version of the app that they'd likely want to keep alongside the old one. The filename of the bundle itself is, similarly, App Desktop.app.
For the iOS app, to my knowledge, the CFBundleName and CFBundleDisplayName are App.
When a new application runs on the iOS 18.4 simulator and tries to access the Speech framework, prompting a request for authorisation to use speech recognition, the application crashes if the user taps Allow. The same issue occurs in the visionOS 2.4 simulator.
Using Swift 6. Report Identifier: FB17686186
/// Checks speech recognition availability and requests necessary permissions.
@MainActor
func checkAvailabilityAndPermissions() async {
    logger.debug("Checking speech recognition availability and permissions...")

    // 1. Verify that the speechRecognizer instance exists
    guard let recognizer = speechRecognizer else {
        logger.error("Speech recognizer is nil - speech recognition won't be available.")
        reportError(.configurationError(description: "Speech recognizer could not be created."), context: "checkAvailabilityAndPermissions")
        self.isAvailable = false
        return
    }

    // 2. Check recognizer availability (might change at runtime)
    if !recognizer.isAvailable {
        logger.error("Speech recognizer is not available for the current locale.")
        reportError(.configurationError(description: "Speech recognizer not available."), context: "checkAvailabilityAndPermissions")
        self.isAvailable = false
        return
    }
    logger.trace("Speech recognizer exists and is available.")

    // 3. Request Speech Recognition Authorization
    // IMPORTANT: Add `NSSpeechRecognitionUsageDescription` to Info.plist
    let speechAuthStatus = SFSpeechRecognizer.authorizationStatus()
    logger.debug("Current Speech Recognition authorization status: \(speechAuthStatus.rawValue)")

    if speechAuthStatus == .notDetermined {
        logger.info("Requesting speech recognition authorization...")
        // Use structured concurrency to wait for permission result
        let authStatus = await withCheckedContinuation { continuation in
            SFSpeechRecognizer.requestAuthorization { status in
                continuation.resume(returning: status)
            }
        }
        logger.debug("Received authorization status: \(authStatus.rawValue)")
        // Now handle the authorization result
        let speechAuthorized = (authStatus == .authorized)
        handleAuthorizationStatus(status: authStatus, type: "Speech Recognition")
        // If speech is granted, now check microphone
        if speechAuthorized {
            await checkMicrophonePermission()
        }
    } else {
        // Already determined, just handle it
        let speechAuthorized = (speechAuthStatus == .authorized)
        handleAuthorizationStatus(status: speechAuthStatus, type: "Speech Recognition")
        // If speech is already authorized, check microphone
        if speechAuthorized {
            await checkMicrophonePermission()
        }
    }
}
Getting this error in iPhone portrait mode on devices with a notch.
Currently using AVQueuePlayer to play more than 30 MP3 files one by one.
All constraint properties are correct, but the error occurs only on notch-series iPhones in portrait mode. The same code works on the same iPhone in landscape mode.
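For context, this is roughly how the queue is built (a minimal sketch; the file names are placeholders):

import AVFoundation

// Hypothetical bundled MP3 files played back to back.
let urls = (1...30).compactMap { index in
    Bundle.main.url(forResource: "track\(index)", withExtension: "mp3")
}

// AVQueuePlayer advances through its items automatically.
let player = AVQueuePlayer(items: urls.map { AVPlayerItem(url: $0) })
player.play()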
**But I get this error:**
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
Type: Error | Timestamp: 2025-02-07 | Process: | Library: AudioToolbox | Subsystem: com.apple.coreaudio | Category: aqme | TID: 0x42754
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
LoudnessManager.mm:709 unable to open stream for LoudnessManager plist
Timestamp: 2025-02-07 | Library: AudioToolbox | Subsystem: com.apple.coreaudio | Category: aqme
Hi,
I'm trying to set up a PIR service for Live Caller ID Lookup (in Python, but based on the Swift example: https://github.com/apple/live-caller-id-lookup-example). The Swift example provides utilities for database setup and encryption, but I can't find any specification of which key is used for database encryption and how the iOS system learns about this key in order to construct the PIR requests.
So my question is: how does the PIR service communicate the secret key to the iOS system, or vice versa? (Specific to the test environment, before onboarding.)
Problem Description
I am using CLLocationManager to obtain the device's compass heading (direction), and I have encountered an abnormal behavior:
When the user is stationary: After calling startUpdatingHeading(), the CLHeading object returned in the locationManager(_:didUpdateHeading:) callback correctly reflects the device’s actual physical orientation (i.e., the direction the top of the device is pointing) in terms of magnetic north / true north, via the magneticHeading and trueHeading properties. When I rotate the device, the heading values change accordingly — this is the expected behavior.
But when the user is in motion (e.g., driving a car): Even if I rotate the device, the values of magneticHeading and trueHeading no longer reflect the device’s actual orientation. Instead, they consistently return what appears to be the user's or vehicle's travel direction (forward direction). In other words, the compass behaves as if it is reporting the direction of motion rather than the device’s actual facing direction.
Only after the user has completely stopped moving, does rotating the device again result in magneticHeading and trueHeading reflecting the actual device orientation as expected.
However, on another device running iOS 16 (iPhone XR), this behavior does not occur — everything works normally.
Expected Behavior
I expect that regardless of whether the user is moving or not, the CLHeading values returned by CLLocationManager should always represent the physical orientation of the device itself (i.e., which direction the top of the device is pointing), as a standard compass should.
Actual Behavior
User is stationary, rotating the device: magneticHeading / trueHeading change properly according to the device’s actual orientation
User is in motion (e.g., driving): magneticHeading / trueHeading remain fixed to the direction of motion (travel direction) and do not change when the device is rotated
User stops moving, then rotates the device: the compass behaves normally again, reflecting the actual device orientation
Environment Information
iOS Version: iOS 26.0.1
Device Models: iPhone 15 Pro / iPhone 17 Pro
Xcode Version: Xcode 26.0.1
Language: Objective-C
Questions
Is this a known issue in iOS? Are there any related radars or official documentation about it?
Have other developers encountered similar issues, especially where CLHeading behaves incorrectly when the user is in motion?
Do I need to set any specific parameters in CLLocationManager (such as headingOrientation) to resolve or work around this issue?
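For reference, this is roughly how heading updates are configured today (a minimal Swift sketch; the actual app is Objective-C, and the delegate wiring is elided):

import CoreLocation

let manager = CLLocationManager()

// Tell Core Location which device edge counts as "up", so the reported
// heading should follow the physical orientation of the device.
manager.headingOrientation = .portrait
manager.headingFilter = 1 // degrees of change required before a new callback

if CLLocationManager.headingAvailable() {
    manager.startUpdatingHeading()
}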
🙏 Thank you for your help — any insights, experiences, or official feedback regarding this issue would be greatly appreciated!
I have an application named "XY" that has been launched in several countries. Now, I intend to launch it in Turkey, but we are facing legal issues preventing us from using "XY" as the app's display name. Following the documentation, I localized the app's display name to "ZX" for both Turkish and English (Turkey). However, when users change their device settings, they do not see an option for English (Turkey) language selection. I assumed that for Turkish users, English (Turkey) would be the default language, but this is not the case. Could someone please assist me in resolving this issue? I've investigated options for localizing the display name based on region, but it seems that this functionality isn't feasible on iOS. In contrast, it's relatively straightforward to achieve on Android platforms.
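For reference, the per-language overrides look like this (a minimal sketch of the localized InfoPlist.strings files; en-TR was added as a custom localization, which is part of what doesn't seem to take effect):

// tr.lproj/InfoPlist.strings
"CFBundleDisplayName" = "ZX";

// en-TR.lproj/InfoPlist.strings
"CFBundleDisplayName" = "ZX";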
Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
I am a developer on an enterprise application. Our team just updated our pipeline to build our app against the iOS 18 SDK instead of the 17.4 SDK, and this has caused a lot of our UI elements to change, plus several crashes within the app with just the terse error message "Swift runtime failure: unhandled C++ / Objective-C exception".
Why is merely updating the SDK causing all these issues? Is there any way to keep the previous behavior, or will we have to go component by component to fix the constraints and crashes? These issues seem to affect our users on iOS 18 and later.
I need a layout with a ScrollView holding some content, where the ScrollView has a full-screen background image. The screen is pushed as a detail on a navigation stack.
When the screen is pushed we display a navigation bar, and we want the new scrollEdgeEffectStyle .soft style to work. But when we scroll, the gradient blur effect below the bars is fixed to the top and bottom of the scroll view's background image and is not transparent. However, when the content underneath the navigation bar is darker and the navigation bar adapts automatically to it, the final effect looks as expected and doesn't use the background image.
The expected behaviour for us is that the effect under the navigation bar would not use the background image but would be transparent based on the content underneath.
This is how it looks initially, before the user has interacted with the screen:
This is how it looks when user scrolls down:
This is how it looks when navigation bar adapts to dark content underneath:
Minimal code to reproduce this behaviour:
import SwiftUI

@main
struct SwiftUIByExampleApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        NavigationStack {
            ScrollView(.vertical) {
                VStack(spacing: 0.0) {
                    ForEach(1 ..< 101, id: \.self) { i in
                        HStack {
                            Text("Row \(i)")
                            Spacer()
                        }
                        .frame(height: 50)
                        .background(Color.random)
                    }
                }
            }
            .scrollEdgeEffectStyle(.soft, for: .all)
            .scrollContentBackground(.hidden)
            .toolbar {
                ToolbarItem(placement: .title) {
                    Label("My Awesome App", systemImage: "sparkles")
                        .labelStyle(.titleAndIcon)
                }
            }
            .toolbarRole(.navigationStack)
            .background(
                ZStack {
                    Color.white
                        .ignoresSafeArea()
                    Image(.sea)
                        .resizable()
                        .ignoresSafeArea()
                        .scaledToFill()
                }
            )
        }
    }
}

extension Color {
    static var random: Color {
        Color(
            red: .random(in: 0...1),
            green: .random(in: 0...1),
            blue: .random(in: 0...1)
        )
    }
}
We've also tried using a ZStack instead of the .background modifier, but we observed the same results.
We basically want to achieve the same effect as showcased here, but with a static background image:
https://youtu.be/3MugGCtm26A?si=ALG29NqX1jAMacM5&t=634
Hi Team,
I'm using the simple mailto functionality on the demo page below, and when I tested it via Safari, the mailto functionality is not working.
However, the same feature works as expected in Chrome.
Demo: https://jsfiddle.net/xut0ed4y/
Kindly help me to resolve this issue.
Device: iPhone 16 Pro Max
System Version: 18.3.1
Screen width and height obtained using [UIScreen mainScreen].bounds.size are as follows
Why are the two results different?
Hello,
I’m developing a third-party VoIP app called Heyno and trying to support Siri-initiated calls so they behave like WhatsApp / FaceTime, especially from the lock screen.
Target behavior
From the locked device, the user says:
“Hey Siri, call <contact> using Heyno”
Expected result:
• System CallKit audio-call UI appears.
• No “Continue in Heyno” sheet, no forced unlock or foregrounding.
• Our app handles the VoIP leg in the background via CXProviderDelegate.
WhatsApp already does this with:
“Hey Siri, call <contact> on WhatsApp”
I’m trying to reproduce that behavior for Heyno using public APIs.
I have followed the SiriKit + CallKit VoIP docs but cannot get a clean Siri → CallKit → app flow from the lock screen without either:
Being forced into .continueInApp (unlock + foreground), or
Hitting CallKit transaction errors when starting the call from the app in response to the intent.
Current implementation
Intents extension (INStartCallIntentHandling)
• resolveContacts(for:with:) normalizes to E.164 and returns INPersonResolutionResult.success.
• resolveDestinationType → .success(.normal).
• resolveCallCapability → .success(.audioCall).
Confirm / handle currently:
func confirm(intent: INStartCallIntent,
             completion: @escaping (INStartCallIntentResponse) -> Void) {
    completion(INStartCallIntentResponse(code: .ready, userActivity: nil))
}

func handle(intent: INStartCallIntent,
            completion: @escaping (INStartCallIntentResponse) -> Void) {
    completion(INStartCallIntentResponse(code: .ready, userActivity: nil))
}
Earlier, I used .continueInApp with an NSUserActivity carrying the normalized number and metadata, but that always produced a “Continue in Heyno” sheet that requires unlock and foreground, which breaks the lock-screen Siri flow.
App target – CallKit provider
In the app I have CXProvider + CXProviderDelegate, which work correctly when calls are initiated from inside the app:
func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
    let handle = action.handle.value
    // Start VoIP / WebRTC / LiveKit / Asterisk call here
    provider.reportOutgoingCall(with: action.callUUID,
                                startedConnectingAt: Date())
    provider.reportOutgoingCall(with: action.callUUID,
                                connectedAt: Date())
    action.fulfill()
}
If I construct a CXStartCallAction and submit it via CXCallController.request(...) from the app, CallKit UI appears and our pipeline runs correctly.
What I tried and what fails
Starting CallKit from the Intents extension
Calling CXCallController.request(...) directly from handle(intent:completion:) in the extension always yields:
com.apple.CallKit.error.requesttransaction error 1 (unentitled)
The extension does not have the CallKit entitlement, and the docs say not to initiate calls from the extension, so this path seems unsupported.
Using .continueInApp + NSUserActivity
Pattern:
• handle(intent:) builds NSUserActivity (activityType = NSStringFromClass(INStartCallIntent.self), title = "Heyno Start Call", userInfo with E.164 handle, etc.).
• Returns INStartCallIntentResponse(code: .continueInApp, userActivity: activity).
• App receives the activity, then starts CallKit + VoIP.
Functionally this works, but iOS always requires unlock + foreground (“Continue in Heyno”), which is not acceptable for a Siri lock-screen call.
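For completeness, this is roughly the app-side continuation used in that pattern (a minimal sketch; the userInfo key is our own):

import CallKit
import Intents

func handleContinuation(_ activity: NSUserActivity) {
    // "handle" is the E.164 number the extension stashed in userInfo.
    guard let number = activity.userInfo?["handle"] as? String else { return }

    let startAction = CXStartCallAction(call: UUID(),
                                        handle: CXHandle(type: .phoneNumber, value: number))

    CXCallController().request(CXTransaction(action: startAction)) { error in
        if let error {
            print("CXStartCallAction failed: \(error)")
        }
    }
}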
App group + Darwin notification (extension → app → CallKit)
Experiment:
• Extension writes the normalized number into an app-group UserDefaults.
• Extension posts a Darwin notification.
• App (if running) listens, reads the number, and initiates CXStartCallAction + VoIP.
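The bridge itself looks roughly like this (a minimal sketch of the extension side; the group and notification names are our own):

import Foundation

let darwinName = "com.heyno.startCall" as CFString

// Extension side: stash the number in the shared container, then ping the app.
func postStartCall(number: String) {
    UserDefaults(suiteName: "group.com.heyno.shared")?
        .set(number, forKey: "pendingCall")
    CFNotificationCenterPostNotification(CFNotificationCenterGetDarwinNotifyCenter(),
                                         CFNotificationName(darwinName),
                                         nil, nil, true)
}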
Observed:
• Works only when the app is already running in the background; a killed app is not woken.
• In some states I see CXErrorCodeRequestTransactionError.invalidAction (error 6) if I try to issue a CXStartCallAction while CallKit is already doing something as part of the Siri flow.
• Siri sometimes replies “There was a problem with the app,” likely because CallKit rejects the transaction or sees duplicate/conflicting actions.
My understanding so far
• The Intents extension should resolve/confirm the intent but not start the call.
• The source of truth for starting a call should be:
Siri → CallKit → app’s CXProviderDelegate.provider(_:perform: CXStartCallAction)
• The app then starts the VoIP leg, reports started/connected, and fulfills.
Where I am stuck
What is not clear is how Siri is supposed to route an INStartCallIntent into CallKit for a third-party VoIP app on a locked device without using .continueInApp.
If my extension simply:
• resolves the contact,
• confirm → .ready,
• handle → .ready (no NSUserActivity, no CallKit),
I do not see a documented mechanism that causes:
“Hey Siri, call <contact> using Heyno”
on the lock screen to:
• Present a CallKit audio call bound to Heyno, and
• Deliver CXStartCallAction to my CXProviderDelegate while the app stays in the background.
Questions
For third-party VoIP apps today, is it recommended to implement INStartCallIntentHandling at all, or should we rely only on CallKit registration and Siri's built-in support for "Call <contact> with Heyno" (no SiriKit extension)?
If an INStartCallIntentHandling extension is still the intended pattern:
• Should confirm/handle simply return .ready and never start CallKit or set NSUserActivity?
• In that case, is Siri expected to invoke CallKit on our behalf and create a CXStartCallAction targeting our provider, even when the device is locked and the app is not foreground?
Is there any supported way for a Siri-triggered third-party VoIP call to start from the lock screen via CallKit without:
• using .continueInApp (unlock + foreground), and
• starting CallKit directly from the Intents extension (unentitled)?
Is there any additional configuration, entitlement, provisioning profile flag, or Info.plist key required so that Siri can map “Call using Heyno” directly to our CallKit provider and background VoIP implementation?
Current options:
• .continueInApp + NSUserActivity → works, but always requires unlock + app UI.
• Start CallKit from the extension → fails with “unentitled” and appears unsupported.
• Extension → app-group + notification → app → CallKit → VoIP → fragile, with intermittent CXErrorCodeRequestTransactionError.invalidAction.
• Remove the extension and hope Siri/CallKit auto-routes to our provider → unclear if this is supported for third-party VoIP apps or reserved for privileged apps.
I would appreciate guidance on the intended architecture for this scenario, and whether the “Siri from lock screen → CallKit UI → background VoIP call” flow is achievable for an App Store VoIP app like Heyno using public APIs only.
Hello.
We're developing an app with Flutter that receives VoIP calls. However, when the app is in the background or closed, the push notification arrives, but the call doesn't. It works perfectly when the app is open. We use Firebase, Flutter, and JANUS WebRTC. We need to know what type of permissions or actions we should consider so that the app opens when it receives a call. How can we resolve this issue? Thank you very much.
Good day,
I've uploaded a build to TestFlight, but received an automated response with the following error:
ITMS-90426: Invalid Swift Support - The SwiftSupport folder is missing. Rebuild your app using the current public (GM) version of Xcode and resubmit it.
Our project started in Objective-C and has mixed Swift classes and pods. The last build uploaded without any automated response was on Nov 8, 2023.
I'm using Xcode Version 26.0.1 (17A400). I've tried every approach I found on the internet, and I'm not able to find any solution for this.
ALWAYS_EMBED_SWIFT_STANDARD_LIBRARIES = YES
use_frameworks! :linkage => :dynamic (in pods)
We would appreciate any assistance in clarifying why this issue is occurring and how we should proceed to address it. Your guidance would mean a lot.
Thank you.
Hello Apple Developer Community,
I'm investigating Core ML model loading behavior and noticed that even when the compiled model path remains unchanged after an app update, the first run still triggers an "uncached load". This seems to hurt the user experience with unnecessary delays.
Question: Does Core ML provide any public API to check whether a compiled model (from a specific .mlmodelc path) is already cached in the system?
If such an API exists, we'd like to use it for pre-loading decision logic: only perform a background pre-load when the model isn't cached.
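In pseudocode, the decision logic we have in mind looks like this (a minimal sketch; isModelCached is the hypothetical API we're asking about):

import CoreML

func preloadIfNeeded(at url: URL) async throws {
    // Hypothetical check; this is the API we have not been able to find.
    guard !isModelCached(at: url) else { return }

    // Loading once in the background should warm the cache so the first
    // user-facing load avoids the "uncached load" penalty.
    _ = try await MLModel.load(contentsOf: url, configuration: MLModelConfiguration())
}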
Has anyone encountered similar scenarios or found official solutions? Any insights would be greatly appreciated!
Starting from a clean iOS 18.4 simulator, the application tried to request authorisation from the user for microphone access. Clicking Allow caused the application to crash.
Using Swift 6. Report Identifier: FB17686864.
The official Syrian flag with two stars has been replaced by the current Syrian flag with three stars, even though no country in the world replaces its national flag without enshrining the change in its constitution.
It would therefore be better to keep the two-star Syrian flag in the next iOS 18.4 release, or to include both flags together, because most Syrians do not prefer the three-star flag.
Only people using the iPhone 15 and iPhone 15 Pro (I don't know about the iPhone 15 Plus or iPhone 15 Pro Max) are having problems with my app. All seems fine on the 13, 14, and 16, as well as iPad. The app is in TestFlight now. I cannot replicate the issue on my Mac via a virtual iPhone 15, 15 Plus, 15 Pro, or 15 Pro Max. What I see happening: users are seeing labels disappear, and sometimes buttons disappear on the 15 and 15 Pro.
I have an ingredient selection page where you can select the ingredients that you have. These are outlined and grouped to make choosing ingredients intuitive. I have a profile selector where you can choose by flavor, strength, body, or mood. At the bottom I have three buttons: button one lets you choose whether drinks are sorted or strictly matched. The last button allows the user to see the drinks they can make based on the ingredients they have. This is done by matching the ingredients against a local drinks list which contains a drink ID, drink name, ingredients, and profile information. Clicking the last button opens a FlatList.
Users on the iPhone 15 and iPhone 15 Pro with iOS 18 sometimes see the three buttons at the bottom disappear altogether, then return. After clicking the Drinks Available button, the button label should change to Hide Available Drinks, but sometimes that label disappears. The drinks FlatList has space for many drinks, but the labels for those drinks are not present until halfway down the list, where one drink shows up.
No other device behaves this way. It might be more common when a large number of ingredients is selected, e.g., with about 50% of 211 ingredients selected it seems more likely to happen. This still needs testing to verify.
I have multiple web views of the same domain that share the same local storage, as expected.
One of them, though, is loading a .webarchive file.
The web archive is of the same domain, and is loaded using the same base URL.
For some reason, in most cases, the local storage is not shared with this web view when it loads the web archive, although if I make that same web view load the actual live web page, it does share local storage.
I say in most cases, because for some users it works as expected, but for a significant portion of users it isn't sharing local storage.
I think the main difference between working and not working is the iOS version: iOS 17 seems to share the local storage, but iOS 18 does not. I can't find anything related in the iOS 18 release notes.
There is nothing in the documentation for load(_:mimeType:characterEncodingName:baseURL:), or the header file, that explains anything specific about local storage and webarchive loading.
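For reference, this is roughly how the archive is loaded (a minimal sketch; the file name and base URL are placeholders):

import WebKit

func loadArchive(into webView: WKWebView) {
    guard let archiveURL = Bundle.main.url(forResource: "page", withExtension: "webarchive"),
          let data = try? Data(contentsOf: archiveURL) else { return }

    // Same base URL as the live site, so the storage origin should match.
    webView.load(data,
                 mimeType: "application/x-webarchive",
                 characterEncodingName: "utf-8",
                 baseURL: URL(string: "https://example.com")!)
}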
Does anyone know for sure how local storage is handled when a webarchive is loaded into a web view, and did something change with iOS 18 in regards to this?