Siri and Voice


Help users quickly accomplish tasks related to your app using just their voice.

Posts under Siri and Voice tag

67 Posts

Post · Replies · Boosts · Views · Activity

Privacy - Siri Usage Description being reset to default text "Describe why your app needs Siri access" on generating archive
I have an iOS app with CarPlay enabled. I have added the Siri capability, and the feature has been tested in a car; the voice commands work perfectly. However, I am facing a weird issue, described below. The key NSSiriUsageDescription is set to custom text in Info.plist. After generating an archive, I exported it and checked the package contents, and the NSSiriUsageDescription key had been reset to the default text ("Describe why your app needs Siri access") in the Info.plist. I do not have any dynamic build step that writes to the Info.plist. Only the Siri key is being reset; the rest of the keys, such as the camera and location usage descriptions, are intact. Kindly suggest what needs to be done at my end.
0 replies · 0 boosts · 65 views · May ’25
AppEntity with @Parameter Options Works in Shortcuts App but Not with Siri
I’m working with AppIntents and AppEntity to integrate my app’s data model into Shortcuts and Siri. In the example below, I define a custom FoodEntity and use it as a @Parameter in an AppIntent, providing dynamic options for the parameter via an optionsProvider. In the Shortcuts app, everything works as expected: when the user runs the shortcut, they get a list of food options (from the dynamic provider) to select from. In Siri, however, the experience is different. Instead of showing the list of options, Siri asks the user to say the name of the food and then tries to match it using EntityStringQuery. I originally assumed this might be a design decision to allow hands-free use with voice, but I found that if you use an AppEnum instead, Siri does present a tappable list of options. So now I’m wondering: why the difference? Is there a way to get a @Parameter backed by an AppEntity with an optionsProvider to show a tappable list in Siri, as it does in Shortcuts or with an AppEnum? Any clarification on how EntityQuery.suggestedEntities() and DynamicOptionsProvider interact with Siri would be appreciated!

struct CaloriesShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddCaloriesInteractive(),
            phrases: [
                "Add to \(.applicationName)"
            ],
            shortTitle: "Calories",
            systemImageName: "fork"
        )
    }
}

struct AddCaloriesInteractive: AppIntent {
    static var title: LocalizedStringResource = "Add to calories log"
    static var description = IntentDescription("Add Calories using Shortcuts.")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Calorie Entry SUMMARY")
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(stringLiteral: "Add to calorie log")
    }

    @Dependency
    private var persistenceManager: PersistenceManager

    @Parameter(title: LocalizedStringResource("Food"), optionsProvider: FoodEntityOptions())
    var foodEntity: FoodEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: .init("Added \(foodEntity.name) to calorie log"))
    }
}

struct FoodEntity: AppEntity {
    static var defaultQuery = FoodEntityQuery()

    @Property var name: String
    @Property var calories: Int

    init(name: String, calories: Int) {
        self.name = name
        self.calories = calories
    }

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Calorie Entry")
    }

    static var typeDisplayName: LocalizedStringResource = "Calorie Entry"

    var displayRepresentation: AppIntents.DisplayRepresentation {
        DisplayRepresentation(title: .init(stringLiteral: name), subtitle: "\(calories)")
    }

    var id: String { return name }
}

struct FoodEntityQuery: EntityQuery {
    func entities(for identifiers: [FoodEntity.ID]) async throws -> [FoodEntity] {
        var result = [FoodEntity]()
        for identifier in identifiers {
            if let entity = FoodDatabase.allEntities().first(where: { $0.id == identifier }) {
                result.append(entity)
            }
        }
        return result
    }

    func suggestedEntities() async throws -> [FoodEntity] {
        return FoodDatabase.allEntities()
    }
}

extension FoodEntityQuery: EntityStringQuery {
    func entities(matching string: String) async throws -> [FoodEntity] {
        return FoodDatabase.allEntities().filter {
            $0.name.localizedCaseInsensitiveCompare(string) == .orderedSame
        }
    }
}

struct FoodEntityOptions: DynamicOptionsProvider {
    func results() async throws -> ItemCollection<FoodEntity> {
        ItemCollection {
            ItemSection("Section 1") {
                for entry in FoodDatabase.allEntities() {
                    entry
                }
            }
        }
    }
}

struct FoodDatabase {
    // Fake data
    static func allEntities() -> [FoodEntity] {
        [
            FoodEntity(name: "Orange", calories: 2),
            FoodEntity(name: "Banana", calories: 2)
        ]
    }
}
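For comparison, here is a minimal sketch of the AppEnum variant mentioned above; FoodOption is a hypothetical stand-in for the dynamic FoodEntity values, and this is the shape that does get a tappable list from Siri:

import AppIntents

// Hypothetical fixed set of options standing in for the dynamic FoodEntity values.
enum FoodOption: String, CaseIterable, AppEnum {
    case orange, banana

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Food")
    static var caseDisplayRepresentations: [FoodOption: DisplayRepresentation] = [
        .orange: DisplayRepresentation(title: "Orange"),
        .banana: DisplayRepresentation(title: "Banana")
    ]
}

struct AddCaloriesWithEnum: AppIntent {
    static var title: LocalizedStringResource = "Add to Calories Log (Enum)"

    @Parameter(title: "Food")
    var food: FoodOption

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        .result(dialog: "Added \(food.rawValue) to the calorie log")
    }
}

The obvious trade-off is that AppEnum cases are fixed at compile time, so this does not replace a dynamic options provider; it only illustrates where the tappable behavior currently comes from.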
0 replies · 1 boost · 74 views · May ’25
Intents UI Extension automatically dismisses
I am working on implementing a new Intents UI Extension and have noticed that when it is triggered via the "Hey Siri" voice command, the intent dismisses after a few seconds. However, if it is launched from the Shortcuts app, the intent remains active and does not dismiss automatically. Additionally, I’ve observed that this behavior occurs on specific iOS versions, such as 17.5.1 or 17.7. On other versions, like 17.4.1 or 18.4, the intent persists as expected. Does Siri automatically close the intent based on its own logic? Could the iOS version be influencing this behavior? Given the requirement to make the intent persistent, is there any option or configuration available to achieve this?
0 replies · 0 boosts · 52 views · Apr ’25
Use Siri to control parts of my CarPlay app
I am building a CarPlay navigation app and I would like it to be as hands-free as possible. I need a better understanding of how Siri can work with CarPlay, and whether the direction I need to go is App Intents or App Shortcuts. My goal is for the user to be able to speak to Siri and say things like "open settings" or "zoom in map", and then have a function in my app called to do what the user is asking. Does CarPlay support this? Do I need to use App Intents, App Shortcuts, or something else?
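A minimal sketch of the App Shortcuts route; the intent, phrase, and notification name are hypothetical, and whether Siri will run it while CarPlay is in the foreground is exactly the open question here, so treat it as a starting point rather than a confirmed CarPlay pattern.

import AppIntents
import Foundation

struct ZoomInMapIntent: AppIntent {
    static var title: LocalizedStringResource = "Zoom In Map"
    static var openAppWhenRun: Bool = false

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hand the request to whatever object owns the map scene.
        NotificationCenter.default.post(name: Notification.Name("ZoomInMapRequested"), object: nil)
        return .result()
    }
}

struct NavShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ZoomInMapIntent(),
            phrases: ["Zoom in map in \(.applicationName)"],
            shortTitle: "Zoom In",
            systemImageName: "plus.magnifyingglass"
        )
    }
}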
2 replies · 0 boosts · 88 views · Apr ’25
Disambiguation for .system.search AppIntent
I'd like to display a list of items to disambiguate for a full-text search intent. Using the Apple AppIntentsSampleApp, I added TrailSearch.swift:

import AppIntents

@AssistantIntent(schema: .system.search)
struct TrailSearch: AppIntent {
    static let title: LocalizedStringResource = "Search Trail"
    static let description = IntentDescription("Search trail by name.",
                                               categoryName: "Discover",
                                               resultValueName: "Trail")

    @Parameter(title: "Trail")
    var criteria: StringSearchCriteria

    func perform() async throws -> some IntentResult & ReturnsValue<TrailEntity> {
        if criteria.term.isEmpty {
            throw $criteria.needsValueError(IntentDialog("need value"))
        }

        let trails = TrailDataManager.shared.trails { trail in
            trail.name.contains(criteria.term)
        }

        if trails.count > 1 {
            throw $criteria.needsDisambiguationError(among: trails.map { StringSearchCriteria(term: $0.name) })
        } else if let firstTrail = trails.first {
            return .result(value: TrailEntity(trail: firstTrail))
        }
        throw $criteria.needsValueError(IntentDialog("Nothing found"))
    }
}

Now when I type "trail", which matches several trails and thus takes us into the disambiguation code path, the Shortcuts app just displays the dialog title but no disambiguation items to pick from. Is this by design, or a bug? (Filed as FB17412220.)
0 replies · 0 boosts · 61 views · Apr ’25
Siri Shortcuts or Siri Intent to Voice Control Parts of App
I am new to the idea of Siri Shortcuts and App Intents. What I want to do is use Siri to run a function in my app, such as saying to Siri "Zoom in map", which would then call a function in my app that zooms in the map. Similarly, I could say "Zoom out map" and it would call a function to zoom out the map. I do not need to share any sort of shortcut with the Shortcuts app. Can someone please point me in the right direction for what type of intents I need to use for this?
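One sketch of how an App Intent can reach a function in the app, using AppIntents' dependency manager; MapController, its registration, and the intent name are assumptions for illustration, not a prescribed pattern. To be invocable by voice, the intent would still be listed in an AppShortcutsProvider with a phrase, as in the CarPlay sketch above.

import AppIntents

// Hypothetical object that owns the map.
final class MapController: @unchecked Sendable {
    static let shared = MapController()
    func zoomIn() {
        // Adjust the map camera here.
    }
}

struct ZoomMapIntent: AppIntent {
    static var title: LocalizedStringResource = "Zoom In Map"

    @Dependency
    private var mapController: MapController

    @MainActor
    func perform() async throws -> some IntentResult {
        mapController.zoomIn()
        return .result()
    }
}

// Register the dependency once, early in the app's life cycle, for example in
// the App initializer:
// AppDependencyManager.shared.add(dependency: MapController.shared)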
0 replies · 0 boosts · 60 views · Apr ’25
Parameter recognition on AppShortcuts invocation not consistent
While playing around with AppShortcuts I've been encountering some problems getting the invocation phrase detected and/or the parameter recognized after the invocation phrase via Siri. I've found some solutions or explanations in other posts here (Siri not recognizing the parameter in the phrase & Inform iOS about AppShortcutsProvider), but I still have one issue, and it's about consistency. For context, I've defined the parameter to be an AppEntity whose query conforms to the EntityStringQuery protocol, in order to be able to fetch entities with the string given by Siri:

struct AnIntent: AppIntent {
    // other parts hidden for clarity
    @Parameter var entity: ModelEntity
}

For an invocation phrase akin to "Do something with [entity] in [app name]": if the user speaks the phrase with an entity previously donated via suggestedEntities(), the AppShortcut gets executed without problems. If the user speaks a phrase with no parameter, like "do something with [app name]", they get asked to input the missing parameter; after they input one, it may or may not get recognized, and they may be asked to input a parameter again, as if in a loop. This happens even if the parameter given is one that was donated. I've found that when this happens, the entities(matching string: String) function in the EntityQuery doesn't get called. The input can be one word, or sometimes two, and it still will not be called. In other words, entities(matching string: String) does not get called on every user parameter input.

Is this behavior correct? Do parameters have some restrictions on length or anything else? Does Siri show the user suggested entities when asked for entity input? It doesn't on my end.

An additional question related to AppShortcuts: in the AppShortcut definition, where is the summary inside the parameter presentation used? I see that it was defined in the AppIntentsSampleApp for the GetTrailInfo intent, but I didn't find where it is used.
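A small self-contained sketch (ModelEntity, ModelStore, and the names are hypothetical) of a string query that matches on substrings instead of exact equality, which may be more forgiving of Siri's transcription; it does not change when entities(matching:) gets called, only what it matches once it is called.

import AppIntents

// Hypothetical entity and in-memory store, just to make the query concrete.
struct ModelEntity: AppEntity {
    static var defaultQuery = ModelEntityQuery()
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Model")

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

enum ModelStore {
    static func all() -> [ModelEntity] {
        [ModelEntity(id: "1", name: "First model"),
         ModelEntity(id: "2", name: "Second model")]
    }
}

struct ModelEntityQuery: EntityQuery, EntityStringQuery {
    func entities(for identifiers: [ModelEntity.ID]) async throws -> [ModelEntity] {
        ModelStore.all().filter { identifiers.contains($0.id) }
    }

    // Entities offered to the system up front; these are the values Siri can
    // match against the spoken phrase without a round trip to the string query.
    func suggestedEntities() async throws -> [ModelEntity] {
        ModelStore.all()
    }

    // Matching on "contains" instead of exact equality leaves more room for
    // variation in the transcribed parameter.
    func entities(matching string: String) async throws -> [ModelEntity] {
        ModelStore.all().filter { $0.name.localizedCaseInsensitiveContains(string) }
    }
}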
0 replies · 0 boosts · 68 views · Apr ’25
Issues with Siri Shortcuts: Confirmation Prompt, Inconsistent Behavior
Hello Apple Developer Community, I’m working on integrating Siri into my React Native app using native iOS code bridged to React Native. I’ve followed the necessary steps to set up Siri support, including:
- Adding the Siri capability.
- Adding Siri usage descriptions in Info.plist.
- Using AppIntent and AppShortcutsProvider to define shortcuts.
However, I’m facing the following issues:
1. Siri prompts for confirmation. When a user says a phrase, Siri asks, "Turn on 'MyApp' shortcuts with Siri?" instead of directly recognizing the phrase. Is this expected behavior? If so, how can I reduce friction for users and make the experience more seamless?
2. Inconsistent behavior for existing users. For users updating to a version with Siri support: when the app is closed, Siri says, "MyApp hasn't added support for that with Siri." When the app is open, Siri prompts, "Turn on shortcut for MyApp?" and the rest works fine. Why does Siri not recognize the shortcut when the app is closed, even though the shortcut is defined in AppShortcutsProvider? How can I ensure that Siri recognizes the shortcut regardless of whether the app is open or closed?
Other than using AppIntent and AppShortcutsProvider, should I try donating shortcuts (would that help the existing-user case)? Please help me with this.
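One commonly suggested mitigation, sketched below with a hypothetical intent, provider, and app delegate: calling updateAppShortcutParameters() at launch asks the system to refresh the registered phrases. It is a workaround worth trying when an updated install is not recognized by Siri until the app has been opened once, not a guaranteed fix, and the confirmation prompt on first use is expected system behavior.

import AppIntents
import UIKit

// Minimal stand-ins for the app's real intent and provider.
struct OpenFeedIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Feed"
    func perform() async throws -> some IntentResult { .result() }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenFeedIntent(),
            phrases: ["Open the feed in \(.applicationName)"],
            shortTitle: "Open Feed",
            systemImageName: "newspaper"
        )
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        // Ask the system to re-read the App Shortcut phrases at launch.
        MyAppShortcuts.updateAppShortcutParameters()
        return true
    }
}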
8 replies · 1 boost · 500 views · Apr ’25
Unable to pass parameters from Siri phrase to AppIntent
I have chat/search functionality in my app, and I want to integrate Siri so that a user can say something like "Hey Siri! Ask myApp to get latest news" and have my search functionality invoked with "get latest news". I see iOS apps like ChatGPT and YouTube have already achieved this. I am able to invoke the intent with a static phrase that expects the parameter; the user can provide the value when prompted after requestValueDialog. But that is a two-step process for the end user, and I want to achieve it in a single step.

struct CombinedSiriShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        return [
            AppShortcut(
                intent: ShowSpecificNewsArticleIntent(),
                phrases: [
                    "Ask \(.applicationName) to run a query:",
                ],
                shortTitle: "Specific News Article",
                systemImageName: "doc.text.fill"
            ),
            AppShortcut(
                intent: TestQuery(),
                phrases: [
                    "Ask \(.applicationName) to \(\.$query)",
                ],
                shortTitle: "Test intent",
                systemImageName: "doc.text.fill"
            ),
        ]
    }
}

struct ShowSpecificNewsArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Specific News Article"
    static var description = IntentDescription(
        "Provides details about a specific news article based on its title."
    )

    @Parameter(title: "Query")
    var query: String

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("in show specific intent")
        print(query)
        return .result(dialog: "view more about: \(query)")
    }
}
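As far as I can tell, only parameters backed by an AppEnum or an AppEntity can be embedded in an App Shortcut phrase; a free-form String cannot be captured from the spoken invocation in one step. A minimal sketch with a hypothetical fixed set of topics (names and phrases are assumptions, not the poster's code):

import AppIntents

// Hypothetical fixed set of topics; enum cases can appear inside a phrase,
// arbitrary strings cannot.
enum NewsTopic: String, CaseIterable, AppEnum {
    case latest, sports, technology

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Topic")
    static var caseDisplayRepresentations: [NewsTopic: DisplayRepresentation] = [
        .latest: DisplayRepresentation(title: "latest news"),
        .sports: DisplayRepresentation(title: "sports news"),
        .technology: DisplayRepresentation(title: "technology news")
    ]
}

struct ShowNewsIntent: AppIntent {
    static var title: LocalizedStringResource = "Show News"

    @Parameter(title: "Topic")
    var topic: NewsTopic

    func perform() async throws -> some IntentResult & ProvidesDialog {
        .result(dialog: "Showing \(topic.rawValue) news")
    }
}

struct NewsShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ShowNewsIntent(),
            phrases: ["Ask \(.applicationName) to get \(\.$topic)"],
            shortTitle: "Get News",
            systemImageName: "newspaper"
        )
    }
}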
2 replies · 0 boosts · 139 views · Mar ’25
Siri Shortcut Phrases on iOS 17 and Earlier
I made a set of Siri Shortcuts in my app with the AppShortcutsProvider, and they each have a set of phrases. I can activate the shortcuts via Siri phrases or Spotlight search on iOS 18+, but not on iOS 17 or earlier. I've checked the documentation and see that AppShortcutsProvider is supported from iOS 16+, so I don't understand why I can't view the shortcuts in Spotlight or activate them with Siri unless the device is on at least iOS 18. Any thoughts?
4 replies · 0 boosts · 313 views · Mar ’25
App Clips Advanced Experiences not showing up in Apple Maps and Siri Suggestions
Hello everyone, I’m experiencing an issue with App Clips Advanced Experiences and Apple Maps/Siri Suggestions. We have already contacted Apple Support, but they are still investigating the cause and the issue has not been resolved to date. The App Clip is bundled with the main app and has been available on the App Store for several months. The business running the app has several physical shops and wants the App Clip to show up in Apple Maps and Siri Suggestions at each location. The App Clip is correctly exposed in the AASA file, and it is also validated correctly by the AASA API available at https://app-site-association.cdn-apple.com/a/v1:

{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "TEAMID.bundleid",
        "paths": []
      }
    ]
  },
  "appclips": {
    "apps": [
      "TEAMID.bundleid.Clip"
    ]
  }
}

(with TEAMID and bundleid being the team and bundle identifiers of the app)

The App Clip is displayed correctly when loading the website and when scanning a QR code or App Clip Code, but it doesn't appear in the Maps app or in Siri Suggestions. We have set up the App Clip Advanced Experiences on the App Store Connect page of the app, and each URL has been linked to a physical shop. All URLs are in the "Received" state, so they should appear correctly on Maps. Unfortunately, I don't see any "Order" button on any location card in Apple Maps. We tried with both iOS 17 and 16. According to feedback from people in the shops, they don't see the app in Siri Suggestions. I have just submitted a Custom Action Link in Apple Business Connect for one of the shops, but without success: the App Clip doesn't appear. Any idea why this is happening?
8 replies · 1 boost · 924 views · Mar ’25
CarPlay music sound output from iPhone after enabling Enhanced Siri
With CarPlay communication plugin R18.1, I followed the steps below to integrate Enhanced Siri. After doing so, the music sound is output from the iPhone and there is no option to route the output to the car.

Enhanced Siri integration steps:

1. Declare supported audio formats for the AuxIn and AuxOut streams. Since the AuxIn and AuxOut streams for Siri do not have to be active at the same time, the accessory must claim audio format support for AuxIn audio and AuxOut audio independently. The audio formats for each stream can differ from each other (48 kHz for AuxOut and 16 kHz for AuxIn). The new audio types represent these new streams: AuxIn/speechRecognition and AuxOut/speechRecognition.

2. Check whether the connected device supports the feature: AirPlayReceiverSessionHasFeatureEnhancedSiri().

3. Claim support in the Setup Response message if the device supports the feature. Add the kAirPlayKeyAccessoryEnabledFeature_EnhancedSiri key through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f function for the kAirPlayKey_AccessoryEnabledFeatures key. Helper function: CFArrayAppendValue().

4. Add the Enhanced Siri parameters dictionary in the INFO message. Add the dictionary through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f function for the kAirPlayKey_EnhancedSiriInfo key. The kAirPlayKey_EnhancedSiriInfo dictionary parameters are:
   - Voice activation of Siri: kAirPlayKey_EnhancedSiriVoice
   - Current language of the voice model: kAirPlayKey_VoiceModelCurrentLanguage
   - Supported languages of the voice model: kAirPlayKey_VoiceModelSupportedLanguages
   - Enhanced button activation of Siri: kAirPlayKey_EnhancedSiriButton
   - Supported zone(s), if any: kAirPlayKey_SupportedSiriTriggerZones

5. Update AudioStream. Report the state of the AuxIn stream by providing an implementation of AudioStreamUpdateState(): off, local buffering, or streaming to device. Decouple input streams from output streams; AuxIn is an independent, input-only stream. The kAudioStreamProperty_Direction property provides the necessary information on whether a stream is input, output, or input & output.

6. Provide a handler for the AirPlayReceiverSessionDelegate setEnhancedSiriParams_f. This provides additional information: the activation type, and setting the language of the voice model.

7. Invoke the communication plugin to start buffering. Once the activation type has been specified, the accessory can request the plugin to start buffering using AirPlayReceiverSessionAuxInStart().

8. Use the new APIs to trigger Siri: AirPlayReceiverSessionRequestSiriActionWithLatency(), AirPlayReceiverSessionRequestSiriVoiceActivationWithLatency(), and AirPlayReceiverSessionRequestSiriVoiceActivationWithSample(). Button presses and voice activations should use these new APIs, which add a timestamp to the activation. They allow a choice of a latency or a sample for button and voice activations. If there is a delay between the user pressing the button and the device being notified of the press, the latency value should represent that time. If the accessory can determine which zone activated, it can provide the zone with the request.

9. Invoke the communication plugin to stop buffering. You may need to invoke the plugin to stop buffering (AirPlayReceiverSessionAuxInStop()) if exclusive access to the microphone is necessary. Such instances may include, but are not limited to: a native voice recognition session, telephony, or another stream function that uses the microphone starting. The modesChanged notification can be used to determine whether a resource is being used. Note that if the session ends, the plugin automatically stops buffering.
0 replies · 0 boosts · 234 views · Mar ’25
Siri Intent Dialog with custom SwiftUIView not responding to buttons with intent
I have created an AppIntent and added it to shortcuts so it can be invoked by Siri. When I say the phrase, the Siri intent dialog appears just fine. I have added a custom SwiftUI view inside the Siri dialog box with two buttons that carry intents. The callback or handling of those buttons does not work when initiated via Siri; it works fine when I initiate it from Shortcuts. I tried using the button without the intent action as well, but it did not work. Here is the code.

import AppIntents
import SwiftUI
import UIKit

// The enclosing intent declaration was cut off in the original post; the name here is assumed.
struct OrderDetailsIntent: AppIntent {
    static let title: LocalizedStringResource = "My Custom Intent"
    static var openAppWhenRun: Bool = false

    @MainActor
    func perform() async throws -> some ShowsSnippetView & ProvidesDialog {
        return .result(dialog: "Here are the details of your order", content: {
            OrderDetailsView()
        })
    }
}

struct OrderDetailsView: View {
    var body: some View {
        HStack {
            if #available(iOS 17.0, *) {
                Button(intent: ModifyOrderIntent(), label: { Text("Modify Order") })
                Button(intent: CancelOrderIntent(), label: { Text("Cancel Order") })
            }
        }
    }
}

struct ModifyOrderIntent: AppIntent {
    static let title: LocalizedStringResource = "Modify Order"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some OpensIntent {
        // performs the deeplinking to app to a certain page to modify the order
    }
}

struct CancelOrderIntent: AppIntent {
    static let title: LocalizedStringResource = "Cancel Order"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some OpensIntent {
        // performs the deeplinking to app to a certain page to cancel the order
    }
}

// The alternative attempt without the intent action (label completed so the fragment parses):
Button(action: {
    if let url = URL(string: "myap://open-order") {
        UIApplication.shared.open(url)
    }
}, label: { Text("Modify Order") })
0 replies · 0 boosts · 309 views · Mar ’25
Siri trigger command conflicts
We're having trouble getting Siri to hand off specific trigger words to our app via shortcuts. I want to be able to say "Hey Siri Myappname Foobar", but in some cases, if Foobar is the name of a specific business, Siri may launch Maps instead, showing locations of those businesses. Is there any way to inform Siri, "no, *****, launch our app as the shortcut specifies!"?
0 replies · 1 boost · 238 views · Mar ’25
Siri misreads local currency in notifications (Bug reported, still unresolved)
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.

Issue details: My iPhone is set to Region: Chile and Language: Spanish (Chile). In Chile, the currency symbol $ represents Chilean pesos (CLP), not US dollars. A notification with the text:

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"

is read aloud by Siri as "¡Has recibido un pago por 5.000 dólares!" (English: "You have received a payment of five thousand dollars!") instead of the correct "¡Has recibido un pago por 5.000 pesos!" (English: "You have received a payment of five thousand pesos!").

Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177

This incorrect behavior is not limited to iOS notifications; it also occurs on watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions). Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency. Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly. Apple Intelligence interactions are also affected; for example, asking Siri to "read my latest emails" when one of them contains a monetary value results in Siri misreading the currency.

I have submitted a bug report via Feedback Assistant; the Feedback ID is FB16561348. This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars. Has anyone found a workaround, or is there any update from Apple on this?
1 reply · 1 boost · 644 views · Feb ’25
How do I make Siri announce the local currency in notifications?
I'm currently testing the Announce Notifications feature and I can't seem to find out how to make Siri read aloud the local currency instead of dollars. My locale is es-CL (Chile). It uses the currency symbol $, read locally as "pesos" or "Chilean pesos", and the number 5000.1 is represented as 5.000,1. This is the notification content:

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"

Siri reads it aloud as "¡Has recibido un pago por 5.000 dólares!", which translates to "You have received a payment for 5,000 dollars", instead of the expected "¡Has recibido un pago por 5.000 pesos!" ("You have received a payment for 5,000 pesos").

I've tried changing the development region of the app, and interpolating the string with NumberFormatter.localizedString(from: 5000, number: .currency) as well as other styles (.currencyAccounting, .currencyISOCode, and .currencyPlural), without good results. The last one almost works, but it's not ideal: it outputs "5.000 pesos chilenos", which gets read as "5 pesos chilenos", which is not the correct amount (a bug); it's as if you're not in Chile, and I personally prefer a symbol instead of words. I'm testing with my device, which is set up with the region Chile. Could someone help me find a solution?
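To make the formatting attempts described above concrete, here is a small sketch with the locale pinned to es_CL; the output comments are approximate, and the point is only to show the two styles side by side.

import Foundation

let formatter = NumberFormatter()
formatter.locale = Locale(identifier: "es_CL")

formatter.numberStyle = .currency
let symbolVersion = formatter.string(from: NSNumber(value: 5000))   // "$5.000"

formatter.numberStyle = .currencyPlural
let spelledVersion = formatter.string(from: NSNumber(value: 5000))  // "5.000 pesos chilenos"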
5 replies · 1 boost · 1.3k views · Feb ’25
On iOS 18, Mandarin is read aloud as Cantonese
Please include the line below in follow-up emails for this request. Case-ID: 11089799

When using AVSpeechUtterance and setting it to speak Mandarin, if Siri is set to Cantonese on iOS 18, the utterance is played in Cantonese. There is no such issue on iOS 17 or 16.

1. Set the voice:

let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice

2. In the phone settings, Siri is set to Cantonese.
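A possible workaround sketch, assuming the goal is simply to force a Mandarin voice: enumerate the installed voices and pick a zh-CN voice explicitly instead of constructing the voice from the language code alone (which, per this report, iOS 18 appears to resolve through the Siri language).

import AVFoundation

// Pick an explicit Mandarin voice, preferring an enhanced-quality one if installed.
func mandarinVoice() -> AVSpeechSynthesisVoice? {
    let mandarin = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "zh-CN" }
    return mandarin.first { $0.quality == .enhanced } ?? mandarin.first
}

let synthesizer = AVSpeechSynthesizer()   // keep a strong reference in real code
let utterance = AVSpeechUtterance(string: "你好，世界")
utterance.voice = mandarinVoice()
synthesizer.speak(utterance)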
3 replies · 1 boost · 507 views · Feb ’25