I don't know if this is the appropriate forum for this.
Answers I've found on the web point me towards intents, but somehow I couldn't make it work.
I'm trying to activate Siri on CarPlay to ask the user for voice input and then run a search.
Is this a custom intent capability, or is there another way?
Siri and Voice
RSS for tag
Help users quickly accomplish tasks related to your app using just their voice.
Posts under Siri and Voice tag
72 Posts
I have chat/search functionality in my app, and I want to integrate Siri so a user can say something like "Hey Siri, ask MyApp to get the latest news" and have my search functionality invoked with "get latest news". I see iOS apps like ChatGPT and YouTube have already achieved this.
I am able to invoke the intent with a static phrase that expects the parameter, and the user can provide the value when prompted after requestValueDialog. But that is a two-step process for the end user; I want to achieve it in a single step.
struct CombinedSiriShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        return [
            AppShortcut(
                intent: ShowSpecificNewsArticleIntent(),
                phrases: [
                    "Ask \(.applicationName) to run a query:",
                ],
                shortTitle: "Specific News Article",
                systemImageName: "doc.text.fill"
            ),
            AppShortcut(
                intent: TestQuery(),
                phrases: [
                    "Ask \(.applicationName) to \(\.$query)",
                ],
                shortTitle: "Test intent",
                systemImageName: "doc.text.fill"
            ),
        ]
    }
}

struct ShowSpecificNewsArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Specific News Article"
    static var description = IntentDescription(
        "Provides details about a specific news article based on its title."
    )

    @Parameter(title: "Query")
    var query: String

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("in show specific intent")
        print(query)
        return .result(dialog: "view more about: \(query)")
    }
}
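For what it's worth, my understanding is that a parameter can only appear inside an App Shortcut phrase when it is an AppEnum or AppEntity; a plain String parameter such as query cannot be spoken as part of the phrase itself. Below is a minimal sketch of the single-step pattern under that assumption. NewsTopic, ShowNewsIntent, and NewsShortcuts are hypothetical names, not part of the code above.

import AppIntents

// Hypothetical enum: App Shortcut phrases can embed AppEnum/AppEntity
// parameters, but not plain String values.
enum NewsTopic: String, AppEnum {
    case latest, sports, technology

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "News Topic")
    static var caseDisplayRepresentations: [NewsTopic: DisplayRepresentation] = [
        .latest: DisplayRepresentation(title: "latest news"),
        .sports: DisplayRepresentation(title: "sports news"),
        .technology: DisplayRepresentation(title: "technology news")
    ]
}

struct ShowNewsIntent: AppIntent {
    static var title: LocalizedStringResource = "Show News"

    @Parameter(title: "Topic")
    var topic: NewsTopic

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: "Getting \(topic.rawValue) news")
    }
}

struct NewsShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        return [
            AppShortcut(
                intent: ShowNewsIntent(),
                // "Ask MyApp to get latest news" resolves in one step,
                // because the spoken topic maps onto an enum case.
                phrases: [
                    "Ask \(.applicationName) to get \(\.$topic)"
                ],
                shortTitle: "Show News",
                systemImageName: "newspaper"
            )
        ]
    }
}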
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, SiriKit, App Intents
I made a set of Siri Shortcuts in my app with the AppShortcutsProvider, and they each have a set of phrases.
I can activate the shortcuts via Siri phrases or Spotlight search on iOS 18+, but not on iOS 17 or earlier.
I've checked the documentation and see that AppShortcutsProvider is supported from iOS 16+, so I don't understand why I can't view the shortcuts in Spotlight or activate them with Siri unless the device is on at least iOS 18.
Any thoughts?
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Spotlight, Siri and Voice, Shortcuts, App Intents
I'll ask Siri: "What is the weather?"
and will get a valid response
I'll ask Siri to execute a shortcut my app has created
I get "what is the order?" (a phrase nowhere in my app)
I'll repeat the question about the weather
I now get "what is the order?"
***?
With the CarPlay communication plugin R18.1, I followed these steps to integrate Enhanced Siri; the music audio was output from CarPlay and there was no option for output to the car.
============================================
Enhanced Siri
Declare supported audioFormats for the AuxIn and AuxOut streams
Since the AuxIn and AuxOut streams for Siri do not both have to be active at the same time, the accessory must claim audio format support
for AuxIn audio and AuxOut audio independently. The audio formats for each stream can differ (48 kHz for AuxOut and 16 kHz
for AuxIn). The new audio types represent these new streams: AuxIn/speechRecognition and AuxOut/speechRecognition.
Check if connected device supports the feature
AirPlayReceiverSessionHasFeatureEnhancedSiri()
Claim support in the Setup Response message if device supports the feature
Add kAirPlayKeyAccessoryEnabledFeature_EnhancedSiri key through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f
function for the kAirPlayKey_AccessoryEnabledFeatures key.
Helper function: CFArrayAppendValue()
Add Enhanced Siri parameters dictionary in the INFO message
Add dictionary through the AirPlayReceiverServer delegate AirPlayReceiverServerCopyProperty_f
function for the kAirPlayKey_EnhancedSiriInfo key.
kAirPlayKey_EnhancedSiriInfo dictionary parameters:
Voice activation of Siri - kAirPlayKey_EnhancedSiriVoice
Current language of voice model - kAirPlayKey_VoiceModelCurrentLanguage
Supported languages of voice model - kAirPlayKey_VoiceModelSupportedLanguages
Enhanced Button activation of Siri - kAirPlayKey_EnhancedSiriButton
Supported zone(s) if any - kAirPlayKey_SupportedSiriTriggerZones
Update AudioStream
Get the state of the AuxIn stream by providing an implementation of AudioStreamUpdateState(): off, local buffering, or streaming to device.
Decouple input streams from output streams. AuxIn is an independent, input-only stream. The kAudioStreamProperty_Direction property will
provide the necessary information on whether the stream is input, output, or input & output.
Provide a handler for the AirPlayReceiverSessionDelegate setEnhancedSiriParams_f
This will provide additional information:
Activation type
Setting the language of the voice model
Invoke the Communication Plugin to start buffering
Once the activation type has been specified, the accessory can request the plugin to start buffering using
AirPlayReceiverSessionAuxInStart().
Use the new APIs to trigger Siri:
AirPlayReceiverSessionRequestSiriActionWithLatency()
AirPlayReceiverSessionRequestSiriVoiceActivationWithLatency()
AirPlayReceiverSessionRequestSiriVoiceActivationWithSample()
Button presses and voice activations should use these new APIs, which add a timestamp of the activation. These APIs allow
a choice of a latency or a sample for button and voice activations.
If there is a delay between the user pressing the button and the device being notified of the press, the latency value
should represent this time.
If the accessory can determine which zone activated, it can provide the zone with the request.
Invoke the Communication Plugin to stop buffering
You may need to invoke the plugin to stop buffering (AirPlayReceiverSessionAuxInStop()) if exclusive access to the microphone is necessary.
Such instances may include, but are not limited to:
A native voice recognition session
Telephony
Another stream function that uses the microphone starting
The modesChanged notification can be used to determine whether a resource is in use.
Note: if the session ends, the plugin will automatically stop buffering.
I have created an AppIntent and added it to Shortcuts so it can be invoked by Siri. When I say the phrase, the Siri intent dialog appears just fine. I have added a custom SwiftUI view inside the Siri dialog box with two buttons that trigger intents. The callback/handling of those buttons does not work when initiated via Siri; it works fine when I initiate it from Shortcuts. I also tried a plain button without the intent action, but that did not work either. Here is the code.
import AppIntents
import SwiftUI

// Struct declaration restored for context; the name is assumed.
struct MyCustomIntent: AppIntent {
    static let title: LocalizedStringResource = "My Custom Intent"
    static var openAppWhenRun: Bool = false

    @MainActor
    func perform() async throws -> some ShowsSnippetView & ProvidesDialog {
        // The original ".result(dialog: ...), content: { ... }" does not compile;
        // the .result(dialog:view:) overload is used here instead.
        return .result(dialog: "Here are the details of your order",
                       view: OrderDetailsView())
    }
}
struct OrderDetailsView: View {
    var body: some View {
        HStack {
            if #available(iOS 17.0, *) {
                Button(intent: ModifyOrderIntent(), label: {
                    Text("Modify Order")
                })
                Button(intent: CancelOrderIntent(), label: {
                    Text("Cancel Order")
                })
            }
        }
    }
}
struct ModifyOrderIntent: AppIntent {
    static let title: LocalizedStringResource = "Modify Order"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some OpensIntent {
        // performs the deeplinking to app to a certain page to modify the order
    }
}

struct CancelOrderIntent: AppIntent {
    static let title: LocalizedStringResource = "Cancel Order"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some OpensIntent {
        // performs the deeplinking to app to a certain page to cancel the order
    }
}
Button(action: {
    if let url = URL(string: "myap://open-order") {
        UIApplication.shared.open(url)
    }
}, label: {
    Text("Modify Order") // label assumed; omitted in the original snippet
})
We're having trouble getting Siri to hand off specific trigger words to our app via Shortcuts. I want to be able to say "Hey Siri, MyAppName Foobar", but in some cases, if Foobar is the name of a specific business, it may launch Maps instead, showing locations of those businesses. Is there any way to inform Siri: "no, *****, launch our app as the shortcut specifies!"?
Hello Apple Developer Community,
I’m working on integrating Siri into my React Native app using native iOS code and bridging to React Native. I’ve followed the necessary steps to set up Siri support, including:
Adding the Siri capability.
Adding Siri usage descriptions in Info.plist.
Using AppIntent and AppShortcutsProvider to define shortcuts.
However, I’m facing the following issues:
Siri Prompts for Confirmation
When a user says a phrase, Siri asks, "Turn on 'MyApp' shortcuts with Siri?" instead of directly recognizing the phrase. Is this expected behavior? If so, how can I reduce friction for users and make the experience more seamless?
Inconsistent Behavior for Existing Users
For users updating to a version with Siri support:
When the app is closed, Siri says, "MyApp hasn't added support for that with Siri."
When the app is open, Siri prompts, "Turn on shortcut for MyApp?" and the rest works fine.
Why does Siri not recognize the shortcut when the app is closed, even though the shortcut is defined in AppShortcutsProvider? How can I ensure that Siri recognizes the shortcut regardless of whether the app is open or closed? Other than using AppIntent and AppShortcutsProvider, should I try donating shortcuts (would that help for the existing-user update case)? Please help me with this.
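One thing that may be worth checking (offered as a guess, not a confirmed fix): the App Shortcut phrases are registered with the system by the AppShortcutsProvider, and calling updateAppShortcutParameters() at launch re-registers them. A minimal sketch, assuming a Swift AppDelegate and a provider type named MyAppShortcuts (both hypothetical names here):

import AppIntents
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Re-registers the App Shortcut phrases with the system so Siri can
        // match them even when the app is not running.
        MyAppShortcuts.updateAppShortcutParameters()
        return true
    }
}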
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.
Issue details:
My iPhone is set to Region: Chile and Language: Spanish (Chile).
In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
A notification with the text:
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
is read aloud by Siri as:
“¡Has recibido un pago por 5.000 dólares!”
(English: “You have received a payment of five thousand dollars!”)
instead of the correct:
“¡Has recibido un pago por 5.000 pesos!”
(English: “You have received a payment of five thousand pesos!”)
Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177
This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
Siri’s currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
Apple Intelligence interactions are also affected—for example, asking Siri to “read my latest emails” when one of them contains a monetary value results in Siri misreading the currency.
I have submitted a bug report via Feedback Assistant, and the Feedback ID is FB16561348.
This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars.
Has anyone found a workaround, or is there any update from Apple on this?
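Not a fix, but a possible mitigation on the app side while this stands (a sketch, under the assumption that you control the notification text): avoid the bare "$" symbol and spell the currency out, or use the ISO code, so there is nothing ambiguous for Siri to expand.

import Foundation
import UserNotifications

// Hedged workaround sketch: format the amount for es_CL and append the unit
// explicitly instead of relying on the "$" symbol.
let formatter = NumberFormatter()
formatter.numberStyle = .decimal
formatter.locale = Locale(identifier: "es_CL") // groups thousands as "5.000"
let amount = formatter.string(from: 5000) ?? "5.000"

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por \(amount) pesos!" // or "\(amount) CLP"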
Topic: Accessibility & Inclusion · SubTopic: General · Tags: Siri and Voice, User Notifications, Localization, Apple Intelligence
Hi,
I’m trying to get an array of strings from the user using AppIntents, but I’m encountering an issue. The shortcut ends without prompting the user for input or saving the value, though it doesn’t crash. I need to get the user to input multiple tasks in an array, but the current approach isn’t working as expected.
Here’s the current method I’m using:
// Short code snippet showing the current method
private func collectTasks() async throws -> [String] {
    var collectedTasks: [String] = tasks ?? []
    while true {
        if !collectedTasks.isEmpty {
            let addMore = try await $input.requestConfirmation("Would you like to add another task?")
            if !addMore {
                break
            }
        }
        let newTask = try await $input.requestValue("Please enter a task:")
        collectedTasks.append(newTask)
    }
    return collectedTasks
}
The call:
func perform() async throws -> some IntentResult {
    let finalTasks = try await collectTasks()
    // Some more code
}
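For reference, the snippet references $input and tasks without showing their declarations; the surrounding intent is presumably shaped roughly like this (names and types are assumptions):

import AppIntents

// Assumed shape of the surrounding intent; names and types are guesses
// based on the $input / tasks references in the snippet above.
struct CollectTasksIntent: AppIntent {
    static var title: LocalizedStringResource = "Collect Tasks"

    @Parameter(title: "Task")
    var input: String

    @Parameter(title: "Tasks")
    var tasks: [String]?

    // ... plus the collectTasks() helper and perform() shown above.
}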
Any advice or suggestions would be appreciated. Thanks in advance!
(Public dupe of FB16477656)
The Shortcuts app allows you to parameterise the input for an action using variables or by allowing "Ask every time". This option DOES NOT show when conforming my AppEntity.defaultQuery struct to EntityStringQuery:
But it DOES show when conforming to EntityQuery:
As discussed in this forum post (or FB13253161), my AppEntity.defaultQuery HAS TO conform to EntityStringQuery to allow searching by String from Siri voice input.
To summarise:
With EntityQuery:
My Intent looks like it supports variables via the Shortcuts app, but it ends up in an endless loop because there is no entities(matching string: String) function.
It does allow me to choose an item via the Shortcuts.app UI.
With EntityStringQuery:
My Intent does not support variables via the Shortcuts app.
I am not allowed to choose an item via the Shortcuts.app UI.
Even weirder: if I set up the shortcut using a build with EntityQuery and then do another build with EntityStringQuery, it works as expected.
Code:
/*
 Works with Siri to find a match, doesn't show "Ask every time"
 */
public struct WidgetStationQuery: EntityStringQuery {
    public init() { }

    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}

/*
 DOES NOT work with Siri to find a match, but Shortcuts shows "Ask every time"
 */
public struct WidgetBrokenStationQuery: EntityQuery {
    public init() { }

    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}
Topic: App & System Services · SubTopic: Automation & Scripting · Tags: Siri and Voice, Shortcuts, Intents, App Intents
I've been stuck for days trying to figure out how to extract the full text of a Siri prompt that launches my app. We need to be able to get the text of the full command, such as "Hey Siri, buy dogfood...", so I can extract "dogfood" or anything else following "buy". The examples I am finding are (a) out of date or (b) incomplete. Right now we're using App Intents with Shortcuts, but we have to use a dedicated shortcut for each specific purchase, which is obviously very limiting.
How do I uninstall/delete Voice Control on macOS so that I can test my app for the case where the initial use of Voice Control causes it to be downloaded from Apple? Is there a folder in the macOS System or Library to delete to force a re-download of Voice Control?
My macOS app uses the older NSSpeechRecognizer to handle speech commands, but using NSSpeechRecognizer requires authorization via [SFSpeechRecognizer requestAuthorization...]. I do this, and on a macOS system it can trigger a download of Voice Control, the macOS feature. An alert appears with:
"A 390 MB download is required to use speech recognition features in MyApp. You may need to quit and open MyApp again after download completes."
I'd like to suggest a different microphone dot icon for Voice Control. I had customized Voice Control to turn on the flashlight, which caused confusion with the orange dot being switched on constantly.
I made an error in sending a security vulnerability report to Apple Security about the orange dot microphone being in always-on mode when the iPhone is unlocked.
iOS 18.3 beta.
This started in beta 18.2.
It happens only when plugged in to an external power bank.
Ideas or workarounds?
Or should I just wait for the next beta version?
Please include the line below in follow-up emails for this request.
Case-ID: 11089799
When using AVSpeechUtterance and setting it to play in Mandarin, if Siri is set to Cantonese on iOS 18, the utterance is played in Cantonese. There is no such issue on iOS 17 or 16.
1. let utterance = AVSpeechUtterance(string: textView.text)
   let voice = AVSpeechSynthesisVoice(language: "zh-CN")
   utterance.voice = voice
2. In the phone settings, Siri is set to Cantonese.
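A possible workaround to try (unverified, just a sketch): pick an explicit zh-CN voice from speechVoices() instead of constructing the voice from the language code alone, and keep a strong reference to the synthesizer.

import AVFoundation

let synthesizer = AVSpeechSynthesizer() // keep a strong reference in real code

let utterance = AVSpeechUtterance(string: "你好，欢迎使用这个应用")
// Prefer an explicit Mandarin voice; fall back to the language-code initializer.
if let mandarin = AVSpeechSynthesisVoice.speechVoices()
    .first(where: { $0.language == "zh-CN" }) {
    utterance.voice = mandarin
} else {
    utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")
}
synthesizer.speak(utterance)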
In this thread, I asked about adding parameters to App Shortcuts. The conclusion I've drawn so far is that for App Shortcuts there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt. Here is the scenario:
My app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected and is able to change the page based on the user selection within Shortcuts (or prompts for it if using the App Shortcut). What I'd like to do is allow the user to say "Siri, open [page] with [app name]."
So far, the closest I've come to understanding how this works is through the .intentdefinition file you can create (and SiriKit in general); however, the part that really confused me there is a button in the file editor that says "Convert to App Intent." To me, this means that I should be able to use the App Intent I've already authored and hook that into Siri, rather than making an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do.
What's the right way to define this behavior?
P.S. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable, so in the long run that would be what I'd do.
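For context on the scenario above, App Shortcut phrases can embed parameters whose type is an AppEnum (or AppEntity), which is how "Siri, open [page] with [app name]" would normally be expressed in a single utterance. Below is a rough sketch of just the AppShortcut declaration, assuming a hypothetical OpenPageIntent whose page parameter is the pages AppEnum; this fragment would sit inside the AppShortcutsProvider.

AppShortcut(
    intent: OpenPageIntent(),
    phrases: [
        // The spoken page name maps onto the AppEnum case.
        "Open \(\.$page) with \(.applicationName)",
        "Open \(\.$page) in \(.applicationName)"
    ],
    shortTitle: "Open Page",
    systemImageName: "doc.richtext"
)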
Topic: Machine Learning & AI · SubTopic: Apple Intelligence · Tags: Siri and Voice, SiriKit, App Intents, Apple Intelligence
I have an image based app with albums, except in my app, albums are known as galleries.
When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to change my existing gallery parameter to be called target in order to fit the predefined shape of this domain.
Previously, my intent was configured to display as “Open Gallery” with the description “Opens the selected Gallery” in the Shortcuts app. After conforming to the photos domain, it displays as “Open Album” with a description “Opens the Provided Album”.
Shortcuts is ignoring my configured title and description now. My code builds, but with the following build warnings:
Parameter argument title of a required Assistant schema intent parameter target should not be overridden
Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user.
FB16283840
Topic: Machine Learning & AI · SubTopic: Apple Intelligence · Tags: Siri and Voice, Shortcuts, App Intents, Apple Intelligence
Given that iOS 18.2 is out, and following the documentation and the WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) along with an AppIntent.
Questions:
Has anyone made this work on a real device?
In my case (code below): when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (and the app is not foregrounded); changing openAppWhenRun has no effect! Strangely, if my app was backgrounded before invocation and I foreground it afterwards, it has navigated to Search, it just wasn't foregrounded.
Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase; you get a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria.
Said otherwise: what's the gain in using @AssistantIntent(schema: .system.search) vs a regular AppIntent in this case?
Some code:
@available(iOS 18.2, *)
@AssistantIntent(schema: .system.search)
struct MySearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
        return .result()
    }
}
Along with this shortcut in the AppShortcutsProvider:
AppShortcut(
    intent: MySearchIntent(),
    phrases: [
        "Search \(.applicationName)"
    ],
    shortTitle: "Search",
    systemImageName: "magnifyingglass"
)
Topic: App & System Services · SubTopic: General · Tags: Siri and Voice, Shortcuts, App Intents, Apple Intelligence
We want to add the following to our iOS mobile app.
AirPods announce push notifications, which is working.
Now we want to use the voice command "Reply to this" to send a reply to that notification, but Siri says this is not supported in our app.
So basically we need to use the feature: Listen and respond to messages with AirPods.
Do we need to add any integration inside the app for this, or will it work directly with the Siri settings?
Is it possible to do this in a non-messaging app?
Is it possible to do this without syncing contacts?
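For what it's worth (a sketch of what is typically involved, not a confirmed answer): replying to announced notifications generally relies on the app posting communication notifications and adopting the SiriKit messaging intents, rather than on a Siri setting alone. The fragment below shows the communication-notification half, updating incoming notification content from a donated INSendMessageIntent; all identifiers and display names are placeholders, and the communication notifications capability is assumed to be enabled.

import Intents
import UserNotifications

// Hedged sketch: mark an incoming push as a communication notification by
// donating an INSendMessageIntent and updating the content from it.
func communicationContent(from original: UNNotificationContent) throws -> UNNotificationContent {
    let sender = INPerson(
        personHandle: INPersonHandle(value: "sender-id", type: .unknown),
        nameComponents: nil,
        displayName: "Sender Name",
        image: nil,
        contactIdentifier: nil,
        customIdentifier: "sender-id"
    )

    let intent = INSendMessageIntent(
        recipients: nil,
        outgoingMessageType: .outgoingMessageText,
        content: original.body,
        speakableGroupName: nil,
        conversationIdentifier: "conversation-id",
        serviceName: nil,
        sender: sender,
        attachments: nil
    )

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.direction = .incoming
    interaction.donate(completion: nil)

    // Produces content the system treats as a communication notification,
    // which is what the AirPods announce-and-reply flow keys off.
    return try original.updating(from: intent)
}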