Siri and Voice


Help users quickly accomplish tasks related to your app using just their voice.

Posts under Siri and Voice tag

45 Posts


Parameter recognition on AppShortcuts invocation not consistent
While playing around with AppShortcuts I've been encountering problems getting the invocation phrase detected and/or the parameter recognized after the invocation phrase via Siri. I've found some solutions and explanations in other posts here (Siri not recognizing the parameter in the phrase & Inform iOS about AppShortcutsProvider), but I still have one issue, and it's about consistency. For context, I've defined the parameter to be an AppEntity whose query conforms to the EntityStringQuery protocol, in order to be able to fetch entities with the string given by Siri:

struct AnIntent: AppIntent {
    // other parts hidden for clarity
    @Parameter
    var entity: ModelEntity
}

For an invocation phrase akin to "Do something with [entity] in [app name]": if the user speaks the phrase with an entity previously donated via suggestedEntities(), the AppShortcut gets executed without problems. If the user speaks a phrase with no parameter, like "Do something with [app name]", they are asked to input the missing parameter; after they input one, it may or may not be recognized, and they may be asked for the parameter again, as if in a loop. This happens even if the parameter given is one that was donated. I've found that when this happens, the entities(matching string: String) function in the EntityQuery doesn't get called. The input can be one word, or sometimes two, and it still won't be called. In other words, entities(matching string: String) does not get called on every user parameter input.

Is this behavior correct? Do parameters have restrictions on length or anything else? Does Siri show the user suggested entities when asking for entity input? It doesn't on my end.

An additional question related to AppShortcuts: in an AppShortcut definition, where is the summary inside the parameter presentation used? I see it defined in the AppIntentsSampleApp for the GetTrailInfo intent but didn't find where it was used.
Replies: 0 · Boosts: 0 · Views: 112 · Apr ’25
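For reference, here is a minimal sketch of the kind of string-matched query the post describes, assuming the post's ModelEntity type exposes an id and a name, and using a hypothetical ModelStore; with EntityStringQuery, entities(matching:) is the hook Siri is expected to call with the spoken string:

import AppIntents

struct ModelEntityQuery: EntityStringQuery {
    // Called with the string Siri captured from the user; the post
    // reports this not firing on every disambiguation attempt.
    func entities(matching string: String) async throws -> [ModelEntity] {
        ModelStore.shared.all.filter {
            $0.name.localizedCaseInsensitiveContains(string)
        }
    }

    func entities(for identifiers: [ModelEntity.ID]) async throws -> [ModelEntity] {
        ModelStore.shared.all.filter { identifiers.contains($0.id) }
    }

    // Entities surfaced to Siri up front, so a full phrase can match directly.
    func suggestedEntities() async throws -> [ModelEntity] {
        ModelStore.shared.all
    }
}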
Siri Shortcuts of Siri Intent to Voice Control Parts of App
I am new to the idea of Siri Shortcuts and App Intents. What I want to do is use Siri to run a function in my app, such as saying to Siri "Zoom in map", which would then call a function in my app where I can zoom in the map. Similarly, I could say "Zoom out map" and it would call a function to zoom out my map. I do not need to share any sort of shortcut with the Shortcuts app. Can someone please point me in the right direction for what type of intents I need to use for this?
Replies: 0 · Boosts: 0 · Views: 220 · Apr ’25
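One possible shape for this, sketched with App Intents (ZoomInMapIntent and MapController are illustrative names, not from the post): define one AppIntent per map action and surface it through an AppShortcutsProvider phrase:

import AppIntents

struct ZoomInMapIntent: AppIntent {
    static let title: LocalizedStringResource = "Zoom In Map"
    // Zooming needs the map on screen, so bring the app forward.
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        MapController.shared.zoomIn() // hypothetical app-side hook
        return .result()
    }
}

struct MapShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ZoomInMapIntent(),
            phrases: ["Zoom in map in \(.applicationName)"],
            shortTitle: "Zoom In",
            systemImageName: "plus.magnifyingglass"
        )
    }
}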
Disambiguation for .system.search AppIntent
I'd like to display a list of items to disambiguate for a full-text search intent. Using the Apple AppIntentsSampleApp, I added TrailSearch.swift:

import AppIntents

@AssistantIntent(schema: .system.search)
struct TrailSearch: AppIntent {
    static let title: LocalizedStringResource = "Search Trail"
    static let description = IntentDescription("Search trail by name.",
                                               categoryName: "Discover",
                                               resultValueName: "Trail")

    @Parameter(title: "Trail")
    var criteria: StringSearchCriteria

    func perform() async throws -> some IntentResult & ReturnsValue<TrailEntity> {
        if criteria.term.isEmpty {
            throw $criteria.needsValueError(IntentDialog("need value"))
        }
        let trails = TrailDataManager.shared.trails { trail in
            trail.name.contains(criteria.term)
        }
        if trails.count > 1 {
            throw $criteria.needsDisambiguationError(among: trails.map { StringSearchCriteria(term: $0.name) })
        } else if let firstTrail = trails.first {
            return .result(value: TrailEntity(trail: firstTrail))
        }
        throw $criteria.needsValueError(IntentDialog("Nothing found"))
    }
}

Now when I type "trail", which matches several trails and thus lets us enter the disambiguation code path, the Shortcuts app just displays the dialog title but no disambiguation items to pick from. Is this by design or a bug? (filed as FB17412220)
Replies: 0 · Boosts: 0 · Views: 117 · Apr ’25
Intents UI Extension automatically dismisses
I am working on implementing a new Intents UI Extension and have noticed that when it is triggered via the "Hey Siri" voice command, the intent dismisses after a few seconds. However, if it is launched from the Shortcuts app, the intent remains active and does not dismiss automatically. Additionally, I’ve observed that this behavior occurs on specific iOS versions, such as 17.5.1 or 17.7. On other versions, like 17.4.1 or 18.4, the intent persists as expected. Does Siri automatically close the intent based on its own logic? Could the iOS version be influencing this behavior? Given the requirement to make the intent persistent, is there any option or configuration available to achieve this?
Replies: 0 · Boosts: 0 · Views: 124 · Apr ’25
Weird Siri response
I'll ask Siri "What is the weather?" and get a valid response. I'll then ask Siri to execute a shortcut my app has created, and I get "What is the order?" (a phrase nowhere in my app). I'll repeat the question about the weather, and I now get "What is the order?" ***?
Replies: 1 · Boosts: 0 · Views: 230 · Apr ’25

How to make Siri ask the user for inputs programmatically
I don't know if this is the appropriate forum for this. Answers I've found on the web point me towards intents, but somehow I couldn't make it work. I'm trying to activate Siri on CarPlay to ask the user for voice input and then make a search. Is this a custom intent capability, or is there another way?
Replies: 2 · Boosts: 0 · Views: 141 · Apr ’25

VoiceOver TextField doesn't read out all punctuation
I have a TextField and entered, for example, "sg?!". On the TextField I set the modifier speechAlwaysIncludesPunctuation(). But when I activate VoiceOver, the content of the TextField is read without the special characters being spoken. How can I fix this?
Replies: 1 · Boosts: 0 · Views: 110 · Apr ’25
Privacy - Siri Usage Description being reset to default text "Describe why your app needs Siri access" on generating archive
I have an iOS app that has CarPlay enabled. I have the Siri capability, and the feature has been tested in a car; the voice commands are working perfectly fine. However, I am facing a weird issue, as described below. The key NSSiriUsageDescription is set to custom text in Info.plist. After generating an archive, I exported and checked the package contents, in which the key NSSiriUsageDescription was reset to the default text ("Describe why your app needs Siri access") in Info.plist. I do not have any dynamic build process that writes to Info.plist. Only the Siri key is being reset; the rest of the keys, like camera/location permissions, are intact. Kindly suggest what needs to be done at my end.
Replies: 0 · Boosts: 0 · Views: 240 · May ’25
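A quick way to check what actually shipped in a given build (a debugging sketch, not a fix) is to read the key back from the built product at runtime, or to inspect the archived Info.plist with plutil -p:

import Foundation

// Logs the Siri usage string actually compiled into this build.
let siriUsage = Bundle.main.object(forInfoDictionaryKey: "NSSiriUsageDescription") as? String
print("NSSiriUsageDescription:", siriUsage ?? "<missing>")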
App Shortcuts - No Flexible Matching Assets
My app uses App Intents to create App Shortcuts. When I build and run my app in Xcode, the App Shortcuts Preview tool (under the Product menu) shows the following message:

No Flexible Matching Assets
This target is for a platform which is not supported by Flexible Matching or does not have Flexible Matching enabled.

All of my project's targets are iPhone-only with a minimum deployment of 18.0. In the build settings for this project, Enable App Shortcuts Flexible Matching is set to Yes. (build settings reference) Any guidance on how to troubleshoot this? Thank you!
Replies: 0 · Boosts: 0 · Views: 159 · Jun ’25
Siri UI returned to original design
Good morning all, has anyone encountered the issue of Siri returning to her original user interface on iOS 26? I'm trying to figure out the cause. I've sent feedback via the Feedback app. Just seeing if anyone else has the same issue.
Replies: 1 · Boosts: 0 · Views: 149 · Jun ’25
Siri Intent: 'Siri, count for [duration]'
Hi, I'm developing an app which, just like the Clock app, uses multiple counters. I want to speak Siri commands such as "Siri, count for one hour" ('count' is the alternative app name). My AppIntent has a parameter, and Siri understands if I say "Siri, count" and asks for the duration in a separate step. It runs fine, but I can't figure out how to run the command with the duration specified upfront, without any subsequent questions from Siri. The Clock app has this functionality, so it can be done.

// title
// perform()
@Parameter(title: "Duration")
var minutes: Measurement<UnitDuration>

I have a struct ShortcutsProvider: AppShortcutsProvider, but phrases accept only parameters of type AppEnum or AppEntity.
Replies: 1 · Boosts: 0 · Views: 322 · Jul ’25
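Because AppShortcut phrases only accept AppEnum or AppEntity parameters, one commonly suggested workaround (a sketch, not confirmed by the post) is to model a fixed set of durations as an AppEnum so the value can appear in the spoken phrase itself:

import AppIntents

enum CountDuration: String, CaseIterable, AppEnum {
    case fiveMinutes, thirtyMinutes, oneHour

    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Duration")
    static let caseDisplayRepresentations: [CountDuration: DisplayRepresentation] = [
        .fiveMinutes: DisplayRepresentation(title: "five minutes"),
        .thirtyMinutes: DisplayRepresentation(title: "thirty minutes"),
        .oneHour: DisplayRepresentation(title: "one hour")
    ]
}

struct CountForIntent: AppIntent {
    static let title: LocalizedStringResource = "Count For"

    @Parameter(title: "Duration")
    var duration: CountDuration

    func perform() async throws -> some IntentResult {
        // Start the counter for the chosen duration here.
        return .result()
    }
}

struct CounterShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // With "count" as the alternative app name, this can match
        // "Siri, count for one hour" in a single utterance.
        AppShortcut(
            intent: CountForIntent(),
            phrases: ["\(.applicationName) for \(\.$duration)"],
            shortTitle: "Count",
            systemImageName: "timer"
        )
    }
}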
Trouble implementing search via Siri
Hi, we're having trouble implementing search through Siri voice commands. We already did it successfully for audio playback using INPlayMediaIntentHandling. For search, none of the available ways works. Both INSearchForMediaIntentHandling and ShowInAppSearchResultsIntent never open the app in the first place. We tried various commands, but e.g. "Search for [term]" sometimes opens the Apple Music app and sometimes shows a Google search widget. Our app is never taken into consideration for providing any results. We implemented all steps mentioned in WWDC videos and documentation (e.g. https://developer.apple.com/documentation/appintents/making-in-app-search-actions-available-to-siri-and-apple-intelligence), but nothing seems to work. We're mainly testing on iOS 18 currently. Any idea why this is not working?
Replies: 0 · Boosts: 0 · Views: 249 · Jul ’25
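For comparison, a minimal ShowInAppSearchResultsIntent conformance along the lines of the linked article; the SearchNavigator call is a hypothetical app-side hook:

import AppIntents

struct ShowSearchResultsIntent: ShowInAppSearchResultsIntent {
    static let title: LocalizedStringResource = "Search"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hypothetical hook: route the spoken term into the app's search UI.
        SearchNavigator.shared.showResults(for: criteria.term)
        return .result()
    }
}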
SwiftUI App Intent throws error when using requestDisambiguation with @Parameter property wrapper
I'm implementing an App Intent for my iOS app that helps users plan trip activities. It only works when run as a shortcut, but not using voice through Siri. There are two issues:

1. The ShortcutsTripEntity will only accept a voice input for one specific trip but not others.
2. I'm stuck with a thrown error when trying to use requestDisambiguation() on the activity day @Parameter property.

How do I rectify these issues? This is blocking me from completing a critical feature that lets users quickly plan activities through Siri and Shortcuts.

Expected behavior for trip input: the intent should make Siri accept the spoken trip input from any of the options.
Actual behavior for trip input: Siri only accepts the same trip when spoken, but accepts any when selected by click/touch.
Expected behavior for day input: Siri should accept the spoken selected option.
Actual behavior for day input: Siri only accepts an input by click/touch, and yet throws an error at runtime.

I'm happy to provide more code, but here's the relevant part:

struct PlanActivityTestIntent: AppIntent {
    @Parameter(title: "Activity Day")
    var activityDay: ShortcutsItineraryDayEntity

    @Parameter(
        title: "Trip",
        description: "The trip to plan an activity for",
        default: ShortcutsTripEntity(id: UUID().uuidString, title: "Untitled trip"),
        requestValueDialog: "Which trip would you like to add an activity to?"
    )
    var tripEntity: ShortcutsTripEntity

    @Parameter(title: "Activity Title", description: "The title of the activity", requestValueDialog: "What do you want to do or see?")
    var title: String

    @Parameter(title: "Activity Day", description: "Activity Day", default: ShortcutsItineraryDayEntity(itineraryDay: .init(itineraryId: UUID(), date: .now), timeZoneIdentifier: "UTC"))
    var activityDay: ShortcutsItineraryDayEntity

    func perform() async throws -> some ProvidesDialog {
        // ...other code...
        let tripsStore = TripsStore()

        // Load trips and map them to entities.
        try? await tripsStore.getTrips()
        let tripsAsEntities = tripsStore.trips.map { trip in
            let id = trip.id ?? UUID()
            let title = trip.title
            return ShortcutsTripEntity(id: id.uuidString, title: title, trip: trip)
        }

        // Ask the user to select a trip. This line doesn't accept a voice answer. Why?
        let selectedTrip = try await $tripEntity.requestDisambiguation(
            among: tripsAsEntities,
            dialog: .init(
                full: "Which of the \(tripsAsEntities.count) trips would you like to add an activity to?",
                supporting: "Select a trip",
                systemImageName: "safari.fill"
            )
        )

        // This line throws an error.
        let selectedDay = try await $activityDay.requestDisambiguation(
            among: daysAsEntities,
            dialog: "Which day would you like to plan an activity for?"
        )
    }
}
Replies: 0 · Boosts: 0 · Views: 306 · Jul ’25
Can my users get siri to use my app without specifying the app name?
I have a food logging app. I want my users to be able to say something like: "Hey Siri, log chicken and rice for lunch". But AppShortcutsProvider is forcing me to add the name of the app to the phrase, so it becomes: "Hey Siri, log chicken and rice for lunch in FoodLogApp". After running a quick survey, I've found that many users dislike having to say the name of the app; it makes it too cumbersome. My question is: is there a plan from Apple for 2026 so that users can converse with Siri and apps more naturally, without having to say the name of the app? If so, is it already in beta, and can you point me towards it?

@available(iOS 17.0, *)
struct LogMealIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Meal"
    static var description: LocalizedStringResource = "Log a meal"

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

@available(iOS 17.0, *)
struct LogMealShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMealIntent(),
            phrases: [
                "Log chicken and rice for lunch in \(.applicationName)",
            ],
            shortTitle: "Log meal",
            systemImageName: "mic.fill"
        )
    }
}
Replies: 1 · Boosts: 0 · Views: 151 · Sep ’25
App Intent Parameter (AppEntity) not registering
I'm currently testing this on a physical device (12 Pro Max, iOS 26). Through Shortcuts, I know for a fact that I am able to successfully trigger the perform code to do what's needed. In addition, if I just tell Siri the phrase without my unit parameter and it asks me which unit, I am, once again, able to successfully reach my perform. The problem is that with any of my phrases that include my unit, Siri either just opens my application or says "I can't understand". Here is my sample code.

My entity:

import Foundation
import AppIntents

struct Unit: Codable, Identifiable {
    let nickname: String
    let ipAddress: String
    let id: String
}

struct UnitEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(
            name: LocalizedStringResource("Unit", table: "AppIntents")
        )
    }

    static let defaultQuery = UnitEntityQuery()

    // Unique identifier
    var id: Unit.ID

    // @Property allows this data to be available to Shortcuts, Siri, etc.
    @Property
    var name: String

    // By not including @Property, this data is NOT used for queries.
    var ipAddress: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(
            title: "\(name)"
        )
    }

    init(Unit: Unit) {
        self.id = Unit.id
        self.ipAddress = Unit.ipAddress
        self.name = Unit.nickname
    }
}

My query:

struct UnitEntityQuery: EntityQuery {
    func entities(for identifiers: [UnitEntity.ID]) async throws -> [UnitEntity] {
        print("[UnitEntityQuery] Query for IDs \(identifiers)")
        return UnitManager.shared.getUnitUnits()
            .map { UnitEntity(Unit: $0) }
    }

    func suggestedEntities() async throws -> [UnitEntity] {
        print("[UnitEntityQuery] Request for suggested entities.")
        return UnitManager.shared.getUnitUnits()
            .map { UnitEntity(Unit: $0) }
    }
}

UnitManager:

class UnitManager {
    static let shared = UnitManager()
    private init() {}

    var allUnits: [UnitEntity] {
        getUnitUnits().map { UnitEntity(Unit: $0) }
    }

    func getUnitUnits() -> [Unit] {
        guard let jsonString = UserDefaults.standard.string(forKey: "UnitUnits"),
              let data = jsonString.data(using: .utf8) else { return [] }
        do {
            return try JSONDecoder().decode([Unit].self, from: data)
        } catch {
            print("Error decoding units: \(error)")
            return []
        }
    }

    func contactUnit(unit: UnitEntity) async -> Bool {
        // Do things here...
    }
}

My AppIntent:

import AppIntents

struct TurnOnUnit: AppIntent {
    static let title: LocalizedStringResource = "Turn on Unit"
    static let description = IntentDescription("Turn on an Unit")

    static var parameterSummary: some ParameterSummary {
        Summary("Turn on \(\.$UnitUnit)")
    }

    @Parameter(title: "Unit Unit", description: "The Unit Unit to turn on")
    var UnitUnit: UnitEntity

    func perform() async throws -> some IntentResult & ProvidesDialog {
        //... My code here
    }
}

And my ShortcutsProvider:

import Foundation
import AppIntents

struct UnitShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: TurnOnUnit(),
            phrases: [
                "Start an Unit with \(.applicationName)",
                "Using \(.applicationName), turn on my \(\.$UnitUnit)",
                "Turn on my \(\.$UnitUnit) with \(.applicationName)",
                "Start my \(\.$UnitUnit) with \(.applicationName)",
                "Start \(\.$UnitUnit) with \(.applicationName)",
                "Start \(\.$UnitUnit) in \(.applicationName)",
                "Start \(\.$UnitUnit) using \(.applicationName)",
                "Trigger \(\.$UnitUnit) using \(.applicationName)",
                "Activate \(\.$UnitUnit) using \(.applicationName)",
                "Volcano \(\.$UnitUnit) using \(.applicationName)",
            ],
            shortTitle: "Turn on unit",
            systemImageName: "bolt.fill"
        )
    }
}
Replies: 1 · Boosts: 0 · Views: 173 · Oct ’25
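One thing worth checking, offered as a suggestion rather than a confirmed fix: phrases that embed an AppEntity parameter only match if the system knows the current parameter values, so the provider can be asked to re-register them whenever the stored units change:

// Hypothetical call site: run after persisting units to UserDefaults,
// so Siri re-resolves the \(\.$UnitUnit) phrase values from the query.
func unitsDidChange() {
    UnitShortcuts.updateAppShortcutParameters()
}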
How to record voice, auto-transcribe, translate (auto-detect input language), and play back translated audio on same device in iOS Swift?
Hi everyone 👋 I’m building an iOS app in Swift where I want to do the following:

- Record the user’s voice
- Transcribe the spoken sentence (speech-to-text)
- Auto-detect the spoken language
- Translate it to another language selected by the user (e.g., English → Spanish or Hindi → English)
- Speak back (text-to-speech) the translated text on the same device

Is it possible to record via the phone mic and play the transcribed voice through the headphones' audio?
Replies: 0 · Boosts: 0 · Views: 282 · Oct ’25
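A sketch of the Apple-framework endpoints for three of those steps (speech-to-text, language detection, and text-to-speech), assuming microphone and speech-recognition authorization were already granted; the translation step itself would need something like the separate Translation framework or a server:

import Speech
import AVFoundation
import NaturalLanguage

let synthesizer = AVSpeechSynthesizer() // keep a strong reference while speaking

// Speech-to-text from a recorded audio file.
func transcribe(fileURL: URL, completion: @escaping (String) -> Void) {
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    _ = SFSpeechRecognizer()?.recognitionTask(with: request) { result, _ in
        guard let result, result.isFinal else { return }
        completion(result.bestTranscription.formattedString)
    }
}

// Auto-detect the dominant language of the transcript, e.g. "hi" for Hindi.
func detectLanguage(of text: String) -> String? {
    NLLanguageRecognizer.dominantLanguage(for: text)?.rawValue
}

// Text-to-speech in the target language; output follows the current
// audio route, so it plays through headphones when they are connected.
func speak(_ text: String, languageCode: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: languageCode) // e.g. "es-ES"
    synthesizer.speak(utterance)
}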
Siri AppIntent phrasing
When my AppShortcut phrase is "Go \(\.$direction) with \(.applicationName)", everything works correctly; the AppIntent receives the parameter. But when my phrase is "What is my game \(\.$direction) with \(.applicationName)", an alert dialog pops up saying: "Hey Siri, what is my game tomorrow with {app name}. Do you want me to use ChatGPT to answer that?" The phrase is obviously heard correctly, and it's exactly what I've specified in the AppShortcut. Why isn't it being sent to my AppIntent?

import Foundation
import AppIntents

@available(iOS 17.0, *)
enum Direction: String, CaseIterable, AppEnum {
    case today, yesterday, tomorrow, next

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Direction")
    }

    static var caseDisplayRepresentations: [Direction: DisplayRepresentation] = [
        .today: DisplayRepresentation(title: "today", synonyms: []),
        .yesterday: DisplayRepresentation(title: "yesterday", synonyms: []),
        .tomorrow: DisplayRepresentation(title: "tomorrow", synonyms: []),
        .next: DisplayRepresentation(title: "next", synonyms: [])
    ]
}

@available(iOS 17.0, *)
struct MoveItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Item"

    @Parameter(title: "Direction")
    var direction: Direction

    func perform() async throws -> some IntentResult {
        // Logic to move item in the specified direction
        print("Moving item \(direction)")
        return .result()
    }
}

@available(iOS 17.0, *)
final class MyShortcuts: AppShortcutsProvider {
    static let shortcutTileColor = ShortcutTileColor.navy

    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MoveItemIntent(),
            phrases: [
                "Go \(\.$direction) with \(.applicationName)"
                // "What is my game \(\.$direction) with \(.applicationName)"
            ],
            shortTitle: "Test of direction parameter",
            systemImageName: "soccerball"
        )
    }
}
Replies: 3 · Boosts: 0 · Views: 601 · Oct ’25
Siri cuts off user's spoken words in German version
My app uses App Intents. When the user says "Prüfung der Bluetooth Funktion", the screen shows the whole phrase, but my app only receives "Bluetooth Funktion". This behaviour only happens in the German version; in the English version everything works well. Can anyone help? Why does the German version of Siri cut off my words?
Replies: 0 · Boosts: 0 · Views: 645 · Nov ’25
Siri media search unable to provide keyword
Hi, I am developing a music app. We have been using the Siri media search functionality for a while. We recently had a case where Siri would not provide a keyword for a search. When the user says "Play kid songs" (in Turkish, "çocuk şarkıları çal"), I see while debugging that mediaSearch.mediaName is nil. When the user says "Play kids" (in Turkish, "çocuklar çal"), a keyword is given and we can search for and play a related song. Normally I would think that Siri is somehow censoring the word "kid", but when I try the same voice search in Spotify, I get a children's song search result. I've read the documentation and searched the web but couldn't find any similar experience. What could be the cause? Is there an extra setting, or a different capability, such that Spotify can get a keyword out of this voice search but we can't?
Replies: 0 · Boosts: 0 · Views: 345 · Nov ’25
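For context, a minimal sketch of where that keyword arrives in an INPlayMediaIntentHandling handler; per the post, mediaSearch?.mediaName can already be nil before any of this app code runs:

import Intents

final class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // The raw keyword Siri extracted from the utterance.
        guard let keyword = intent.mediaSearch?.mediaName else {
            // The failing case from the post: no keyword was provided.
            completion([INPlayMediaMediaItemResolutionResult.unsupported()])
            return
        }
        let item = INMediaItem(identifier: keyword, title: keyword, type: .song, artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Defer playback to the app.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}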
Can Critical Alerts Trigger Text-to-Speech and Vibration in Background & Terminated State?
Hello all, I want to implement text-to-speech (TTS) and vibration functionality when a push notification arrives. In my app I am already using Critical Alerts, and the critical alert sound plays correctly in all app states. However, I need to confirm whether it is possible to trigger text-to-speech and custom vibration in all app states:

- Foreground
- Background
- Terminated (killed) state

My questions:

- Is it technically possible for iOS to run text-to-speech (using AVSpeechSynthesizer) when a critical alert notification arrives in the background or terminated state?
- Is it possible to trigger custom vibration patterns from a critical alert when the app is not running?
- If yes, can someone please provide guidance or sample code on how to implement this?
- If no, can Apple explain the limitation or provide documentation confirming that TTS and vibration cannot be triggered in background/killed states?

What works currently: TTS and vibration work only in the foreground when the app is active, and the critical alert sound works correctly in all states. I want confirmation on whether iOS supports background/terminated TTS and vibration, or whether this is a platform restriction even when using Critical Alerts. Thank you!
Replies: 1 · Boosts: 0 · Views: 394 · Dec ’25
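For reference, a sketch of the foreground path the post says already works, using the notification-center delegate; the background and terminated cases are exactly what the post is asking about and are not shown here:

import UserNotifications
import AVFoundation

final class NotificationSpeaker: NSObject, UNUserNotificationCenterDelegate {
    private let synthesizer = AVSpeechSynthesizer()

    // Called only while the app is in the foreground.
    func userNotificationCenter(_ center: UNUserNotificationCenter,
                                willPresent notification: UNNotification,
                                withCompletionHandler completionHandler:
                                    @escaping (UNNotificationPresentationOptions) -> Void) {
        let utterance = AVSpeechUtterance(string: notification.request.content.body)
        synthesizer.speak(utterance)
        completionHandler([.banner, .sound])
    }
}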