Siri and Voice


Help users quickly accomplish tasks related to your app using just their voice.

Posts under Siri and Voice tag

66 Posts


App Intent Parameter (AppEntity) not registering
I'm currently testing this on a physical device (iPhone 12 Pro Max, iOS 26). Through Shortcuts, I know for a fact that I am able to successfully trigger the perform() code to do what's needed. In addition, if I just tell Siri the phrase without my unit parameter and it asks me which unit, I am able to, once again, successfully call my perform(). The problem is that with any of my phrases that include my unit, Siri either just opens my application or says "I can't understand." Here is my sample code.

My entity:

```
import Foundation
import AppIntents

struct Unit: Codable, Identifiable {
    let nickname: String
    let ipAddress: String
    let id: String
}

struct UnitEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(
            name: LocalizedStringResource("Unit", table: "AppIntents")
        )
    }

    static let defaultQuery = UnitEntityQuery()

    // Unique identifier
    var id: Unit.ID

    // @Property allows this data to be available to Shortcuts, Siri, etc.
    @Property var name: String

    // By not including @Property, this data is NOT used for queries.
    var ipAddress: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    init(Unit: Unit) {
        self.id = Unit.id
        self.ipAddress = Unit.ipAddress
        self.name = Unit.nickname
    }
}
```

My query:

```
struct UnitEntityQuery: EntityQuery {
    func entities(for identifiers: [UnitEntity.ID]) async throws -> [UnitEntity] {
        print("[UnitEntityQuery] Query for IDs \(identifiers)")
        return UnitManager.shared.getUnitUnits()
            .map { UnitEntity(Unit: $0) }
    }

    func suggestedEntities() async throws -> [UnitEntity] {
        print("[UnitEntityQuery] Request for suggested entities.")
        return UnitManager.shared.getUnitUnits()
            .map { UnitEntity(Unit: $0) }
    }
}
```

My UnitManager:

```
class UnitManager {
    static let shared = UnitManager()
    private init() {}

    var allUnits: [UnitEntity] {
        getUnitUnits().map { UnitEntity(Unit: $0) }
    }

    func getUnitUnits() -> [Unit] {
        guard let jsonString = UserDefaults.standard.string(forKey: "UnitUnits"),
              let data = jsonString.data(using: .utf8) else {
            return []
        }
        do {
            return try JSONDecoder().decode([Unit].self, from: data)
        } catch {
            print("Error decoding units: \(error)")
            return []
        }
    }

    func contactUnit(unit: UnitEntity) async -> Bool {
        // Do things here...
    }
}
```

My AppIntent:

```
import AppIntents

struct TurnOnUnit: AppIntent {
    static let title: LocalizedStringResource = "Turn on Unit"
    static let description = IntentDescription("Turn on an Unit")

    static var parameterSummary: some ParameterSummary {
        Summary("Turn on \(\.$UnitUnit)")
    }

    @Parameter(title: "Unit Unit", description: "The Unit Unit to turn on")
    var UnitUnit: UnitEntity

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // ... My code here
    }
}
```

And my AppShortcutsProvider:

```
import Foundation
import AppIntents

struct UnitShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: TurnOnUnit(),
            phrases: [
                "Start an Unit with \(.applicationName)",
                "Using \(.applicationName), turn on my \(\.$UnitUnit)",
                "Turn on my \(\.$UnitUnit) with \(.applicationName)",
                "Start my \(\.$UnitUnit) with \(.applicationName)",
                "Start \(\.$UnitUnit) with \(.applicationName)",
                "Start \(\.$UnitUnit) in \(.applicationName)",
                "Start \(\.$UnitUnit) using \(.applicationName)",
                "Trigger \(\.$UnitUnit) using \(.applicationName)",
                "Activate \(\.$UnitUnit) using \(.applicationName)",
                "Volcano \(\.$UnitUnit) using \(.applicationName)",
            ],
            shortTitle: "Turn on unit",
            systemImageName: "bolt.fill"
        )
    }
}
```
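Editor's note, a sketch of a common fix rather than a confirmed diagnosis: Siri can only recognize spoken entity values that have been published to it, and the generated AppShortcutsProvider method updateAppShortcutParameters() is what refreshes those values from suggestedEntities(). If it is never called after the unit list changes, phrases that embed \(\.$UnitUnit) can fail exactly as described.

```
// Sketch: re-publish phrase parameter values whenever the saved units change.
// The call site (after writing "UnitUnits" to UserDefaults, and once at app
// launch) is an assumption; updateAppShortcutParameters() is the real,
// system-generated static method on AppShortcutsProvider.
func unitsDidChange() {
    UnitShortcuts.updateAppShortcutParameters()
}
```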
Replies: 1 · Boosts: 0 · Views: 28 · Activity: 1d
Apple Intelligence language
I found what might be a bug with enabling Apple Intelligence when switching languages. When my iPhone's language is set to Catalan, Apple Intelligence is disabled because it is not available for that language. Switching to Spanish doesn't activate it: it still shows the same unavailability message, this time saying it's not available in Spanish (which is not true). However, it is enabled after the phone is rebooted. From this point, the bug becomes even weirder. With the iPhone language set to Spanish and Apple Intelligence on, I switch the language to Catalan, and the feature remains enabled. After I ask a query in Catalan, it surprisingly understands it and works, but then it gets disabled. Apart from that, as user feedback, I would love to be able to activate Apple Intelligence in an available language other than my device's language. That's how I always used Siri (iPhone in Catalan, Siri in Spanish). Thanks!
Replies: 2 · Boosts: 1 · Views: 902 · Activity: 6d
Can my users get Siri to use my app without specifying the app name?
I have a food logging app. I want my users to be able to say something like: "Hey Siri, log chicken and rice for lunch." But AppShortcutsProvider forces me to add the name of the app to the phrase, so it becomes: "Hey Siri, log chicken and rice for lunch in FoodLogApp." After running a quick survey, I've found that many users dislike having to say the name of the app; it makes the phrase too cumbersome. My question is: does Apple have a plan for 2026 that would let users converse with Siri and apps more naturally, without having to say the name of the app? If so, is it already in beta, and can you point me towards it?

```
@available(iOS 17.0, *)
struct LogMealIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Meal"
    static var description = IntentDescription("Log a meal")

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

@available(iOS 17.0, *)
struct LogMealShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMealIntent(),
            phrases: [
                "Log chicken and rice for lunch in \(.applicationName)",
            ],
            shortTitle: "Log meal",
            systemImageName: "mic.fill"
        )
    }
}
```
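Editor's note: on the current API the \(.applicationName) token is mandatory in every App Shortcut phrase, so the name can't be dropped entirely. What can shorten the spoken phrase is declaring synonyms under the INAlternativeAppNames Info.plist key, which the \(.applicationName) token also matches, and offering several phrasings. A hedged sketch reusing the intent above (the synonym "FoodLog" is an assumption):

```
// Sketch: assumes a short synonym such as "FoodLog" is declared under the
// INAlternativeAppNames key in Info.plist; \(.applicationName) matches it too.
struct LogMealShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMealIntent(),
            phrases: [
                "Log chicken and rice for lunch in \(.applicationName)",
                "Log a meal in \(.applicationName)",
                "Log lunch with \(.applicationName)",
            ],
            shortTitle: "Log meal",
            systemImageName: "mic.fill"
        )
    }
}
```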
Replies: 1 · Boosts: 0 · Views: 41 · Activity: 3w
INStartCallIntent requires unlock when device is face down with AirPods
When my Intents extension resolves an INStartCallIntent and returns .continueInApp while the device is locked, the call does not proceed unless the user unlocks the device. After unlocking, the app receives the NSUserActivity and CallKit proceeds normally. My expectation is that the native CallKit outgoing UI should appear and the call should start without requiring unlock, especially when using AirPods, where attention is not available.

Steps to reproduce:

1. Pair and connect AirPods.
2. Lock the iPhone.
3. Start music playback (e.g. Apple Music).
4. Place the phone face down (or cover the Face ID sensors so attention isn't available).
5. Say: "Hey Siri, call Tommy with DiscoMonday" (DiscoMonday is my app's name).

Observed behavior: Music mutes briefly. Siri says "Calling Tommy with DiscoMonday." The lock screen shows "Require Face ID / passcode." After several seconds, music resumes. The app is not launched, no NSUserActivity is delivered, and no CXStartCallAction occurs. With the phone face up, the same phrase launches the app, triggers CXStartCallAction, and the call proceeds via CallKit after Face ID.

Expected behavior: From the lock screen, Siri should hand off the INStartCallIntent to the app, which immediately requests a CXStartCallAction and drives the CallKit UI (reportOutgoingCall(...startedConnectingAt:) → ...connectedAt:), without requiring device unlock, regardless of orientation or attention availability when AirPods are connected.
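For context, a minimal sketch of the extension-side handler the post describes (the standard SiriKit shape; the app-side CallKit code is omitted):

```
import Intents

class StartCallIntentHandler: NSObject, INStartCallIntentHandling {
    func handle(intent: INStartCallIntent,
                completion: @escaping (INStartCallIntentResponse) -> Void) {
        // Hand off to the app, which is expected to request a CXStartCallAction
        // from CallKit when it receives this NSUserActivity.
        let activity = NSUserActivity(
            activityType: String(describing: INStartCallIntent.self))
        completion(INStartCallIntentResponse(code: .continueInApp,
                                             userActivity: activity))
    }
}
```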
Replies: 1 · Boosts: 0 · Views: 127 · Activity: 4w
AppIntent, StartWorkoutIntent, and Siri
I'm a bit confused as to what we're supposed to be doing to support starting a workout using Siri in iOS/watchOS 26. On one hand, I see a big push to move towards App Intents and shortcuts rather than SiriKit. On the other hand, I see that some of the things I would expect to work with App Intents well... don't work. BUT - I'm also not sure it isn't just developer error on my part. Here are some assertions that I'm hoping someone more skilled and knowledgeable can correct me on:

1. Currently, StartWorkoutIntent only serves the Action button on the Watch Ultra. It cannot be used to register Shortcuts, nor does Siri respond to it.
2. I can use types conforming to AppIntent to create shortcuts, but this requires an additional permission to run a shortcut if a user starts a workout with Siri.
3. AppIntent shortcuts require the user to say "Start a workout in [app name]"; if the user leaves out the "in [app name]" part, Siri will not prompt the user to select my app.
4. If I want to allow the user to simply say "Start a workout" and have Siri prompt the user for which app to use, I must currently use the older SiriKit to do so.

Are these assertions correct, or am I just implementing something incorrectly? Using the latest Xcode 26 beta, for what it is worth.
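For reference, the SiriKit path mentioned in the last assertion looks roughly like this. A sketch only, not a confirmation that it is the intended route on iOS/watchOS 26:

```
import Intents

// SiriKit workout handler; .continueInApp asks the system to open the app
// so it can actually start the workout session.
class StartWorkoutIntentHandler: NSObject, INStartWorkoutIntentHandling {
    func handle(intent: INStartWorkoutIntent,
                completion: @escaping (INStartWorkoutIntentResponse) -> Void) {
        let activity = NSUserActivity(
            activityType: String(describing: INStartWorkoutIntent.self))
        completion(INStartWorkoutIntentResponse(code: .continueInApp,
                                                userActivity: activity))
    }
}
```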
Replies: 1 · Boosts: 1 · Views: 106 · Activity: Aug ’25
How to find Siri response window by bundle id
Hi experts, I want to find the Siri response window by bundle ID and use it for checking or printing. Here is my example code:

```
XCUIDevice.shared.siriService.activate(voiceRecognitionText: "call mom")
let siriApp = XCUIApplication(bundleIdentifier: "***")
// Print out text from siriApp,
// expected print: "Sorry, I can't make a phone call with your iPhone."
```

What should I put in place of ***? I tried "com.apple.SiriViewService", "com.apple.siri.velocity", and "com.apple.springboard", but nothing worked. Any suggestion appreciated, thanks!
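One way to discover which process actually hosts the response UI is to dump each candidate's element tree and search it. A debugging sketch; the hosting process varies across iOS versions, so treat the bundle IDs as guesses rather than an answer:

```
import XCTest

final class SiriResponseDiscoveryTests: XCTestCase {
    func testDumpSiriHosts() {
        XCUIDevice.shared.siriService.activate(voiceRecognitionText: "call mom")

        // Candidate hosting processes; dump each tree, then query it.
        for bundleID in ["com.apple.springboard", "com.apple.SiriViewService"] {
            let host = XCUIApplication(bundleIdentifier: bundleID)
            print("=== \(bundleID) ===")
            print(host.debugDescription) // full element tree

            let match = host.staticTexts.containing(
                NSPredicate(format: "label CONTAINS[c] %@", "call")).firstMatch
            if match.waitForExistence(timeout: 5) {
                print("Found response in \(bundleID): \(match.label)")
            }
        }
    }
}
```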
Replies: 1 · Boosts: 0 · Views: 47 · Activity: Aug ’25
SwiftUI App Intent throws error when using requestDisambiguation with @Parameter property wrapper
I'm implementing an App Intent for my iOS app that helps users plan trip activities. It only works when run as a shortcut, not via voice through Siri. There are two issues:

1. The ShortcutsTripEntity will only accept a voice input for one specific trip, not the others.
2. I'm stuck with a thrown error when trying to use requestDisambiguation() on the activity day @Parameter property.

How do I rectify these issues? This is blocking me from completing a critical feature that lets users quickly plan activities through Siri and Shortcuts.

Expected behavior for trip input: the intent should make Siri accept the spoken trip input from any of the options.
Actual behavior for trip input: Siri only accepts the same trip when spoken, but accepts any when selected by click/touch.

Expected behavior for day input: Siri should accept the spoken selected option.
Actual behavior for day input: Siri only accepts an input by click/touch, and throws an error at runtime.

I'm happy to provide more code, but here's the relevant part:

```
struct PlanActivityTestIntent: AppIntent {
    @Parameter(
        title: "Trip",
        description: "The trip to plan an activity for",
        default: ShortcutsTripEntity(id: UUID().uuidString, title: "Untitled trip"),
        requestValueDialog: "Which trip would you like to add an activity to?"
    )
    var tripEntity: ShortcutsTripEntity

    @Parameter(
        title: "Activity Title",
        description: "The title of the activity",
        requestValueDialog: "What do you want to do or see?"
    )
    var title: String

    @Parameter(
        title: "Activity Day",
        description: "Activity Day",
        default: ShortcutsItineraryDayEntity(
            itineraryDay: .init(itineraryId: UUID(), date: .now),
            timeZoneIdentifier: "UTC"
        )
    )
    var activityDay: ShortcutsItineraryDayEntity

    func perform() async throws -> some ProvidesDialog {
        // ...other code...

        let tripsStore = TripsStore()

        // Load trips and map them to entities.
        try? await tripsStore.getTrips()
        let tripsAsEntities = tripsStore.trips.map { trip in
            let id = trip.id ?? UUID()
            let title = trip.title
            return ShortcutsTripEntity(id: id.uuidString, title: title, trip: trip)
        }

        // Ask the user to select a trip. This line doesn't accept a voice
        // answer. Why?
        let selectedTrip = try await $tripEntity.requestDisambiguation(
            among: tripsAsEntities,
            dialog: .init(
                full: "Which of the \(tripsAsEntities.count) trips would you like to add an activity to?",
                supporting: "Select a trip",
                systemImageName: "safari.fill"
            )
        )

        // This line throws an error.
        let selectedDay = try await $activityDay.requestDisambiguation(
            among: daysAsEntities,
            dialog: "Which day would you like to plan an activity for?"
        )
    }
}
```

(Note: the original post declared the "Activity Day" parameter twice; the duplicate declaration has been merged above.)
Replies: 0 · Boosts: 0 · Views: 130 · Activity: Jul ’25
Siri Intent: 'Siri, count for '
Hi, I'm developing an app which, just like the Clock app, uses multiple counters. I want to speak Siri commands such as "Siri, count for one hour" ('count' is the alternative app name). My AppIntent has a parameter, and Siri understands if I say "Siri, count" and asks for the duration in a separate step. It runs fine, but I can't figure out how to run the command with the duration specified upfront, without any subsequent questions from Siri. The Clock app has this functionality, so it can be done.

```
struct CounterIntent: AppIntent { // struct name elided in the original post
    // title
    // perform()

    @Parameter(title: "Duration")
    var minutes: Measurement<UnitDuration>
}
```

I have a struct ShortcutsProvider: AppShortcutsProvider, but phrases accept only parameters of type AppEnum or AppEntity.
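One workaround sketch (an assumption, not necessarily how the Clock app does it): since App Shortcut phrases only take AppEnum or AppEntity parameters, expose a fixed set of durations as an AppEnum so the value can be spoken inside the phrase itself.

```
import AppIntents

// Hypothetical fixed duration choices usable inside an App Shortcut phrase.
enum CountDuration: String, AppEnum {
    case fifteenMinutes
    case oneHour

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Duration")
    static var caseDisplayRepresentations: [CountDuration: DisplayRepresentation] = [
        .fifteenMinutes: DisplayRepresentation(title: "15 minutes"),
        .oneHour: DisplayRepresentation(title: "one hour"),
    ]
}
```

With a @Parameter of this type on the intent, a phrase like "\(.applicationName) for \(\.$duration)" could match "count for one hour" in one shot, since 'count' is the alternative app name.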
Replies: 1 · Boosts: 0 · Views: 90 · Activity: Jul ’25
Trouble implementing search via Siri
Hi, we're having trouble implementing search through Siri voice commands. We already did it successfully for audio playback using INPlayMediaIntentHandling, but for search, none of the available approaches works. Both INSearchForMediaIntentHandling and ShowInAppSearchResultsIntent never open the app in the first place. We tried various commands, but e.g. "Search for ..." sometimes opens the Apple Music app and sometimes shows a Google search widget. Our app is never taken into consideration for providing any results. We implemented all steps mentioned in the WWDC videos and documentation (e.g. https://developer.apple.com/documentation/appintents/making-in-app-search-actions-available-to-siri-and-apple-intelligence), but nothing seems to work. We're mainly testing on iOS 18 currently. Any idea why this is not working?
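For comparison, a minimal in-app search intent along the lines of the linked documentation. A sketch; the intent still has to surface through the app's App Shortcuts for Siri to route speech to it:

```
import AppIntents

// Sketch of an in-app search intent, following the linked doc's shape.
struct SearchInAppIntent: ShowInAppSearchResultsIntent {
    static let title: LocalizedStringResource = "Search"
    // The scopes Siri can map phrases like "search for X in <app>" onto.
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        let term = criteria.term
        // Navigate to the app's search UI with `term` here.
        return .result()
    }
}
```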
Replies: 0 · Boosts: 0 · Views: 49 · Activity: Jul ’25
View shown by App Intent with ShowsSnippetView doesn't adapt to dark mode
I have an App Intent that conforms to ShowsSnippetView and returns a view that is shown in the Siri interface after the shortcut runs. The view simply consists of a VStack with a Text element, with no special styling. When my device is set to dark mode, the view doesn't adapt: the text is black, but the background of the Siri interface is a transparent dark gray, which makes the text almost unreadable. The text should be white in dark mode. The colorScheme environment value inside the view corresponds to light mode, even though the device is set to dark mode. This is most likely a bug in iOS.
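A possible stopgap while this behaves like a bug (an untested sketch): since the snippet's colorScheme environment is stuck on .light, read the system's interface style directly instead of relying on the inherited scheme. Whether UIScreen's trait collection reflects the true appearance inside the Siri host is an assumption.

```
import SwiftUI

struct SnippetText: View {
    let message: String

    // Assumption: UIScreen's traits track the real system appearance even
    // when the snippet's own environment reports light mode.
    private var systemIsDark: Bool {
        UIScreen.main.traitCollection.userInterfaceStyle == .dark
    }

    var body: some View {
        VStack {
            Text(message)
                .foregroundStyle(systemIsDark ? Color.white : Color.black)
        }
    }
}
```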
Replies: 14 · Boosts: 0 · Views: 150 · Activity: Jun ’25
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I'm experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case, what you might call "patient zero." I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services, including FaceTime, Apple Music, and News, stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple's efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction, perhaps between the Neural Engine and the new Wi-Fi chip, rather than a purely software problem. I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution.

I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple's mission of inclusion. Thank you for your attention to this matter.

Sincerely,
Donald Spencer Kirby
Dayton, Ohio
Replies: 2 · Boosts: 0 · Views: 152 · Activity: Jun ’25
App Shortcuts - No Flexible Matching Assets
My app uses App Intents to create App Shortcuts. When I build and run my app in Xcode, the App Shortcuts Preview tool (under Product menu) shows the following message: No Flexible Matching Assets This target is for a platform which is not supported by Flexible Matching or does not have Flexible Matching enabled. All of my project's targets are iPhone only with a minimum deployment of 18.0. In the build settings for this project, Enable App Shortcuts Flexible Matching is set to Yes. (build settings reference) Any guidance on how to troubleshoot this? Thank you!
Replies: 0 · Boosts: 0 · Views: 95 · Activity: Jun ’25
Best Practice for Confirming Siri Shortcuts Availability Before Prompting User Interaction
I'm developing an iOS app that uses Siri Shortcuts to enhance the user experience. Currently, I have implemented functionality that allows users to perform certain actions via Siri Shortcuts. My team wants to improve the user experience by giving an instructional audio prompt (e.g., "say 'Hey Siri, [action name]' if you want to [perform action]") to users. However, we want to ensure this prompt is only played when the user has already enabled Siri Shortcuts. The challenge is determining whether Siri Shortcuts are properly enabled before suggesting their use. We want to avoid situations where users follow our audio instructions to use Siri, only to discover that Siri Shortcuts aren't properly configured on their device. Since we're using Siri Shortcuts for this feature, the standard requestSiriAuthorization(_:) method doesn't apply to our use case (the documentation at https://developer.apple.com/documentation/sirikit/requesting-authorization-to-use-siri says "You don't need to request authorization if your app only supports actions in the Shortcuts app."). What is the recommended approach to verify that Siri Shortcuts are properly enabled before prompting users to interact with them? Is there a reliable way to check this status (ideally the boolean state of the app's Siri toggle in Settings) programmatically? Thank you for your assistance.
Replies: 6 · Boosts: 0 · Views: 123 · Activity: Jun ’25
EntityStringQuery does not show variable menu in Shortcuts app
(Public dupe of FB16477656)

The Shortcuts app allows you to parameterize the input for an action using variables or "Ask Every Time". This option DOES NOT show when conforming my AppEntity.defaultQuery struct to EntityStringQuery, but it DOES show when conforming to EntityQuery.

As discussed in this forum post (or FB13253161), my AppEntity.defaultQuery HAS TO conform to EntityStringQuery to allow searching by String from Siri voice input.

To summarize:

With EntityQuery: my intent looks like it supports variables via the Shortcuts app, but Siri ends up in an endless loop because there is no entities(matching:) function. This does let me choose an item via the Shortcuts app UI.

With EntityStringQuery: my intent does not support variables via the Shortcuts app, and I am not able to choose an item via the Shortcuts app UI.

Even weirder: if I set up the shortcut using a build with EntityQuery and then install another build with EntityStringQuery, it works as expected.

Code:

```
/* Works with Siri to find a match, doesn't show "Ask Every Time" */
public struct WidgetStationQuery: EntityStringQuery {
    public init() { }

    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}

/* DOES NOT work with Siri to find a match, but Shortcuts shows "Ask Every Time" */
public struct WidgetBrokenStationQuery: EntityQuery {
    public init() { }

    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}
```
Replies: 2 · Boosts: 0 · Views: 366 · Activity: Jun ’25
Phonetic vs Pronunciation contacts fields and Siri
The Contacts app has fields for Phonetic and Pronunciation. My app adds phonetic data to the phonetic field to help Siri better understand contacts stored in Greek, Cyrillic, or Georgian. However, using the phonetic field causes the sort order of contacts to be messed up. For example, Greek Β (beta) is represented phonetically as a V sound, resulting in a completely incorrect sort order. The pronunciation field doesn't seem to affect the sort order, but I'm not sure what it does or should do.

My questions are:

1. Do we understand the difference between phonetic and pronunciation, and how Siri actively uses them?
2. If the phonetic field is the correct one to use, how can we raise a feature request with Apple to add an option to sort contacts based on phonetic fields or not?

Here's a test you can try:

1. Create a new contact with the following details: First name: test; Last name: test; Phonetic first name: Billy; Phonetic last name: Idol.
2. Ask Siri to show the contact Billy Idol. It will return the "test test" contact.
3. Switch from the phonetic to the pronunciation fields. Now Siri won't find Billy Idol.
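For anyone reproducing this programmatically: the phonetic fields are exposed by the Contacts framework. A sketch that builds the test contact from the repro steps (it assumes Contacts access has already been granted):

```
import Contacts

// Creates the "test test" contact with phonetic names "Billy Idol".
func makeTestContact() throws {
    let contact = CNMutableContact()
    contact.givenName = "test"
    contact.familyName = "test"
    contact.phoneticGivenName = "Billy"   // Phonetic first name
    contact.phoneticFamilyName = "Idol"   // Phonetic last name

    let request = CNSaveRequest()
    request.add(contact, toContainerWithIdentifier: nil)
    try CNContactStore().execute(request)
}
```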
Replies: 5 · Boosts: 0 · Views: 96 · Activity: May ’25
Privacy - Siri Usage Description being reset to default text "Describe why your app needs Siri access" on generating archive
I have an iOS app with CarPlay enabled. I have the Siri capability, and the feature has been tested in a car; the voice commands work perfectly fine. However, I am facing a weird issue, described below. The key NSSiriUsageDescription is set to custom text in Info.plist. After generating an archive, I exported and checked the package contents, in which the key NSSiriUsageDescription had been reset to the default text ("Describe why your app needs Siri access") in Info.plist. I do not have any dynamic build process that writes to Info.plist. Only the Siri key is being reset; the rest of the keys, like the camera/location permissions, are intact. Kindly suggest what needs to be done at my end.
Replies: 0 · Boosts: 0 · Views: 55 · Activity: May ’25
AppEntity with @Parameter Options Works in Shortcuts App but Not with Siri
I'm working with AppIntents and AppEntity to integrate my app's data model into Shortcuts and Siri. In the example below, I define a custom FoodEntity and use it as a @Parameter in an AppIntent. I'm providing dynamic options for this parameter via an optionsProvider. In the Shortcuts app, everything works as expected: when the user runs the shortcut, they get a list of food options (from the dynamic provider) to select from. However, in Siri, the experience is different. Instead of showing the list of options, Siri asks the user to say the name of the food, and then tries to match it using EntityStringQuery. I originally assumed this might be a design decision to allow hands-free use with voice, but I found that if you use an AppEnum instead, Siri does present a tappable list of options. So now I'm wondering: why the difference? Is there a way to get the @Parameter with AppEntity + optionsProvider to show a tappable list in Siri like it does in Shortcuts or with an AppEnum? Any clarification on how EntityQuery.suggestedEntities() and DynamicOptionsProvider interact with Siri would be appreciated!

```
struct CaloriesShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddCaloriesInteractive(),
            phrases: [
                "Add to \(.applicationName)"
            ],
            shortTitle: "Calories",
            systemImageName: "fork"
        )
    }
}

struct AddCaloriesInteractive: AppIntent {
    static var title: LocalizedStringResource = "Add to calories log"
    static var description = IntentDescription("Add Calories using Shortcuts.")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Calorie Entry SUMMARY")
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(stringLiteral: "Add to calorie log")
    }

    @Dependency
    private var persistenceManager: PersistenceManager

    @Parameter(title: LocalizedStringResource("Food"), optionsProvider: FoodEntityOptions())
    var foodEntity: FoodEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: .init("Added \(foodEntity.name) to calorie log"))
    }
}

struct FoodEntity: AppEntity {
    static var defaultQuery = FoodEntityQuery()

    @Property var name: String
    @Property var calories: Int

    init(name: String, calories: Int) {
        self.name = name
        self.calories = calories
    }

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Calorie Entry")
    }

    static var typeDisplayName: LocalizedStringResource = "Calorie Entry"

    var displayRepresentation: AppIntents.DisplayRepresentation {
        DisplayRepresentation(title: .init(stringLiteral: name), subtitle: "\(calories)")
    }

    var id: String { return name }
}

struct FoodEntityQuery: EntityQuery {
    func entities(for identifiers: [FoodEntity.ID]) async throws -> [FoodEntity] {
        var result = [FoodEntity]()
        for identifier in identifiers {
            if let entity = FoodDatabase.allEntities().first(where: { $0.id == identifier }) {
                result.append(entity)
            }
        }
        return result
    }

    func suggestedEntities() async throws -> [FoodEntity] {
        return FoodDatabase.allEntities()
    }
}

extension FoodEntityQuery: EntityStringQuery {
    func entities(matching string: String) async throws -> [FoodEntity] {
        return FoodDatabase.allEntities().filter {
            $0.name.localizedCaseInsensitiveCompare(string) == .orderedSame
        }
    }
}

struct FoodEntityOptions: DynamicOptionsProvider {
    func results() async throws -> ItemCollection<FoodEntity> {
        ItemCollection {
            ItemSection("Section 1") {
                for entry in FoodDatabase.allEntities() {
                    entry
                }
            }
        }
    }
}

struct FoodDatabase {
    // Fake data
    static func allEntities() -> [FoodEntity] {
        [
            FoodEntity(name: "Orange", calories: 2),
            FoodEntity(name: "Banana", calories: 2)
        ]
    }
}
```
Replies: 0 · Boosts: 1 · Views: 68 · Activity: May ’25
Intents UI Extension automatically dismisses
I am working on implementing a new Intents UI Extension and have noticed that when it is triggered via the "Hey Siri" voice command, the intent dismisses after a few seconds. However, if it is launched from the Shortcuts app, the intent remains active and does not dismiss automatically. Additionally, I’ve observed that this behavior occurs on specific iOS versions, such as 17.5.1 or 17.7. On other versions, like 17.4.1 or 18.4, the intent persists as expected. Does Siri automatically close the intent based on its own logic? Could the iOS version be influencing this behavior? Given the requirement to make the intent persistent, is there any option or configuration available to achieve this?
Replies: 0 · Boosts: 0 · Views: 49 · Activity: Apr ’25
Use Siri to control parts of my CarPlay app
I am building a CarPlay navigation app and I would like it to be as hands-free as possible. I need a better understanding of how Siri can work with CarPlay, and whether the direction I need to go is App Intents or App Shortcuts. My goal is to have the user speak to Siri and say things like "open settings" or "zoom in map," and then call a func in my app to do what the user is asking. Does CarPlay support this? Do I need to use App Intents or App Shortcuts or something else?
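For a concrete starting point, here is a sketch of wiring one voice command to an app function via an App Intent plus App Shortcut. Names like MapController and zoomIn() are hypothetical, and note that App Shortcut phrases must include the app name token:

```
import AppIntents

// Hypothetical controller the intent calls into.
@MainActor
final class MapController {
    static let shared = MapController()
    func zoomIn() { /* adjust the CarPlay map template's zoom here */ }
}

struct ZoomInIntent: AppIntent {
    static var title: LocalizedStringResource = "Zoom In Map"

    @MainActor
    func perform() async throws -> some IntentResult {
        MapController.shared.zoomIn()
        return .result()
    }
}

struct NavShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ZoomInIntent(),
            phrases: ["Zoom in map in \(.applicationName)"],
            shortTitle: "Zoom in",
            systemImageName: "plus.magnifyingglass"
        )
    }
}
```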
Replies: 2 · Boosts: 0 · Views: 78 · Activity: Apr ’25