Handle requests for your app’s services from users using Siri or Maps.

SiriKit Documentation

Posts under SiriKit tag

71 results found
Post not yet marked as solved
32 Views

What Intents are appropriate for button actions

Hello, I'm trying to add Siri support to a video conferencing application. To start out, I would just like Siri to respond to a phrase like "Hey Siri, mute my {app's name} video" or "Hey Siri, mute my {app's name} audio", and the intent handler can simply kick off an IBAction that mutes the video or audio. Do I need to create custom intents for this purpose, or is there a standard intent that might work here?
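
A minimal sketch of the custom-intent route (not from the original post): "MuteVideoIntent" is a hypothetical intent assumed to be defined in an .intentdefinition file, for which Xcode generates the MuteVideoIntentHandling protocol and response type. If the intent is handled in-app (supported since iOS 14), the handler can simply notify the code behind the existing IBAction:

import Intents

// Hypothetical custom intent; MuteVideoIntent, MuteVideoIntentHandling, and
// MuteVideoIntentResponse are assumed to be generated from an .intentdefinition file.
class MuteVideoIntentHandler: NSObject, MuteVideoIntentHandling {
    func handle(intent: MuteVideoIntent, completion: @escaping (MuteVideoIntentResponse) -> Void) {
        // When the intent is handled in-app, a notification can reach the view
        // controller whose IBAction actually mutes the video.
        NotificationCenter.default.post(name: Notification.Name("MuteVideoRequested"), object: nil)
        completion(MuteVideoIntentResponse(code: .success, userActivity: nil))
    }
}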
Asked by Resonance. Last updated.
Post not yet marked as solved
38 Views

Siri Intent

Hello, I want to create a Siri intent that calls an API. I would like it to call the API without opening the app. Is that possible? If it is, how?
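
A minimal sketch of one way this can work, assuming a hypothetical RefreshDataIntent custom intent handled inside an Intents extension; the extension runs in the background, so the request completes without opening the app (the URL below is a placeholder):

import Intents

// Hypothetical custom intent generated from an .intentdefinition file.
class RefreshDataIntentHandler: NSObject, RefreshDataIntentHandling {
    func handle(intent: RefreshDataIntent, completion: @escaping (RefreshDataIntentResponse) -> Void) {
        guard let url = URL(string: "https://example.com/api/refresh") else { // placeholder endpoint
            completion(RefreshDataIntentResponse(code: .failure, userActivity: nil))
            return
        }
        URLSession.shared.dataTask(with: url) { _, _, error in
            // Report success or failure back to Siri once the call finishes.
            completion(RefreshDataIntentResponse(code: error == nil ? .success : .failure, userActivity: nil))
        }.resume()
    }
}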
Asked by Ezziddin. Last updated.
Post not yet marked as solved
67 Views

Siri Intent Query for INSearchForAccountsIntent

How do I phrase a Siri query for INSearchForAccountsIntent so that I get a value for accountNickname in the intent? I have tried different combinations of queries, but accountNickname alone is not being recognized. Example: "How much money is in my abc checking account?" Note: abc is the accountNickname, but Siri does not map it to accountNickname. Kindly suggest.
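
For reference, a minimal sketch of resolving that parameter, assuming a handler that adopts INSearchForAccountsIntentHandling; the nickname list here is a stand-in for the app's own account data:

import Intents

// Sketch of an INSearchForAccountsIntent handler that resolves the nickname Siri
// heard against the account names the app knows about.
class AccountsIntentHandler: NSObject, INSearchForAccountsIntentHandling {
    let knownNicknames = ["abc checking", "abc savings"] // placeholder data

    func resolveAccountNickname(for intent: INSearchForAccountsIntent,
                                with completion: @escaping (INSpeakableStringResolutionResult) -> Void) {
        guard let nickname = intent.accountNickname, !nickname.spokenPhrase.isEmpty else {
            completion(.needsValue())
            return
        }
        if knownNicknames.contains(nickname.spokenPhrase.lowercased()) {
            completion(.success(with: nickname))
        } else {
            completion(.needsValue())
        }
    }

    func handle(intent: INSearchForAccountsIntent,
                completion: @escaping (INSearchForAccountsIntentResponse) -> Void) {
        // Real code would look up the account and attach payment account details.
        completion(INSearchForAccountsIntentResponse(code: .success, userActivity: nil))
    }
}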
Post not yet marked as solved
81 Views

Questions related to Siri Shortcuts

I am adding Siri Shortcuts to my navigation app. My understanding is that, to trigger shortcuts by voice, users have to add voice phrases for custom intents via the Add to Siri button or the built-in Shortcuts app; for system intents, users don't need to do that because Siri already knows the trigger phrases. But when I say "Navigate to the station using Google Maps", the whole shortcut works without adding it to Siri manually. And based on https://developer.apple.com/documentation/sirikit, I couldn't find any system intents for a navigation domain. Did I misunderstand something here? How can Siri and Google Maps exchange intents without system navigation intents or adding to Siri manually?
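
For context, this is roughly the donation step being described for custom intents; "NavigateIntent" is hypothetical, and the suggested phrase is only what the Add to Siri flow offers the user:

import Intents

// Hypothetical custom intent donated after the user navigates somewhere.
let intent = NavigateIntent()
intent.suggestedInvocationPhrase = "Navigate to the station"

INInteraction(intent: intent, response: nil).donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}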
Asked by stonezhl. Last updated.
Post not yet marked as solved
80 Views

Siri not recognizing alternate app name in iOS 15

I am working on a Siri integration for a VoIP application. I've added values for both INAlternativeAppName and INAlternativeAppNamePronunciationHint under INAlternativeAppNames in the app target's Info.plist. On iOS 14, the phrase "Call [number] using [alternate app name]" launches my app and initiates a VoIP call. On iOS 15, Siri responds to the same phrase with "I don't see an app for that. You'll need to download one." Is this functionality broken in iOS 15? Here is INAlternativeAppNames in my Info.plist:

<key>INAlternativeAppNames</key>
<array>
    <dict>
        <key>INAlternativeAppName</key>
        <string>dialer</string>
        <key>INAlternativeAppNamePronunciationHint</key>
        <string>dial er</string>
    </dict>
</array>
Post not yet marked as solved
106 Views

Recommendations on getting app to speak to driver

Hi, I am developing an app that will provide near real-time feedback (e.g., minimum speed in a turn) to drivers during track events. (Track events put a driver on a race track, either for head-to-head racing or to teach the driver to drive fast and safely on a race track.) I want the feedback to be audio (i.e., a voice) rather than a display, but there are so many options available (Siri, CarPlay, notifications, etc.) that I don't know which might be best. Right now, I'd just like the app to announce the contents of a UILabel. What would you recommend? In the very long term, I'd like the app to respond to the driver's voice (for selecting which metric to feed back). My particular model year of car does not support CarPlay, so I'd have to rule out CarPlay. (Beyond getting the app to talk, I also have to determine the best way to get the spoken messages to the driver: use Bluetooth to connect to my car's audio, use the car's audio input jack, or try to get the audio into the speakers many drivers have in their helmets. But that is a topic for another day...) Thanks for your feedback.
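
A minimal sketch of the simplest option for the immediate goal (speaking a UILabel's contents) using AVSpeechSynthesizer, which plays over the current audio route with no Siri or CarPlay involvement; the class and method names here are placeholders:

import AVFoundation
import UIKit

// Reads the label's current text aloud through the active audio output.
final class FeedbackSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(contentsOf label: UILabel) {
        guard let text = label.text, !text.isEmpty else { return }
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}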
Asked by tlaritz. Last updated.
Post not yet marked as solved
129 Views

Apple pay integration in siri shortcut

I have developed a Siri shortcut where a user can order coffee using Siri voice commands. This shortcut asks the user which payment method to use while ordering the coffee. Is it possible to integrate support for Apple Pay when ordering coffee through the Siri shortcut? I've been reading the documentation, but I couldn't find any support for implementing Apple Pay in a Siri shortcut. I want to know if it's doable; if so, please provide some help or sample code to achieve it, and if it's not doable, I'd like to have something to show my employer.
Post not yet marked as solved
787 Views

Attach process failed when trying to run intents extension via Xcode

Issue Summary

Hi all, I'm working on an Intents Extension for my app. However, when I try to run an intent, Xcode pops up the following error:

Could not attach to pid: "965"
attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)

(An image of the error was attached to the original post.)

This only happens when I try debugging the Intents Extension. Running the main app target or another extension target (e.g. notifications) doesn't produce this error.

Build Setup

Here are the details of my build setup:
Mac mini M1
Xcode 13
Building to iPhone 11 Pro Max, iOS 15.0.2. I've also tried building to my iPad Pro 12.9 with iOS 15.1 and hit the same issue.

Things I've tried:
Made sure "Debug executable" is unchecked in the scheme.
Changed the Launch setting to "Automatic" and to "Wait for the executable to be launched".
Ran sudo DevToolsSecurity -enable on my Mac.
Rebooted the iPhone devices and the Mac mini.
Uninstalled / reinstalled the app.
Deleted derived data.
Removed / reinstalled the development certs in my keychain. This actually seemed to work initially, but then the problem came back and now it doesn't work anymore.

Console Logs

I've looked at the console logs while this error occurs to see if they can shed light on the issue. Here are the ones that seemed notable to me. These logs seem to show that Siri is trying to save / write to a file that it does not have access to:

error 11:42:38.341470-0800 kernel System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference
error 11:42:38.342204-0800 assistantd failed to save contact runtime data. error=Error Domain=NSCocoaErrorDomain Code=512 "The file “com.apple.siri.inference” couldn’t be saved in the folder “Library”." UserInfo={NSFilePath=/var/mobile/Library/com.apple.siri.inference, NSUnderlyingError=0x100fb03a0 {Error Domain=NSPOSIXErrorDomain Code=5 "Input/output error"}}
error 11:42:38.342403-0800 assistantd InferenceError<errorId=crSaveToRunTimeDBFailed file=/Library/Caches/com.apple.xbs/Sources/SiriInference/SiriInference-3100.49.3.1.2/SiriInference/SiriInference/ContactResolver/ContactResolver.swift function=logRunTimeData(runTimeData:config:) line=378 msg=>
error 11:42:38.465702-0800 kernel 1 duplicate report for System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference

Looking for "debugserver" entries, as the error suggests, shows these logs:

default 11:42:44.814362-0800 debugserver error: [LaunchAttach] MachTask::TaskPortForProcessID task_for_pid(965) failed: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure)
default 11:42:44.814476-0800 debugserver 10 +0.011525 sec [03c6/0103]: error: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) err = ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) (0x00000005)
default 11:42:44.825704-0800 debugserver error: MachTask::StartExceptionThread (): task invalid, exception thread start failed.
default 11:42:44.825918-0800 debugserver error: [LaunchAttach] END (966) MachProcess::AttachForDebug failed to start exception thread attaching to pid 965: unable to start the exception thread
default 11:42:44.826025-0800 debugserver error: Attach failed
default 11:42:44.828923-0800 debugserver error: Attach failed: "Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied."

I've also attached the full details of the error below via a text file if it helps. Any help with this issue would be great, and I'm happy to provide more information if needed. Thanks in advance!

Xcode Attach Full Error Details
Post not yet marked as solved
100 Views

Customize INSendMessageIntent preview in Siri

Hi everyone, I'm implementing a "Send message" feature with Siri using INSendMessageIntentHandling for a messaging application, and I'm using speakableGroupName to help me distinguish between different kinds of contacts. When SiriKit displays the message to send, it shows a preview (screenshot in the original post). I would like to know if there is a way to customize what is displayed in the "To" field: ideally, I would like to display the list of recipients (as when speakableGroupName is nil) instead of the speakableGroupName itself. I considered using an Intent UI extension, but this does not seem relevant in this case (as far as I understand it, an Intent UI extension lets you customize the view presented after the execution of handle, not the one before). Thank you in advance for your help!
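
For reference, a small sketch of the behavior described, with placeholder handles and names: leaving speakableGroupName nil makes the preview list the individual recipients, while supplying it shows the group name instead:

import Intents

// Placeholder recipients.
let alice = INPerson(personHandle: INPersonHandle(value: "alice@example.com", type: .emailAddress),
                     nameComponents: nil, displayName: "Alice",
                     image: nil, contactIdentifier: nil, customIdentifier: nil)
let bob = INPerson(personHandle: INPersonHandle(value: "bob@example.com", type: .emailAddress),
                   nameComponents: nil, displayName: "Bob",
                   image: nil, contactIdentifier: nil, customIdentifier: nil)

// With speakableGroupName nil, the "To" field lists the recipients.
let withoutGroupName = INSendMessageIntent(recipients: [alice, bob],
                                           outgoingMessageType: .outgoingMessageText,
                                           content: "Hello", speakableGroupName: nil,
                                           conversationIdentifier: "c1", serviceName: nil,
                                           sender: nil, attachments: nil)

// With speakableGroupName set, the group name is shown instead.
let withGroupName = INSendMessageIntent(recipients: [alice, bob],
                                        outgoingMessageType: .outgoingMessageText,
                                        content: "Hello",
                                        speakableGroupName: INSpeakableString(spokenPhrase: "Work friends"),
                                        conversationIdentifier: "c1", serviceName: nil,
                                        sender: nil, attachments: nil)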
Post not yet marked as solved
1.4k Views

INSendMessageIntent suggestions always show "1 person", even for multiple recipients

I'm trying to donate an INSendMessageIntent for multiple recipients in a group. No matter what I do, the share sheet shows "1 Person" and a maximum of 1 profile image. How can I get it to show the correct number of people, and all of the profile images I set? Example screenshot: i.stack.imgur.com/VRf5i.png

Sample code follows.

Click "Donate Group Without Images" => the share sheet suggestion shows 3 bubbles but says "1 Person".
Click "Donate Group With Individual Images" => the share sheet suggestion shows the first person's image (the letter A), not all 3 images, and still says "1 Person".
Click "Donate Group with One Image" (using the undocumented setImage:forParameterNamed:) => the share sheet suggestion correctly shows the single group image, but still says "1 Person".

Steps to reproduce:
Create a new SwiftUI app with the below code as ContentView.swift.
Edit Info.plist to include NSUserActivityTypes: ["INSendMessageIntent"].
Add a share extension target; edit the target to add INSendMessageIntent under Supported Intents.
Run the app on a device. (Share sheet suggestions don't seem to work in the simulator.)
Try clicking the buttons to donate intents, and then clicking the share button to activate the share sheet.

ContentView.swift:

import SwiftUI
import Intents
import IntentsUI

struct ActivityVC: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> some UIViewController {
        return UIActivityViewController(activityItems: [UIImage(systemName: "photo")!], applicationActivities: nil)
    }
    func updateUIViewController(_ uiViewController: UIViewControllerType, context: Context) {}
}

struct ContentView: View {
    @State var showActivityVC = false

    var body: some View {
        VStack(spacing: 24) {
            Button("Donate Single Without Image") {
                let person1 = INPerson(personHandle: INPersonHandle(value: "Aid", type: .unknown), nameComponents: nil, displayName: "A", image: nil, contactIdentifier: nil, customIdentifier: "Aid")

                let intent = INSendMessageIntent(recipients: [person1], outgoingMessageType: .outgoingMessageText, content: nil, speakableGroupName: INSpeakableString(spokenPhrase: "Single"), conversationIdentifier: "1", serviceName: nil, sender: nil, attachments: nil)

                INInteraction(intent: intent, response: nil).donate { (err) in
                    print("Donated single without image: \(err as Any)")
                }
            }
            Button("Donate Group Without Images") {
                let person1 = INPerson(personHandle: INPersonHandle(value: "Aid", type: .unknown), nameComponents: nil, displayName: "A", image: nil, contactIdentifier: nil, customIdentifier: "Aid")
                let person2 = INPerson(personHandle: INPersonHandle(value: "Bid", type: .unknown), nameComponents: nil, displayName: "B", image: nil, contactIdentifier: nil, customIdentifier: "Bid")
                let person3 = INPerson(personHandle: INPersonHandle(value: "Cid", type: .unknown), nameComponents: nil, displayName: "B", image: nil, contactIdentifier: nil, customIdentifier: "Cid")

                let intent = INSendMessageIntent(recipients: [person1, person2, person3], outgoingMessageType: .outgoingMessageText, content: nil, speakableGroupName: INSpeakableString(spokenPhrase: "NoImages"), conversationIdentifier: "2", serviceName: nil, sender: nil, attachments: nil)

                INInteraction(intent: intent, response: nil).donate { (err) in
                    print("Donated group without images: \(err as Any)")
                }
            }
            Button("Donate Group With Individual Images") {
                let person1 = INPerson(personHandle: INPersonHandle(value: "Aid", type: .unknown), nameComponents: nil, displayName: "A", image: INImage(uiImage: UIImage(systemName: "a.circle.fill")!), contactIdentifier: nil, customIdentifier: "Aid")
                let person2 = INPerson(personHandle: INPersonHandle(value: "Bid", type: .unknown), nameComponents: nil, displayName: "B", image: INImage(uiImage: UIImage(systemName: "b.circle.fill")!), contactIdentifier: nil, customIdentifier: "Bid")
                let person3 = INPerson(personHandle: INPersonHandle(value: "Cid", type: .unknown), nameComponents: nil, displayName: "C", image: INImage(uiImage: UIImage(systemName: "c.circle.fill")!), contactIdentifier: nil, customIdentifier: "Cid")

                let intent = INSendMessageIntent(recipients: [person1, person2, person3], outgoingMessageType: .outgoingMessageText, content: nil, speakableGroupName: INSpeakableString(spokenPhrase: "SeparateImages"), conversationIdentifier: "3", serviceName: nil, sender: nil, attachments: nil)

                INInteraction(intent: intent, response: nil).donate { (err) in
                    print("Donated group with individual images: \(err as Any)")
                }
            }
            Button("Donate Group with One Image") {
                let person1 = INPerson(personHandle: INPersonHandle(value: "Aid", type: .unknown), nameComponents: nil, displayName: "A", image: nil, contactIdentifier: nil, customIdentifier: "Aid")
                let person2 = INPerson(personHandle: INPersonHandle(value: "Bid", type: .unknown), nameComponents: nil, displayName: "B", image: nil, contactIdentifier: nil, customIdentifier: "Bid")

                let intent = INSendMessageIntent(recipients: [person1, person2], outgoingMessageType: .outgoingMessageText, content: nil, speakableGroupName: INSpeakableString(spokenPhrase: "OneGroupImage"), conversationIdentifier: "4", serviceName: nil, sender: nil, attachments: nil)

                // This "forParameterNamed: \.speakableGroupName" is totally undocumented, but follows the example from: https://developer.apple.com/documentation/foundation/app_extension_support/supporting_suggestions_in_your_app_s_share_extension
                intent.setImage(INImage(uiImage: UIImage(systemName: "g.circle.fill")!), forParameterNamed: \.speakableGroupName)

                INInteraction(intent: intent, response: nil).donate { (err) in
                    print("Donated group with one image: \(err as Any)")
                }
            }
            Button("Delete All") {
                INInteraction.deleteAll { (err) in
                    print("Deleted: \(err as Any)")
                }
            }

            Spacer().frame(height: 24)

            Button(action: { showActivityVC = true }) {
                Image(systemName: "square.and.arrow.up")
            }
        }
        .sheet(isPresented: $showActivityVC) {
            ActivityVC()
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
Post not yet marked as solved
184 Views

Running Siri Tests on Xcode Cloud?

When I run Siri tests, the very first run requires I tap a button in the simulator to allow the test to use Siri on the device. Is there any way to automatically allow that so I can run completely headless tests?
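
One hedged workaround sketch (it may or may not catch the Siri prompt, depending on OS version): register a UI-interruption monitor in the XCTest run so a permission alert presented during the test gets dismissed automatically:

import XCTest

final class SiriPermissionTests: XCTestCase {
    func testWithSiri() {
        // Tap "OK" on any alert that interrupts the test, if such a button exists.
        addUIInterruptionMonitor(withDescription: "Siri permission alert") { alert -> Bool in
            let okButton = alert.buttons["OK"]
            if okButton.exists {
                okButton.tap()
                return true
            }
            return false
        }

        let app = XCUIApplication()
        app.launch()
        app.tap() // interruption monitors only fire after an interaction
        // ... exercise the Siri flow here ...
    }
}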
Asked by jbuckner. Last updated.
Post not yet marked as solved
257 Views

Siri Intent get user location

I tried to use a location manager (CLLocationManager) in a custom intent handler so I can make some calculations based on latitude and longitude, but the didUpdateLocations method is never called, and neither is didFailWithError. Is it possible to use Location Services in an intent used with Siri?
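
A sketch of a one-shot location request inside a handler, assuming the extension's Info.plist includes NSLocationWhenInUseUsageDescription and the user has already granted the app location access; the key detail is keeping the manager and its delegate alive until the callback fires:

import Intents
import CoreLocation

// Holds a strong reference to CLLocationManager so the delegate callbacks can arrive.
class LocationAwareIntentHandler: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private var completion: ((CLLocation?) -> Void)?

    func requestLocation(_ completion: @escaping (CLLocation?) -> Void) {
        self.completion = completion
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
        manager.requestLocation() // one-shot request; delegate must be set first
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        completion?(locations.last)
        completion = nil
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        completion?(nil)
        completion = nil
    }
}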
Post not yet marked as solved
2k Views

UIPasteboard.general no longer accessible from INExtension?

Hello everyone! I have this code in my INExtension: UIPasteboard.general.items.removeAll() Up through iOS 14.5, this worked correctly. However, in 14.5.1 and 14.6 beta 2, the following error message appears in the console, and the system pasteboard remains unchanged. Could not save pasteboard named com.apple.UIKit.pboard.general. Error: Error Domain=PBErrorDomain Code=11 "The pasteboard name com.apple.UIKit.pboard.general is not valid." UserInfo={NSLocalizedDescription=The pasteboard name com.apple.UIKit.pboard.general is not valid.} I filed FB9098625. Is anyone else having this issue?
Asked by barnard-b. Last updated.
Post not yet marked as solved
231 Views

Why does the Siri Suggestions row in the Photos share sheet show at most 20 people for a group?

I'm developing an IM app that supports the Siri Suggestions feature. I donate an INSendMessageIntent when a user sends a message in a group, and I build the INPerson array from the users in that group. But the Photos share sheet only shows a maximum of 20 people, even though the INPerson array I set has 56 entries. Can anyone explain why the number of group users shown is limited, or whether I'm doing something wrong? I circled the key point in the attached picture in red; in the second and the fourth chats, the group has more than 20 users.
Asked by lcrystal. Last updated.
Post not yet marked as solved
234 Views

How can an Apple Watch app action be run by an automation?

I would like to allow users to run an Apple Watch app action by using automations on the iPhone; the action can only be performed on the watch, not in the companion iPhone app. I noticed that it's possible to start a workout in the default Apple Watch workout app with automations, and also possible to start a workout in the third-party Dawn Patrol app using automations. Is it possible to do this with custom intents or NSUserActivities, or only with the system start-workout intent?