Handle requests for your app’s services from users using Siri or Maps.

SiriKit Documentation

Posts under SiriKit tag

70 Posts
Post not yet marked as solved
1 Reply
94 Views
Is it possible to put a pause in Siri's responses to an AppIntent? In the code below, I'd like Siri to pause a bit longer than she does between sentence one and sentence two. I've tried using "..." and multiple periods, similar to what she does when responding to "how is the market" on a HomePod: she pauses a bit longer between her comments on each market, a bit more than an end-of-sentence pause, more like the pause between points when going through a list of bullet points, which is really what I'm using this for. Also, is it possible to hide all or part of the dialog text displayed, so she says it but doesn't show it? I've got a view which shows a better-formatted representation of what she says than the text itself.

```swift
struct TestAppIntent: AppIntent {
    static var title: LocalizedStringResource = "A question?"
    static var openAppWhenRun: Bool = false

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result(
            dialog: "sentence one. sentence two", view: someView()
        )
    }
}
```
Posted
by ngb.
Last updated
.
Post not yet marked as solved
1 Reply
288 Views
I want to add a feature where a user can type a request in natural language to trigger different actions within my app (e.g., "Create an assignment due tomorrow at midnight", "Create a high priority assignment called Final Presentation"). I'd like to leverage SiriKit to process this text, parse the data, and handle the response as I would if the user had asked Siri. First, is this even possible? Second, if not, what other technologies could I use to parse natural language text?
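Editor's note: SiriKit only handles requests routed through Siri itself, so for text typed inside the app, Apple's NaturalLanguage framework (NLTagger) or Foundation's NSDataDetector are the usual starting points for tagging and date extraction. As a framework-free illustration of the parsing step (all names here are hypothetical, not from the post), a naive keyword parser for the examples above might look like:

```swift
import Foundation

// Illustrative only: a naive keyword parser for commands like
// "Create a high priority assignment called Final Presentation".
struct ParsedCommand {
    var isHighPriority = false
    var title: String?
}

func parse(_ input: String) -> ParsedCommand {
    var command = ParsedCommand()
    command.isHighPriority = input.lowercased().contains("high priority")
    // Everything after "called " is treated as the assignment title.
    if let range = input.range(of: "called ", options: .caseInsensitive) {
        command.title = String(input[range.upperBound...])
    }
    return command
}
```

A real implementation would lean on NSDataDetector for phrases like "tomorrow at midnight" rather than hand-rolled keywords.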
Posted
by Jspr1776.
Last updated
.
Post marked as solved
3 Replies
110 Views
Hi, I have created my own custom SiriKit Intent, and in the handler I am returning the response code of .continueInApp which opens my app in response. I've worked out that I want to open my app in the background, and .handleInApp seems to be the correct response code to do this. However, .handleInApp is not an option in my intent's enum generated by Xcode. If I look at the auto-generated code, I see .continueInApp, and .success, etc, but no .handleInApp. My Deployment Target is set to iOS 15.5 everywhere that I can find, so I really can't figure out why .handleInApp is not included in the auto-generated code. I've even tried creating a brand new workspace and project totally separate from my main one, and still creating a SiriKit Intent Definition does not generate code that includes .handleInApp. Is there something I need to enable to make .handleInApp appear as an enum option?
Posted
by moontiger.
Last updated
.
Post not yet marked as solved
0 Replies
94 Views
In my VoIP application, users can start a VoIP call with Siri. In my app extension, I have a class conforming to INStartCallIntentHandling, which resolves the contacts, call destination, and call capability of the intent. When the user attempts to start a VoIP call while the device is locked, my app extension is called, and Siri launches my app, starting the VoIP call while the device is still locked. According to the documentation, adding INStartCallIntent to the IntentsRestrictedWhileProtectedDataUnavailable key of my extension's Info.plist should require the user to unlock the device before Siri launches my app and passes it the user activity. This is not working, and I haven't found any way to enforce this behavior. I downloaded Google Voice and WhatsApp, and they require the user to unlock the device when starting a VoIP call with Siri from a locked device. Is there something more I need to do? Here is my Info.plist:

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>IntentsRestrictedWhileLocked</key>
        <array/>
        <key>IntentsRestrictedWhileProtectedDataUnavailable</key>
        <array>
            <string>OpenDialerTabAudioIntent</string>
            <string>OpenDialerTabVideoIntent</string>
            <string>OpenFaxTabIntent</string>
            <string>OpenNewsfeedTabIntent</string>
            <string>OpenSearchTabIntent</string>
            <string>INStartCallIntent</string>
        </array>
        <key>IntentsSupported</key>
        <array>
            <string>OpenDialerTabAudioIntent</string>
            <string>OpenDialerTabVideoIntent</string>
            <string>OpenFaxTabIntent</string>
            <string>OpenNewsfeedTabIntent</string>
            <string>OpenSearchTabIntent</string>
            <string>INStartCallIntent</string>
        </array>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.intents-service</string>
    <key>NSExtensionPrincipalClass</key>
    <string>$(PRODUCT_MODULE_NAME).IntentHandler</string>
</dict>
```

I've read that WhatsApp isn't using CallKit, but I don't know if that is true. Any help would be appreciated.
Posted Last updated
.
Post not yet marked as solved
0 Replies
98 Views
INIntent issue: in my main app, after I send a message (key code):

```swift
let inImage = INImage(imageData: imageData)
let person = INPerson(
    personHandle: INPersonHandle(value: personId, type: .emailAddress),
    nameComponents: nil,
    displayName: personName,
    image: inImage,
    contactIdentifier: nil,
    customIdentifier: String(chatid)
)
let smi = INSendMessageIntent(
    recipients: persons,
    outgoingMessageType: .outgoingMessageText,
    content: nil,
    speakableGroupName: groupName,
    conversationIdentifier: "\(chatid)",
    serviceName: nil,
    sender: nil,
    attachments: nil
)
smi.setImage(inImage, forParameterNamed: \.speakableGroupName)
let interaction = INInteraction(intent: smi, response: nil)
interaction.groupIdentifier = String(chatid)
interaction.donate { error in }
```

I get the INImage instance in my share extension with:

```swift
if let intent = extensionContext?.intent as? INSendMessageIntent,
   let avatar = intent.keyImage() {
    // ...
}
```

When I call the INImage.fetchUIImage method:

```swift
func loadImageBy(_ inImage: INImage) {
    imgView.image = nil
    loadImageIdentifier = UUID()
    inImage.fetchUIImage { [weak self, loadImageIdentifier] image in
        guard self?.loadImageIdentifier == loadImageIdentifier else { return }
        self?.imgView.image = image
    }
}
```

I get this error:

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[INRemoteImageProxy fetchUIImageWithCompletion:]: unrecognized selector sent to instance 0x2810969e0'
Posted
by lcrystal.
Last updated
.
Post not yet marked as solved
0 Replies
99 Views
Hi, I have a question regarding the integration of the speech-to-text library SFSpeechRecognizer. I need SFSpeechRecognizer to recognize terms that are not present in the iOS dictionary, like medication names, chemistry terms, etc. I would have to add them, somehow, for SFSpeechRecognizer to be able to recognize them. Is this possible? Thanks
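Editor's note: SFSpeechRecognizer does offer a hint mechanism for this: the contextualStrings property on SFSpeechRecognitionRequest biases recognition toward supplied out-of-vocabulary phrases. A minimal sketch (iOS-only, the term list and file URL are placeholders):

```swift
import Speech

// Sketch: bias the recognizer toward domain terms it would otherwise miss.
// `contextualStrings` is a hint, not a guarantee; keep the list short and
// focused on terms likely to occur in the audio.
func makeRequest(audioURL: URL) -> SFSpeechURLRecognitionRequest {
    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    request.contextualStrings = ["metformin", "ibuprofen", "acetylcholine"]
    return request
}
```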
Posted
by bcm1.
Last updated
.
Post not yet marked as solved
0 Replies
134 Views
I've been watching the WWDC videos on the new App Intents framework in iOS. It definitely looks like a nicer API than the Siri Intents framework. However, what's not clear to me is whether there are any user-facing improvements or increased discoverability over the existing Siri Intents framework. I'm already indexing my Shortcuts manually whenever the app is opened, which seems to be one of the headline features of App Intents. I've got an app that uses Siri Intents quite extensively, and I don't really want to have two implementations side by side if there's no tangible benefit. I'd rather just leave the code as is until I can drop iOS 15.
Posted Last updated
.
Post not yet marked as solved
7 Replies
2.0k Views
I had to create a separate thread for the problem I'm facing with WidgetKit. Environment: Xcode 12.0.1, iOS 14.0, app targeting iOS 10, widget targeting iOS 14, Intents Extension targeting iOS 10.

• I have created an Intents Extension.
• Created an Intents Definition file in the Widget target and added it to all three targets (app, widget, intents extension).
• Declared conformance to the intent-handling protocol in IntentHandler (Intents Extension).
• Set up an Intent Timeline Provider in the Widget target.
• Added Siri to the app capabilities.

If I go to Edit Widget and tap on a dynamic option, it says: "No options were provided for this parameter." The Intents Extension provides data, but I'm not sure how iOS wires the Intents Extension to the widget. From what I can see, my code inside IntentsHandler.swift is never called.
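Editor's note: a common cause of "No options were provided for this parameter" is that the system can't route the options request to a conforming handler. A sketch of the expected wiring (the intent name SelectItemIntent, its generated SelectItemIntentHandling protocol, and the Item type are hypothetical stand-ins for the poster's generated code):

```swift
import Intents

// Hypothetical generated types: SelectItemIntent / SelectItemIntentHandling / Item.
class IntentHandler: INExtension, SelectItemIntentHandling {

    // The widget editor asks the extension for a handler for its intent.
    // Returning `self` only works if this class conforms to the generated protocol.
    override func handler(for intent: INIntent) -> Any {
        return self
    }

    // Called when the user taps the dynamic parameter in the widget editor.
    func provideItemOptionsCollection(
        for intent: SelectItemIntent,
        with completion: @escaping (INObjectCollection<Item>?, Error?) -> Void
    ) {
        let items = [Item(identifier: "1", display: "First"),
                     Item(identifier: "2", display: "Second")]
        completion(INObjectCollection(items: items), nil)
    }
}
```

Also worth checking: the .intentdefinition file's intent must be listed under the Intents Extension target's Supported Intents, and the parameter must have "Options are provided dynamically" enabled.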
Posted
by Jauzee.
Last updated
.
Post not yet marked as solved
0 Replies
136 Views
Hello everyone, we have a project that is two years old, and we are trying to implement SiriKit's payment intents. But every time we ask Siri to send a payment, Siri replies that our app isn't configured to do so yet. We tried the same implementation in a new project and it worked fine; Siri sent payments successfully. What are we doing wrong? We contacted Apple code-level technical support, but they are asking for a simplified version of our code that reproduces the issue, and the problem is that we want to know why it doesn't work within our current app. We have seen in forums that many people have experienced the same issue on existing projects, but no solution has been found. Thanks for any hints.
Posted
by Djipsy6.
Last updated
.
Post not yet marked as solved
0 Replies
144 Views
My app donates a shortcut; the shortcut type is NSUserActivity, and everything works fine on iOS. My app also has a watch app, and the watch app has been installed on my watch. When I run this shortcut on Apple Watch, I am prompted that the app is not installed, but I am 100% sure the watch has this app installed. How do I solve this problem?
Posted
by wlixcc.
Last updated
.
Post not yet marked as solved
0 Replies
134 Views
Hello, I'm trying to add Siri support to a video-conferencing application. To start out, I would just like Siri to respond to a phrase like "Hey Siri, mute my {app's name} video" or "Hey Siri, mute my {app's name} audio", and the intent handler can just kick off an IBAction that mutes the video or audio. Do I need to create custom intents for this purpose, or is there a standard intent that may work here?
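Editor's note: there is no built-in SiriKit domain intent for muting in-app audio/video, so this needs a custom intent. On iOS 16+, one option is the App Intents framework, which registers app-specific phrases without any Add-to-Siri step. A sketch (MuteVideoIntent and CallController are hypothetical names, not a real API):

```swift
import AppIntents

// Hypothetical intent: turns off the local video feed.
struct MuteVideoIntent: AppIntent {
    static var title: LocalizedStringResource = "Mute Video"
    static var openAppWhenRun: Bool = false // run without foregrounding the app

    @MainActor
    func perform() async throws -> some IntentResult {
        // CallController is a stand-in for the app's own call-management object.
        CallController.shared.setVideoMuted(true)
        return .result()
    }
}

// Registers the phrase so "Hey Siri, mute my <app name> video" works out of the box.
struct AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: MuteVideoIntent(),
                    phrases: ["Mute my \(.applicationName) video"])
    }
}
```

On earlier iOS versions, the equivalent is a custom intent defined in an .intentdefinition file, which users must add to Siri themselves.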
Posted
by Resonance.
Last updated
.
Post not yet marked as solved
0 Replies
152 Views
Dears, I want to create a Siri intent that calls an API. I would like it to call the API without opening the app. Is that possible? If so, how?
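Editor's note: yes, this is possible. On iOS 16+, an App Intent with openAppWhenRun set to false runs in the background; a sketch (the endpoint URL and intent name are placeholders):

```swift
import AppIntents
import Foundation

// Sketch: an intent that performs a network request without launching the app.
struct RefreshDataIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh Data"
    static var openAppWhenRun: Bool = false // stay in the background

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Placeholder endpoint; errors propagate to Siri automatically.
        let url = URL(string: "https://example.com/api/refresh")!
        let (data, _) = try await URLSession.shared.data(from: url)
        return .result(dialog: "Fetched \(data.count) bytes.")
    }
}
```

Before iOS 16, the same effect is achieved with a custom SiriKit intent handled entirely inside an Intents app extension.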
Posted
by Ezziddin.
Last updated
.
Post not yet marked as solved
0 Replies
153 Views
How do I phrase a Siri query for INSearchForAccountsIntent so that I get a value in accountNickname in the intent? I have tried different query phrasings, but accountNickname alone is not being recognized. Example: "How much money is in my abc checking account?" Note: abc is the account nickname, but Siri does not map it to accountNickname. Kindly suggest.
Posted Last updated
.
Post not yet marked as solved
0 Replies
166 Views
I am adding Siri Shortcuts to my navigation app. My understanding was that, in order to trigger shortcuts by voice, users have to add voice phrases for custom intents via the Add to Siri button or the built-in Shortcuts app, while for system intents users don't need to do that because Siri already knows all the trigger phrases. But when I say "Navigate to the station using Google Maps", the whole shortcut works without adding it to Siri manually. And based on https://developer.apple.com/documentation/sirikit, I couldn't find any system intents related to a navigation domain. Did I misunderstand something? How can Siri and Google Maps exchange intents here without system navigation intents or adding to Siri manually?
Posted
by stonezhl.
Last updated
.
Post not yet marked as solved
0 Replies
165 Views
I am working on a Siri integration for a VoIP application. I've added values for both INAlternativeAppName and INAlternativeAppNamePronunciationHint under INAlternativeAppNames in the app target's Info.plist. On iOS 14, the phrase "Call [number] using [alternate app name]" launches my app and initiates a VoIP call. On iOS 15, Siri responds to the same phrase with "I don't see an app for that. You'll need to download one." Is this functionality broken in iOS 15? Here is INAlternativeAppNames in my Info.plist:

```xml
<key>INAlternativeAppNames</key>
<array>
    <dict>
        <key>INAlternativeAppName</key>
        <string>dialer</string>
        <key>INAlternativeAppNamePronunciationHint</key>
        <string>dial er</string>
    </dict>
</array>
```
Posted Last updated
.
Post not yet marked as solved
1 Reply
208 Views
Hi, I am developing an app which will provide near-real-time feedback (e.g., minimum speed in a turn) to drivers during track events. (Track events put a driver on a race track either for head-to-head racing or to teach the driver to drive fast and safely on a race track.) I want the feedback to be audio (i.e., a voice) rather than a display. But there are so many options available (Siri, CarPlay, notifications, etc.) that I don't know what might be best. Right now, I'd just like the app to announce the contents of a UILabel. What would be your recommendations? In the very long term, I'd like the app to respond to the driver's voice (for selecting which metric to feed back). My particular model year of car does not support CarPlay, so I'd have to rule out CarPlay. (Beyond getting the app to talk, I also have to determine the best way to get the spoken messages to the driver: do I use Bluetooth to connect to my car's audio, use the car's auxiliary audio input, or try to get the audio into the speakers many drivers have in their helmets? But that is a topic for another day...) Thanks for your feedback.
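Editor's note: for simply speaking the contents of a UILabel, AVSpeechSynthesizer in AVFoundation is the lightest-weight option; no Siri or CarPlay involvement is needed. A minimal sketch (the Announcer name and usage line are illustrative):

```swift
import AVFoundation

// Sketch: speak arbitrary text with the built-in text-to-speech engine.
// Keep the synthesizer alive for the duration of speech (e.g. as a property);
// a locally scoped synthesizer may be deallocated before it finishes.
final class Announcer {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage: announcer.announce(speedLabel.text ?? "")
```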
Posted
by tlaritz.
Last updated
.
Post not yet marked as solved
0 Replies
213 Views
I have developed a Siri shortcut where a user can order coffee using Siri voice commands. The shortcut asks the user which payment method to use while ordering the coffee. Is it possible to integrate Apple Pay support into this Siri shortcut? I've been reading the documentation, but I couldn't find anything on implementing Apple Pay in a Siri shortcut. I want to know whether it's doable; if it is, please provide some guidance or sample code, and if it isn't, I'd like something I can show my employer.
Posted Last updated
.