Handle user requests for your app’s services made through Siri or Maps.

Posts under SiriKit tag

38 Posts

Post

Replies

Boosts

Views

Activity

Can an Apple Watch SiriKit extension and the iPhone app communicate via the Watch Connectivity framework?
Using SiriKit's car command intents (INGetCarLockStatusIntent, INSetCarLockStatusIntent), we are developing a SiriKit extension that responds to commands from Apple Watch to lock and unlock the car doors. However, we do not know how to implement the communication between the car and the Apple Watch. In particular, we do not know whether inter-device communication is possible via the Watch Connectivity framework. From Apple's documentation we know that a direct BLE connection is not supported:
Not possible: [Watch SiriKit Extension] --- (BLE) --- [Car]
We also know from those documents that a Wi-Fi connection is supported:
OK: [Watch SiriKit Extension] --- (HTTP connection) --- [Car]
However, we do not know whether the extension can talk to the iPhone companion application via the Watch Connectivity framework:
Unknown: [Watch SiriKit Extension] --- (Watch Connectivity framework) --- [iPhone] --- (BLE) --- [Car]
We would like to know whether Wi-Fi is the only option, or whether the Watch Connectivity framework can also be used.
2
0
1.2k
Aug ’23
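For the Watch Connectivity question above, here is a minimal sketch of what the watch-to-iPhone hop could look like with WCSession. Whether a watch-side SiriKit intent handler is actually allowed to use Watch Connectivity is exactly the open question in the post, so treat this only as an illustration of the API shape; PhoneRelay and the "lock" message key are hypothetical names.

```swift
import WatchConnectivity

// Hypothetical helper a watch-side handler might use to relay a lock/unlock
// request to the paired iPhone, which would then talk to the car over BLE.
final class PhoneRelay: NSObject, WCSessionDelegate {
    static let shared = PhoneRelay()

    func activate() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    // Ask the iPhone app to change the lock state and report the result back.
    func requestLock(_ locked: Bool, completion: @escaping (Bool) -> Void) {
        guard WCSession.default.isReachable else {
            // iPhone not reachable; a fallback could be the documented HTTP path.
            completion(false)
            return
        }
        WCSession.default.sendMessage(["lock": locked], replyHandler: { reply in
            completion(reply["success"] as? Bool ?? false)
        }, errorHandler: { _ in
            completion(false)
        })
    }

    // Required delegate method on watchOS.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
}
```

On the iPhone side, the companion app's WCSessionDelegate would receive the message, forward the command to the car over BLE, and reply with the result.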
Siri custom intent and watchOS
Hi there, we're working on a companion watchOS app for our main application. The main app already has Siri extension support and it works quite well: we are able to disambiguate our requests and let the INIntent open the main app once we have a correctly identified request. We saw that watchOS also supports Siri and Siri extensions in a stand-alone way. Our idea was to port most of the work done for iOS over to watchOS.
- Let's assume the working iOS Siri extension is called SiriExtension. It handles the Siri interaction from the iPhone quite well and allows us to open our main app at the specified task and/or do some work in the background. No issues here.
- Let's also assume the watchOS Siri extension we are working on is called SiriWatchExtension. It should replicate the work done by its counterpart on iOS, but on watchOS and without direct interaction with the iOS device: just like a stand-alone app.
What's unclear to us is how watchOS and Siri extensions work with respect to their counterparts on iOS. Specifically:
- We all know that, if we exclude some specific special apps, the main interaction with Siri is done with INIntent donations and Shortcuts (pretty much like the SoupChef example).
- We couldn't see a clear way to add Shortcuts from the watchOS app. It seems that the Shortcuts from iOS are "magically" available on watchOS as well, but that is somewhat unidirectional.
- The shortcut that we invoke from watchOS triggers the IntentHandler from iOS, not the one we built into watchOS. Is this intended behaviour? Are we missing something?
- Is there a way to donate an intent from watchOS to Shortcuts and let this intent trigger only when we're talking to our watch device?
- Also, is there a way to complete an action in the watch app rather than the iOS app?
Many thanks for the patience, and sorry again for the very long post. --c.
2
0
1.2k
Aug ’23
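Related to the donation questions in the post above, here is a minimal sketch of donating a custom intent from code running in the watchOS app, assuming the .intentdefinition file is shared with the watch target. MyTaskIntent and its taskName parameter are hypothetical generated names; whether such a donation surfaces only on the watch is the unanswered part of the question.

```swift
import Intents

// MyTaskIntent is a hypothetical class generated from an .intentdefinition
// file shared with the watchOS target; taskName is a hypothetical parameter.
func donateTask(named taskName: String) {
    let intent = MyTaskIntent()
    intent.taskName = taskName
    intent.suggestedInvocationPhrase = "Start my task"

    // Donate the interaction from the watch side.
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```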
(How) can I use AVAudioRecorder with(in) CarPlay?
Hi! Currently I'm experimenting with building an iOS app that allows a motorist (the app user) to do the following:
1. Activate the app by using a Siri Shortcut;
2. Open a specific ViewController within the app;
3. Start a recording session automatically and record the user's speech using AVAudioRecorder;
4. Stop the recording session automatically when the user stops speaking;
5. Upload the recording.
Now, I found out that when the user has their iPhone connected to CarPlay, Siri doesn't allow them to open an app that doesn't have CarPlay support. Following this finding I have two questions:
1. Is there a workaround to activate/open an app with Siri while it doesn't have CarPlay support?
2. Is there a way to use AVAudioRecorder to let the user record their speech within CarPlay?
Thanks! Luc.
1
0
634
Sep ’23
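For the recording step described above, here is a plain AVAudioRecorder sketch, under the assumption that the app is allowed to take the audio session; whether this works while CarPlay is connected is the poster's open question. Stopping automatically when speech ends (for example via metering) is not shown.

```swift
import AVFoundation

// Minimal "record speech to a file" setup for the flow described above.
func startRecording(to url: URL) throws -> AVAudioRecorder {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.record, mode: .spokenAudio)
    try session.setActive(true)

    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.record()
    return recorder
}
```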
"No options were provided for this parameter" in Edit Widget menu
I had to create a separate thread for the problem I'm facing with WidgetKit.
Environment: Xcode 12.0.1, iOS 14.0, app targeting iOS 10, widget targeting iOS 14, Intents Extension targeting iOS 10.
• I have created an Intents Extension.
• Created an Intents Definition file in the widget target and added it to all three targets (app, widget, Intents Extension).
• Declared conformance to the intent handling protocol in IntentHandler (Intents Extension).
• Set up the intent timeline provider in the widget target.
• Added Siri to the app capabilities.
If I go to Edit Widget and tap on a dynamic option, it says: "No options were provided for this parameter." The Intents Extension provides the data, but I'm not sure how iOS wires the Intents Extension and the widget together. From what I see, I'm sure that my code inside IntentsHandler.swift is never called.
12
1
9.6k
Aug ’23
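For the setup described above, the wiring the system expects is roughly: the widget's configuration intent is handled in the Intents Extension, whose handler(for:) returns an object implementing the generated provide…OptionsCollection method. A hedged sketch, with SelectAccountIntent and AccountType as hypothetical names generated from an .intentdefinition file:

```swift
import Intents

// SelectAccountIntent and AccountType are hypothetical types generated from an
// .intentdefinition file that has a dynamic "account" parameter.
class IntentHandler: INExtension, SelectAccountIntentHandling {

    // Called by the system (inside the Intents Extension) when the user taps
    // the dynamic parameter in "Edit Widget".
    func provideAccountOptionsCollection(
        for intent: SelectAccountIntent,
        with completion: @escaping (INObjectCollection<AccountType>?, Error?) -> Void
    ) {
        let accounts = [
            AccountType(identifier: "personal", display: "Personal"),
            AccountType(identifier: "work", display: "Work")
        ]
        completion(INObjectCollection(items: accounts), nil)
    }

    // Returning self routes the widget's configuration intent to this handler.
    override func handler(for intent: INIntent) -> Any {
        return self
    }
}
```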
Supporting In-App SiriKit Media Intents in watchOS
I was trying out SiriKit Media Intents and found that with iOS 14, media intents can be handled in-app instead of in an extension. Given my current project structure, in-app intent handling suits my purpose better than handling it in an extension. My question is about how this intent will be handled in a watchOS app. Is in-app intent handling supported on watchOS as well (and if so, are there any examples I can refer to)? If not, can I create an extension for media intents and use it for watchOS while keeping the in-app handling for iOS alone? Please point me to any documentation or reference I may have missed that solves this problem.
1
0
1k
Nov ’23
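On the iOS side, in-app handling of media intents (iOS 14+) hangs off the app delegate, roughly as sketched below; PlayMediaHandler is a hypothetical type. Whether watchOS offers an equivalent in-app hook is the open question in the post above.

```swift
import UIKit
import Intents

class AppDelegate: UIResponder, UIApplicationDelegate {

    // iOS 14+: return an in-app handler instead of using an Intents Extension.
    func application(_ application: UIApplication,
                     handlerFor intent: INIntent) -> Any? {
        if intent is INPlayMediaIntent {
            return PlayMediaHandler()
        }
        return nil
    }
}

// Hypothetical in-app handler for media playback requests.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Start playback with the app's player here.
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }
}
```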
Attach process failed when trying to run intents extension via Xcode
Issue Summary
Hi all, I'm working on an Intents Extension for my app. However, when I try to run an intent, Xcode pops up the following error:
Could not attach to pid: "965" attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
This only happens when I try debugging the Intents Extension. Running the main app target or another extension target (e.g. notifications) doesn't produce this error.
Build Setup
Here are the details of my build setup:
- Mac mini M1
- Xcode 13
- Building to iPhone 11 Pro Max, iOS 15.0.2. I've also tried building to my iPad Pro 12.9 with iOS 15.1 and hit the same issue.
Things I've tried:
- Made sure "Debug executable" is unchecked in the scheme
- Changed the Launch setting to "Automatic" and "Wait for the executable to be launched"
- Ran sudo DevToolsSecurity -enable on my Mac
- Rebooted the iPhone devices and the Mac mini
- Uninstalled / reinstalled the app
- Deleted derived data
- Removed / reinstalled the development certs in my keychain --> this actually seemed to work initially, but then the problem came back and now it doesn't work anymore.
Console Logs
I've looked at the console logs while this error occurs to see if they can shed light on the issue. Here are the ones that seemed notable to me. These logs seem to show that Siri is trying to save / write to a file that it does not have access to, which seems very suspicious:
error 11:42:38.341470-0800 kernel System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference
error 11:42:38.342204-0800 assistantd failed to save contact runtime data. error=Error Domain=NSCocoaErrorDomain Code=512 "The file “com.apple.siri.inference” couldn’t be saved in the folder “Library”." UserInfo={NSFilePath=/var/mobile/Library/com.apple.siri.inference, NSUnderlyingError=0x100fb03a0 {Error Domain=NSPOSIXErrorDomain Code=5 "Input/output error"}}
error 11:42:38.342403-0800 assistantd InferenceError<errorId=crSaveToRunTimeDBFailed file=/Library/Caches/com.apple.xbs/Sources/SiriInference/SiriInference-3100.49.3.1.2/SiriInference/SiriInference/ContactResolver/ContactResolver.swift function=logRunTimeData(runTimeData:config:) line=378 msg=>
error 11:42:38.465702-0800 kernel 1 duplicate report for System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference
Looking for "debugserver" entries, as the error suggests, shows these logs:
default 11:42:44.814362-0800 debugserver error: [LaunchAttach] MachTask::TaskPortForProcessID task_for_pid(965) failed: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure)
default 11:42:44.814476-0800 debugserver 10 +0.011525 sec [03c6/0103]: error: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) err = ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) (0x00000005)
default 11:42:44.825704-0800 debugserver error: MachTask::StartExceptionThread (): task invalid, exception thread start failed.
default 11:42:44.825918-0800 debugserver error: [LaunchAttach] END (966) MachProcess::AttachForDebug failed to start exception thread attaching to pid 965: unable to start the exception thread
default 11:42:44.826025-0800 debugserver error: Attach failed
default 11:42:44.828923-0800 debugserver error: Attach failed: "Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.".
I've also attached the full details of the error below via a text file if it helps. Any help with this issue would be great, and I'm happy to provide more information if needed. Thanks in advance!
Xcode Attach Full Error Details
6
2
7.7k
Nov ’23
AppShortcuts limit of 10 shortcuts
Hi, according to this WWDC session (https://developer.apple.com/wwdc22/10170):
"App Shortcuts are defined in Swift code, by implementing the AppShortcutsProvider protocol. To implement the protocol, I'll simply create a single getter that returns all the app shortcuts I want to set up for the user. Note that in total, your app can have a maximum of 10 app shortcuts. However, most apps only need a few."
So there is a limit of up to 10 App Shortcuts. Could you please clarify how that limit is handled? 🤔 (e.g. does the project fail to build, does the app crash or malfunction, or are only 10 shortcuts handled, chosen randomly or in order by iOS?) I suppose there is some way to manage the number of shortcuts, but I see no details in the documentation yet.
5
1
2.3k
Dec ’23
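For reference, the provider the WWDC session describes looks roughly like the sketch below (OpenInboxIntent is a hypothetical intent). The documentation states the 10-shortcut maximum but, as the post notes, does not spell out what happens when the builder returns more than 10.

```swift
import AppIntents

// OpenInboxIntent is a hypothetical intent used only to illustrate the shape
// of an AppShortcutsProvider.
struct OpenInboxIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Inbox"

    func perform() async throws -> some IntentResult {
        // Navigate to the inbox here.
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenInboxIntent(),
            phrases: ["Open my inbox in \(.applicationName)"]
        )
        // The documentation allows at most 10 AppShortcut entries here; what
        // happens beyond that is the question raised in the post above.
    }
}
```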
Xcode 14 Automatic signing failed
I have added Siri capability for an iOS/MacCatalyst app in Xcode. The app compiles just fine for iOS, but when compiling for MacCatalyst I get the error: “/Volumes/xdrive/M2/M2.xcodeproj Provisioning profile "Mac Catalyst Team Provisioning Profile: com.anotherview.M2.mac" doesn't include the com.apple.developer.siri entitlement. ” On the “Signing & Capabilities” page I get the error: “Automatic signing failed Xcode failed to provision this target. Please file a bug report at https://feedbackassistant.apple.com and include the Update Signing report from the Report navigator.” How can I add Siri capabilities on a Mac Catalyst app?
0
0
716
Aug ’23
Is it possible to add a custom SiriKit command to control an iOS app?
Hi, I want to integrate SiriKit into my app. Is it possible to add a custom SiriKit command to control an iOS app? As far as I can tell, SiriKit only supports the standard intent domains:
=== Standard Intents ===
Car Commands
Lists and Notes
Media
Messaging
Payments
Restaurant Reservations
Ride Booking
VoIP Calling
Workouts
But I also see that the user can send a command ("Navigate to ***") to Google Maps via Siri. Does anyone know how to do that? Please help me or give me some hints. Thanks.
1
0
596
Aug ’23
Siri INStartCallIntent works on iOS but not with CarPlay
Hey there, I implemented Siri and CarPlay support. The INStartCallIntent works on iOS but not when initiating a voice command via CarPlay. Error from INIntentDeliverer:
Unable to find implementation of resolution method for facade slot name (null)
From what I can see, I implemented all the methods declared on INStartCallIntentHandling, but none of them is called. Does someone know what's missing?
2023-08-29 11:34:52.551834+0200 MyApp[64559:4844776] [Intents] -[INIntentDeliverer _resolveIntentParameter:forIntent:intentHandler:updateIntent:withCompletion:]_block_invoke Unable to find implementation of resolution method for facade slot name (null) on intent <INStartCallIntent: 0x282a71830> {
1
0
682
Oct ’23
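As a point of comparison for the post above, here is a minimal sketch of how an Intents Extension typically wires up INStartCallIntentHandling. It does not explain the CarPlay-only failure, but shows the handler(for:) routing and a contact resolution method that Siri is expected to call before handle(intent:completion:). StartCallHandler is a hypothetical name.

```swift
import Intents

class IntentHandler: INExtension {
    // Route INStartCallIntent to a dedicated handler object.
    override func handler(for intent: INIntent) -> Any {
        if intent is INStartCallIntent {
            return StartCallHandler()
        }
        return self
    }
}

// Siri resolves each slot (contacts, capability, destination) before calling
// handle(intent:completion:).
class StartCallHandler: NSObject, INStartCallIntentHandling {

    func resolveContacts(for intent: INStartCallIntent,
                         with completion: @escaping ([INStartCallContactResolutionResult]) -> Void) {
        guard let contacts = intent.contacts, !contacts.isEmpty else {
            completion([INStartCallContactResolutionResult.needsValue()])
            return
        }
        completion(contacts.map { INStartCallContactResolutionResult.success(with: $0) })
    }

    func handle(intent: INStartCallIntent,
                completion: @escaping (INStartCallIntentResponse) -> Void) {
        // Hand off to the app to actually place the call.
        let activity = NSUserActivity(activityType: NSStringFromClass(INStartCallIntent.self))
        completion(INStartCallIntentResponse(code: .continueInApp, userActivity: activity))
    }
}
```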
Siri app that opens a URL in your device's preferred browser
I am looking to build a Siri app so that when you say "Hey Siri, [specific phrase]" it launches a specific URL in the iPhone's preferred browser. I know I can use a custom intent to build it, but I am worried that it won't pass Apple's review because it's not what Siri apps are originally intended to do. I don't want to go all the way down the rabbit hole only to find out that it cannot be published and used by many people. Can anyone give me some guidance on this? Thanks
0
0
381
Sep ’23
Programmatically check if the Siri feature is enabled or not in iPadOS - SiriKit
Hi, I am working on enhancing an iPad application and would like to understand whether APIs are available for the following:
- Any API provided by Apple to read the Siri settings (is Siri enabled or disabled?).
- Any API provided by Apple to enable or disable Siri from the application.
I have gone through all the developer documentation available for SiriKit and I don't see any API for my requirement. Technology I prefer: Xamarin.iOS (I would like to know if an API is available in any other language as well). The reason I am looking for these Siri APIs: my application is a medical application and there is a chance the Siri assistant is invoked accidentally and interferes with app functionality. If I could access these APIs during app launch, I would notify the user to disable Siri for a better experience with the application. Note: I would like to use these APIs only while my app is up and running. Let me know if you need more details. Thanks
0
0
527
Oct ’23
How to contribute to Journaling Suggestions?
Apple's new Journal app was introduced with the iOS 17.2 beta. In the release notes, the following is mentioned: If your app donates activities or interactions to SiriKit or CallKit or if someone authorizes your app to save data to HealthKit, some data might show up as part of Journaling Suggestions. Is there any documentation on how this works exactly? What kind of activities can be featured in Journal? How does the system decide what to feature? For instance, if I have an app that allows the user to create art images, can I somehow make those images appear in the Journaling Suggestions?
1
7
797
Apr ’24
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSearchForMessagesIntent
I've gotten the following error message a few times; does anyone know anything about it? I currently have a WidgetExtension and suspect that it is the culprit.
Your delivery was successful, but you may wish to correct the following issues in your next delivery:
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSearchForMessagesIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSetMessageAttributeIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSetMessageAttributeIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSendMessageIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSearchForMessagesIntent in the 'ko' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
ITMS-90626: Invalid Siri Support - No example phrase was provided for INSendMessageIntent in the 'en' language. Please refer to 'https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/global_vocabulary_reference/intent_phrases'
After you’ve corrected the issues, you can upload a new binary to App Store Connect.
Best regards,
0
1
622
Nov ’23
A CarPlay template using Siri to let the user choose between several actions?
I'm developing a CarPlay interface for a messaging application, but I couldn't find how the root CPTemplate, a grid template with buttons in my case, could activate SiriKit to let the user choose between several actions, as can be seen in WhatsApp running on CarPlay. There is CPVoiceControlTemplate, which seems to do the job, but it is only allowed for the navigation app category, not messaging and VoIP. Currently my app can activate Siri to compose a message to a selected contact represented by a CPMessageListItem in a CPListTemplate, but I couldn't find how to code a CPGridTemplate that activates Siri.
1
0
548
Nov ’23
INSendMessageIntent has no recipients when replying to a message provided by INSearchForMessagesIntentHandling
INSendMessageIntent has no recipients when replying to a message provided by an INSearchForMessagesIntentHandling provider. A user would expect that if Siri has just read them a message from an app implementing INSearchForMessagesIntentHandling, they would be able to reply directly without having to look up the recipient. When handling INSearchForMessagesIntentHandling I find the messages in my local DB and create INMessage objects that have INPerson objects embedded in them. We have our own internal contacts, so I fill out the INPerson object as follows:
INPerson(
    personHandle: INPersonHandle(value: "Name", type: .unknown),
    nameComponents: nil,
    displayName: "Name",
    image: nil,
    contactIdentifier: nil,
    customIdentifier: "localContactIdentifier"
)
After reading every conversation Siri asks "Would you like to reply?", and if the user answers in the affirmative, Siri always answers "To who?" because my INSendMessageIntentHandling.resolveRecipients never gets any recipients. I have attempted to donate all of my contacts using INVocabulary.shared().setVocabulary but that didn't help.
1
0
538
Nov ’23
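For context on the post above, here is a minimal sketch of the resolveRecipients(for:completion:) shape being described, assuming the INPerson objects carry the customIdentifier set during search. It does not solve the underlying problem that Siri hands over no recipients at all; SendMessageHandler is a hypothetical name.

```swift
import Intents

// Hypothetical INSendMessageIntentHandling conformance matching the post.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func resolveRecipients(for intent: INSendMessageIntent,
                           completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            // This is the branch the post keeps hitting: Siri supplies no recipients.
            completion([INSendMessageRecipientResolutionResult.needsValue()])
            return
        }
        // A real implementation would map person.customIdentifier (set when the
        // INMessage was built) back to the local contact store before confirming.
        completion(recipients.map { INSendMessageRecipientResolutionResult.success(with: $0) })
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```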
Merge my application with a scientific data portal.
Hello, let me introduce myself: my name is Maxime and I would like to create an artificial intelligence tool for my research and development needs. I want to use Siri and its app integrations with data from a scientific portal established by my university. The APIs are public; however, access to the portal and the network is reserved for university researchers. How can you help me?
0
0
508
Nov ’23