Handle requests for your app’s services from users using Siri or Maps.

SiriKit Documentation

Posts under SiriKit tag

70 Posts
Post not yet marked as solved
1 Reply
331 Views
I tried to use a location manager (CLLocationManager) in a custom intent handler so I can make some calculations based on latitude and longitude. But the didUpdateLocations method is never called, and neither is didFailWithError. Is it possible to use Location Services in an intent handler invoked by Siri?
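For reference, a minimal sketch of the setup being described (the class name is illustrative, not from the post). Note that an app extension typically cannot present the location permission prompt itself, so the containing app should already have authorization, and the extension's Info.plist needs the usage description key:

```swift
import CoreLocation

// Hypothetical helper; not from the post above.
final class RouteCalculator: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func fetchLocation() {
        // Extensions generally can't show the permission prompt;
        // the host app should have requested authorization already.
        switch manager.authorizationStatus {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.requestLocation() // one-shot request, delivered to the delegate
        default:
            print("No location authorization available in the extension")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        if let coordinate = locations.last?.coordinate {
            print("lat: \(coordinate.latitude), lon: \(coordinate.longitude)")
        }
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("Location error: \(error)")
    }
}
```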
Post not yet marked as solved
0 Replies
297 Views
I'm developing an IM app and want to support the Siri suggestions feature. I donate an INSendMessageIntent when the user sends a message in a group, and I create the INPerson array from the users in that group. But the photo share sheet only shows a maximum of 20 persons, even though I set the persons parameter to a count of 56. Can anyone explain why the number of group users shown is capped, or what I'm doing wrong? I've circled the key point in the picture in red. The second and fourth chats have more than 20 users in the group.
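For context, a sketch of the kind of donation described (the member model, identifiers, and group name are invented; nothing in this code controls how many people the share sheet displays):

```swift
import Intents

// Illustrative stand-in for the poster's group-member model.
struct Member {
    let id: String
    let name: String
}

func donateGroupMessage(groupMembers: [Member]) {
    // One INPerson per group member.
    let recipients = groupMembers.map { member in
        INPerson(personHandle: INPersonHandle(value: member.id, type: .unknown),
                 nameComponents: nil,
                 displayName: member.name,
                 image: nil,
                 contactIdentifier: nil,
                 customIdentifier: member.id)
    }
    let intent = INSendMessageIntent(recipients: recipients,
                                     outgoingMessageType: .outgoingMessageText,
                                     content: "Hello group",
                                     speakableGroupName: INSpeakableString(spokenPhrase: "Design team"),
                                     conversationIdentifier: "group-42",
                                     serviceName: nil,
                                     sender: nil,
                                     attachments: nil)
    INInteraction(intent: intent, response: nil).donate(completion: nil)
}
```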
Post not yet marked as solved
0 Replies
289 Views
I would like to allow users to run an Apple Watch app action by using automations on the iPhone, where the action can only be done on the watch and not in the companion iPhone app. I noticed that it's possible to start a workout in the default Apple Watch app with automations, and also possible to start a workout in the third-party Dawn Patrol app using automations. Is this possible with custom intents or NSUserActivities, or only with the system start-workout intent?
Post not yet marked as solved
0 Replies
295 Views
I have a watch app which communicates with the iPhone app fine. But when running the WCSession from the intent extension (used for Siri), I get the following in the console logs: Error Domain=WCErrorDomain Code=7018 "Companion app is not installed." Can the watch intent extension communicate with the iPhone app, and if so, how can I enable it?
Post not yet marked as solved
1 Reply
341 Views
I have an app that lets users log drinks. Users create Core Data entries that have attributes like drink type and quantity. I have a view that lets users see the quantity they have consumed in the current day. This is done using a fetch request and a calendar predicate that only fetches entries from the current day. I am using the @FetchRequest property wrapper in the view, then initializing it with init() {}. Inside the initializer, I set the predicate and sort descriptors and use self._todayIntakeEvents ... to set the value for the fetch request variable. Then, each time the view appears, I use .onAppear to sum the quantity of the fetched items with todayIntakeEvents.map, then use .reduce to sum the array.

The problem: when users add drinks with Siri, the drink information doesn't show up when opening the app from the background. However, it will appear if I quit and relaunch the app. What I want is for the drink to appear immediately when the user opens the app again.

My code for the view:

struct TodaySummaryView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @FetchRequest var todayIntakeEvents: FetchedResults<WaterIntakeEvent>

    init() {
        let calendar = Calendar.current
        let dateFrom = calendar.startOfDay(for: Date())
        let dateTo = calendar.date(byAdding: .day, value: 1, to: dateFrom)
        let predicateTodayEvents = NSPredicate(format: "timeOfConsumption <= %@ AND timeOfConsumption >= %@",
                                               dateTo! as CVarArg, dateFrom as CVarArg)
        self._todayIntakeEvents = FetchRequest(
            entity: WaterIntakeEvent.entity(),
            sortDescriptors: [NSSortDescriptor(keyPath: \WaterIntakeEvent.timeOfConsumption, ascending: false)],
            predicate: predicateTodayEvents)
    }

    var body: some View {
        // some UI
        .onAppear {
            totalWaterQuantityToday = todayIntakeEvents.map { Int($0.waterQuantity) }
            sumOfWaterQuantityToday = totalWaterQuantityToday.reduce(0, +)
            print(sumOfWaterQuantityToday) // even after Siri creates new Core Data entries, this still prints the quantity from before it added
        }
    }
}

My code for the Siri intent handler:

public func handle(intent: LogDrinkIntent, completion: @escaping (LogDrinkIntentResponse) -> Swift.Void) {
    let context = PersistenceController.shared.container.viewContext
    let newDrink = WaterIntakeEvent(context: context)
    newDrink.drinkType = intent.drinkType
    newDrink.quantity = intent.quantity as! Float
    newDrink.waterQuantity = intent.quantity as! Float
    newDrink.timeOfConsumption = Date()
    newDrink.id = UUID()
    do {
        try context.save()
        let response = LogDrinkIntentResponse(code: LogDrinkIntentResponseCode.success, userActivity: nil)
        completion(response)
        print("Successfully saved.")
    } catch {
        completion(LogDrinkIntentResponse(code: LogDrinkIntentResponseCode.failure, userActivity: nil))
        print("Error saving.")
    }
}

I suspect the problem is not Siri (because the data appears after relaunching the app) but that, because I initialized the view with the fetch request, it's not responding to changes. Can someone help please?
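One direction often suggested for this symptom, sketched below with placeholder container and group names: when Siri writes from a separate process, the store needs to live in a shared app-group container with remote-change notifications and history tracking enabled, so that the app's view context can pick up the extension's writes.

```swift
import CoreData

// Sketch only; "Drinks", the app-group identifier, and the file name
// are placeholders, not taken from the post above.
func makeSharedContainer() -> NSPersistentContainer {
    let container = NSPersistentContainer(name: "Drinks")
    if let url = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: "group.example.drinks")?
        .appendingPathComponent("Drinks.sqlite") {
        let description = NSPersistentStoreDescription(url: url)
        // Track history and post remote-change notifications so writes
        // from another process (e.g. an Intents extension) are visible.
        description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
        description.setOption(true as NSNumber,
                              forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)
        container.persistentStoreDescriptions = [description]
    }
    container.loadPersistentStores { _, error in
        if let error = error { fatalError("Store load failed: \(error)") }
    }
    container.viewContext.automaticallyMergesChangesFromParent = true
    return container
}
```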
Post not yet marked as solved
0 Replies
258 Views
Siri doesn't give the option to select an account when there are multiple accounts in a particular category (e.g. credit cards). With multiple credit cards, when we ask "What is my credit card balance?", Siri just displays the message "You may find that information in the app" instead of showing the list of accounts to select from. However, all the speakable strings are returned in the resolveAccountNickname() method:

// Here speakables is the list of account names to be displayed
INSpeakableStringResolutionResult.disambiguation(with: speakables)

Note: it works perfectly fine if you have only one account in a category. Also, the same scenario works fine on lower OS versions (< iOS 15). We use INSearchForAccountsIntent.
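For context, a sketch of the resolution method in question (the account names here are invented); whether iOS 15 then actually surfaces the disambiguation UI is exactly what this post is asking about:

```swift
import Intents

class AccountsIntentHandler: NSObject, INSearchForAccountsIntentHandling {
    func resolveAccountNickname(for intent: INSearchForAccountsIntent,
                                with completion: @escaping (INSpeakableStringResolutionResult) -> Void) {
        // Invented example accounts; in a real app these come from user data.
        let speakables = ["Platinum card", "Travel card"]
            .map { INSpeakableString(spokenPhrase: $0) }
        if speakables.count == 1 {
            completion(.success(with: speakables[0]))
        } else {
            // Ask Siri to present the list of accounts to choose from.
            completion(.disambiguation(with: speakables))
        }
    }

    func handle(intent: INSearchForAccountsIntent,
                completion: @escaping (INSearchForAccountsIntentResponse) -> Void) {
        completion(INSearchForAccountsIntentResponse(code: .success, userActivity: nil))
    }
}
```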
Post not yet marked as solved
0 Replies
420 Views
Hi there, I'm implementing a custom IntentHandler which has a parameter that is an array of strings. I'd like to handle the case when the user didn't specify a value for this parameter: I want to provide a value calculated by my business logic and ask the user to confirm it. To do this, I use the resolve function for the parameter and return the following result:

[INStringResolutionResult.confirmationRequired(with: someSuggestedValue)]

Unfortunately, Siri throws an error with the message "Uh oh, there's a problem. Please try again." Using the debugger I caught the error; here's the error message:

libc++abi: terminating with uncaught exception of type NSException
Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unexpected class Swift.__SwiftDeferredNSArray'
terminating with uncaught exception of type NSException

Here's a part of my code:

func resolveParameter(for intent: MyCustomIntent) async -> [INStringResolutionResult] {
    if intent.arrayOfMyCustomParameters?.isEmpty ?? true {
        // The user did not specify a value
        intent.arrayOfMyCustomParameters = [someSuggestedValue]
        return [INStringResolutionResult.confirmationRequired(with: someSuggestedValue)]
    } else {
        var result = [INStringResolutionResult]()
        for value in intent.arrayOfMyCustomParameters! {
            result.append(.success(with: value))
        }
        return result
    }
}

P.S. This problem occurs only when using the shortcut from the Siri voice assistant on iOS 15.0.0+ devices. Not sure if this is helpful, but here is the stack trace.

libobjc.A.dylib`objc_exception_throw: -> 0x198f37670 <+0>: stp x28, x27, [sp, #-0x40]!
0x198f37674 <+4>: stp x22, x21, [sp, #0x10] 0x198f37678 <+8>: stp x20, x19, [sp, #0x20] 0x198f3767c <+12>: stp x29, x30, [sp, #0x30] 0x198f37680 <+16>: add x29, sp, #0x30 ; =0x30 0x198f37684 <+20>: sub sp, sp, #0xfc0 ; =0xfc0 0x198f37688 <+24>: mov x20, x0 0x198f3768c <+28>: mov w0, #0x20 0x198f37690 <+32>: bl 0x19902ea70 ; __cxa_allocate_exception 0x198f37694 <+36>: mov x19, x0 0x198f37698 <+40>: adrp x8, 372380 0x198f3769c <+44>: ldr x8, [x8, #0xb68] 0x198f376a0 <+48>: mov x0, x20 0x198f376a4 <+52>: blr x8 0x198f376a8 <+56>: mov x20, x0 0x198f376ac <+60>: adrp x8, 13442 0x198f376b0 <+64>: add x1, x8, #0x959 ; =0x959 0x198f376b4 <+68>: bl 0x198f24040 ; objc_msgSend 0x198f376b8 <+72>: str x20, [x19] 0x198f376bc <+76>: adrp x8, 342105 0x198f376c0 <+80>: add x8, x8, #0x78 ; =0x78 0x198f376c4 <+84>: add x8, x8, #0x10 ; =0x10 0x198f376c8 <+88>: mov x21, x19 0x198f376cc <+92>: str x8, [x21, #0x8]! 0x198f376d0 <+96>: mov x0, x20 0x198f376d4 <+100>: bl 0x198f37270 ; object_getClassName 0x198f376d8 <+104>: str x0, [x19, #0x10] 0x198f376dc <+108>: cbnz x20, 0x198f376e8 ; <+120> 0x198f376e0 <+112>: mov x8, #0x0 0x198f376e4 <+116>: b 0x198f376f4 ; <+132> 0x198f376e8 <+120>: tbnz x20, #0x3f, 0x198f3779c ; <+300> 0x198f376ec <+124>: ldr x8, [x20] 0x198f376f0 <+128>: and x8, x8, #0xffffffff8 0x198f376f4 <+132>: str x8, [x19, #0x18] 0x198f376f8 <+136>: adrp x22, 377914 0x198f376fc <+140>: ldrb w8, [x22, #0x2c6] 0x198f37700 <+144>: cbz w8, 0x198f37720 ; <+176> 0x198f37704 <+148>: mov x0, x20 0x198f37708 <+152>: bl 0x198f37270 ; object_getClassName 0x198f3770c <+156>: stp x20, x0, [sp, #0x8] 0x198f37710 <+160>: str x19, [sp] 0x198f37714 <+164>: adrp x0, 28 0x198f37718 <+168>: add x0, x0, #0xae9 ; =0xae9 0x198f3771c <+172>: bl 0x198f4957c ; _objc_inform 0x198f37720 <+176>: adrp x8, 377914 0x198f37724 <+180>: ldrb w8, [x8, #0x2c7] 0x198f37728 <+184>: cbz w8, 0x198f37780 ; <+272> 0x198f3772c <+188>: ldrb w8, [x22, #0x2c6] 0x198f37730 <+192>: cbnz w8, 0x198f37750 ; <+224> 0x198f37734 
<+196>: mov x0, x20 0x198f37738 <+200>: bl 0x198f37270 ; object_getClassName 0x198f3773c <+204>: stp x20, x0, [sp, #0x8] 0x198f37740 <+208>: str x19, [sp] 0x198f37744 <+212>: adrp x0, 28 0x198f37748 <+216>: add x0, x0, #0xae9 ; =0xae9 0x198f3774c <+220>: bl 0x198f4957c ; _objc_inform 0x198f37750 <+224>: add x0, sp, #0x20 ; =0x20 0x198f37754 <+228>: mov w1, #0x1f4 0x198f37758 <+232>: bl 0x1941a3b34 0x198f3775c <+236>: mov x22, x0 0x198f37760 <+240>: adrp x8, 342104 0x198f37764 <+244>: ldr x8, [x8, #0xd18] 0x198f37768 <+248>: ldr x0, [x8] 0x198f3776c <+252>: bl 0x1941a3df8 0x198f37770 <+256>: mov x2, x0 0x198f37774 <+260>: add x0, sp, #0x20 ; =0x20 0x198f37778 <+264>: mov x1, x22 0x198f3777c <+268>: bl 0x198f4a8cc ; symbol stub for: backtrace_symbols_fd 0x198f37780 <+272>: mov x0, x20 0x198f37784 <+276>: nop 0x198f37788 <+280>: adrp x2, 3 0x198f3778c <+284>: add x2, x2, #0xac0 ; =0xac0 0x198f37790 <+288>: mov x0, x19 0x198f37794 <+292>: mov x1, x21 0x198f37798 <+296>: bl 0x198f4a7d0 ; symbol stub for: __cxa_throw 0x198f3779c <+300>: adrp x8, 372380 0x198f377a0 <+304>: add x8, x8, #0xae0 ; =0xae0 0x198f377a4 <+308>: and x9, x20, #0x7 0x198f377a8 <+312>: ldr x8, [x8, x9, lsl #3] 0x198f377ac <+316>: adrp x9, 372380 0x198f377b0 <+320>: add x9, x9, #0x258 ; =0x258 0x198f377b4 <+324>: cmp x8, x9 0x198f377b8 <+328>: b.ne 0x198f376f4 ; <+132> 0x198f377bc <+332>: ubfx x8, x20, #55, #8 0x198f377c0 <+336>: adrp x9, 372380 0x198f377c4 <+340>: add x9, x9, #0x2e0 ; =0x2e0 0x198f377c8 <+344>: ldr x8, [x9, x8, lsl #3] 0x198f377cc <+348>: b 0x198f376f4 ; <+132> 0x198f377d0 <+352>: udf #0x0 0x198f377d4 <+356>: udf #0x0
Post not yet marked as solved
1 Reply
730 Views
Hello, I have created a simple SwiftUI app with Core Data and want to be able to add data via the Shortcuts app. I have implemented intents and the IntentHandler class. When I create a shortcut to add data to my app and run it, nothing happens in the app: the list does not refresh, and the only way to see the added data is to close the app completely and reopen it. How can I refresh the UI immediately? Here are my Core Data stack and my SwiftUI view.

struct PersistenceController {
    static let shared = PersistenceController()
    let container: NSPersistentContainer

    init() {
        container = NSPersistentContainer(name: "SiriShort")
        guard let fileContainer = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "group.SiriShortcut2")?.appendingPathComponent("SiriShort.sqlite") else {
            fatalError("Shared file container could not be created.")
        }
        let storeDescription = NSPersistentStoreDescription(url: fileContainer)
        storeDescription.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
        storeDescription.setOption(true as NSNumber, forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)
        container.persistentStoreDescriptions = [storeDescription]
        container.loadPersistentStores(completionHandler: { (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
        container.viewContext.mergePolicy = NSMergeByPropertyStoreTrumpMergePolicy
    }
}

View:

import SwiftUI
import CoreData
import Intents

struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @State private var view: Bool = false
    @FetchRequest(
        sortDescriptors: [NSSortDescriptor(keyPath: \Item.text, ascending: true)],
        animation: .default)
    private var items: FetchedResults<Item>

    var body: some View {
        NavigationView {
            List {
                ForEach(items) { item in
                    Text(item.text!)
                }
                .onDelete(perform: deleteItems)
            }
            .toolbar {
                ToolbarItem(placement: .navigationBarTrailing) {
                    EditButton()
                }
                ToolbarItem {
                    Button(action: addItem) {
                        Label("Add Item", systemImage: "plus")
                    }
                }
            }
        }
    }

    private func addItem() {
        withAnimation {
            let newItem = Item(context: viewContext)
            newItem.text = "\(Int.random(in: 0...1000))"
            do {
                try viewContext.save()
            } catch {
                // Replace this implementation with code to handle the error appropriately.
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
            makeDonation(text: newItem.text!)
        }
    }

    func makeDonation(text: String) {
        let intent = MakeUppercaseIntent()
        intent.text = text
        intent.unit = "1"
        intent.suggestedInvocationPhrase = "Add \(text) to app"
        let interaction = INInteraction(intent: intent, response: nil)
        interaction.donate { (error) in
            if let error = error as NSError? {
                print("Donation failed: " + error.localizedDescription)
            } else {
                print("Successfully donated interaction")
            }
        }
    }

    private func deleteItems(offsets: IndexSet) {
        withAnimation {
            offsets.map { items[$0] }.forEach(viewContext.delete)
            do {
                try viewContext.save()
            } catch {
                // Replace this implementation with code to handle the error appropriately.
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }
}
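One direction that is often suggested for cross-process refreshes (a sketch, assuming the store description has NSPersistentStoreRemoteChangeNotificationPostOptionKey set as in the stack above): listen for the remote-change notification and nudge the view context, since automatic merging alone only covers changes made in the same process.

```swift
import SwiftUI
import CoreData

// Sketch; assumes an "Item" entity like the one in the post.
struct RefreshingListView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @FetchRequest(
        sortDescriptors: [NSSortDescriptor(keyPath: \Item.text, ascending: true)])
    private var items: FetchedResults<Item>

    var body: some View {
        List(items) { item in
            Text(item.text ?? "")
        }
        // Posted when another process writes to the store, because the
        // remote-change option is enabled on the store description.
        .onReceive(NotificationCenter.default.publisher(
            for: .NSPersistentStoreRemoteChange)) { _ in
            viewContext.refreshAllObjects()
        }
    }
}
```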
Post not yet marked as solved
1 Reply
290 Views
I want to add a feature where a user can type a request in natural language to trigger different actions within my app (e.g. "Create an assignment due tomorrow at midnight", "Create a high-priority assignment called Final Presentation"). I'd like to leverage SiriKit to process this text, parse the data, and handle the response as I would if the user had asked Siri. First, is this even possible? Second, if not, what other technologies could I use to parse natural-language text?
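For the parsing side, one non-SiriKit option worth knowing about: Foundation's NSDataDetector can pull the date portion out of typed requests like the examples above. A small sketch, independent of any particular app:

```swift
import Foundation

let text = "Create an assignment due tomorrow at midnight"

// NSDataDetector recognizes natural-language dates such as
// "tomorrow at midnight" inside free text.
let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
let range = NSRange(text.startIndex..., in: text)
if let match = detector.firstMatch(in: text, options: [], range: range),
   let due = match.date {
    print("Parsed due date:", due)
}
```

The rest of the request (action verb, title, priority) would still need custom parsing, for example with the NaturalLanguage framework's taggers.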
Post not yet marked as solved
0 Replies
353 Views
I'm trying to break through Focus mode in my app, which receives VoIP notifications. I've added a particular contact to the allow list for a particular Focus mode. When I receive a call in my app, I create an INStartCallIntent. To create the INPerson, I use an INPersonHandle whose value is the identifier I fetch from the contact. There are also two other identifier fields in INPerson that are both nullable (contactIdentifier and customIdentifier); I've also tried populating those with the fetched contact's identifier. Here's my general prototype code (yes, I know this is all hard-coded, but it's just a prototype for now):

INPersonHandle *handle = [[INPersonHandle alloc] initWithValue:identifier type:INPersonHandleTypeUnknown];
INPerson *caller = [[INPerson alloc] initWithPersonHandle:handle
                                           nameComponents:nameComponents
                                              displayName:name
                                                    image:nil
                                        contactIdentifier:identifier
                                         customIdentifier:nil
                                      isContactSuggestion:NO
                                           suggestionType:INPersonSuggestionTypeSocialProfile];
NSDate *_Nonnull dateCreated = [NSDate date];
INCallRecord *const callRecord = [[INCallRecord alloc] initWithIdentifier:callId
                                                              dateCreated:dateCreated
                                                           callRecordType:INCallRecordTypeRinging
                                                           callCapability:callCapability
                                                             callDuration:nil
                                                                   unseen:nil
                                                             participants:@[caller]
                                                            numberOfCalls:@(1)
                                                        isCallerIdBlocked:nil];
INStartCallIntent *const intent = [[INStartCallIntent alloc] initWithCallRecordFilter:nil
                                                                 callRecordToCallBack:callRecord
                                                                           audioRoute:INCallAudioRouteSpeakerphoneAudioRoute
                                                                      destinationType:INCallDestinationTypeNormal
                                                                             contacts:@[caller]
                                                                       callCapability:callCapability];

// Donate the interaction to Siri
INInteraction *const interaction = [[INInteraction alloc] initWithIntent:intent response:nil];
interaction.direction = INInteractionDirectionIncoming;
[interaction donateInteractionWithCompletion:nil];

Nothing I've tried yet has been able to break through Focus mode, and I think it's because the identifiers aren't matching up. Apple seems to have a clear way to match contacts for email-address and phone-number identifiers, but it's unclear how it matches based on an unknown INPersonHandleType.
Post not yet marked as solved
0 Replies
466 Views
Hi there! I've created a configurable widget with a custom intent. I want to assign a default value to the intent's parameters based on the user's selection in the app, so that when the widget is created it initially uses this configuration. For that I've created an intent handler with the default-parameter methods, and it seems to work well when a widget is created for the first time. But every widget created afterwards remains stuck on the initial value that was returned from the methods, even if the user has since changed the values in the app. The methods still seem to be called and return the correct values, but the new widgets won't use them for their default configuration; instead, they use the initial values that were returned the first time. Why is this happening? Is there a way to make it use the new values?
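For reference, the default-value hook being described looks roughly like this. All names here are hypothetical stand-ins for Xcode-generated types (a "SelectStation" intent with a "station" parameter), so this sketch won't compile on its own; the open question in the post is why values returned here aren't reflected in widgets configured later.

```swift
import Intents

// "SelectStationIntent", "SelectStationIntentHandling", and "Station"
// are placeholders for the classes Xcode generates from the intent
// definition file.
class IntentHandler: INExtension, SelectStationIntentHandling {
    func defaultStation(for intent: SelectStationIntent) -> Station? {
        // Read the user's latest in-app choice from shared storage so
        // the default isn't frozen at the first widget's value.
        guard let id = UserDefaults(suiteName: "group.example.app")?
            .string(forKey: "lastStationID") else { return nil }
        return Station(identifier: id, display: id)
    }
}
```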
Post not yet marked as solved
1 Reply
360 Views
Hello, I created a simple SwiftUI app with Core Data and I want to be able to add data via the Shortcuts app. I created a shortcut that takes some text as input and returns it in uppercase, and when I run the shortcut in the Shortcuts app it works. However, when I added an add function (to save data in the Core Data database) to the intent's handle function and ran it again, nothing is saved in the app. Here is the code:

class MakeUppercaseIntentHandler: NSObject, MakeUppercaseIntentHandling {
    let persistenceController = PersistenceController()

    func handle(intent: MakeUppercaseIntent, completion: @escaping (MakeUppercaseIntentResponse) -> Void) {
        if let inputText = intent.text {
            let uppercaseText = inputText.uppercased()
            completion(MakeUppercaseIntentResponse.success(result: add(text: uppercaseText)))
        } else {
            completion(MakeUppercaseIntentResponse.failure(error: "The text entered is invalid"))
        }
    }

    func resolveText(for intent: MakeUppercaseIntent, with completion: @escaping (MakeUppercaseTextResolutionResult) -> Void) {
        if let text = intent.text, !text.isEmpty {
            completion(MakeUppercaseTextResolutionResult.success(with: text))
        } else {
            completion(MakeUppercaseTextResolutionResult.unsupported(forReason: .noText))
        }
    }

    func add(text: String) -> String {
        let newItem = Item(context: persistenceController.container.viewContext)
        newItem.text = text
        do {
            try persistenceController.container.viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return text
    }
}

Thank you
Post not yet marked as solved
1 Reply
299 Views
Hi, I'm fairly new to app development and I've stumbled upon an issue with an app I have already created and deployed to the App Store. The issue: the app has a widget that uses Siri intents with configurable parameters. It allows the user to choose an option from a dynamically generated list. In the App Store version the user is not allowed to choose; however, in the simulator everything works fine. What could be the issue? Many thanks! Cristian Lăpușan
Post not yet marked as solved
0 Replies
250 Views
Firstly, I'm talking about the new iOS 15 feature Focus, which replaces Do Not Disturb. I could only find APIs for getting it locally on iOS, but I'd like a solution that's cross-platform. iCloud syncs Focus between devices (like iPhone and Mac), so I'm wondering if there's a way to get the user's currently set Focus from whatever is synced to their iCloud, or even to set up an intent to get Focus update events. My app is web-based, so it's unnecessary to have an iPhone app just for Focus access. Any help would be lovely!
Post not yet marked as solved
0 Replies
214 Views
I have a live music app that plays live recordings from artists like the Grateful Dead, and I'd like to add SiriKit support to it. When users refer to "albums" in my app, they would likely say "show", "recording", or "concert". I've been testing INPlayMediaIntent, and when I say "Play an album by the Grateful Dead in Live Music Archive", Siri returns the proper INMediaSearch with mediaType = album and artistName = Grateful Dead. When I say "Play a show/recording by the Grateful Dead in Live Music Archive", Siri sets mediaName = "Grateful Dead" and nothing else. "Concert" is a bit better, but still not very useful. Is it possible to configure synonyms for "album" so that when Siri parses the speech, "show", "concert", and "recording" all resolve to "album"?
Post marked as solved
2 Replies
841 Views
The IntentHandler in my Objective-C implementation of a custom intent fails to receive a call from a voice-activated shortcut. When using Siri to invoke the donated interaction, I have observed the following errors in the Console app, which claim the intent handler method for the intent is unimplemented:

-[INIntentDeliverer _invokeIntentHandlerMethodForIntent:intentHandler:parameterNamed:keyForSelectors:executionHandler:unimplementedHandler:] _invokeIntentHandlerMethodForIntent sirikit.intent.voice_commands.RunVoiceCommandIntent
-[WFRVCIntentHandler stateMachineForIntent:] Created state machine <WFRVCStateMachine: 0x102e23970 state=WaitingForServer phase=Unknown> for intent with identifier 8A87FC68-329D-49FF-B534-B0A5821854CA
-[INIntentDeliverer _invokeIntentHandlerMethodForIntent:intentHandler:parameterNamed:keyForSelectors:executionHandler:unimplementedHandler:] _invokeIntentHandlerMethodForIntent sirikit.intent.voice_commands.RunVoiceCommandIntent

This error is consistent with the fact that an attempt to trigger the custom intent with a voice command results in iOS calling my app delegate, in particular application:continueUserActivity:restorationHandler:. According to the documentation, the restorationHandler should only be called if the intent is not handled and must be handled by the main app. As there is very little documentation for an Objective-C implementation, I cannot figure out what I am missing. I have tried to map the sample SoupChef app's implementation of Siri Shortcuts to my implementation, and I cannot figure out where I am going wrong. Here is my implementation (sorry for all the details, but I am hoping you can see something wrong):

First, I have implemented two additional targets: a shared framework and an Intents extension. I have also implemented an intents definition file. Here is an image of my targets: W_P_r is the main app, W_P_rKit is the shared framework, and PartsListManagerIntents is the Intents extension.

Next, here is my intents definition file and the target membership it belongs to: I have also added an app group in the Capabilities section for both the main target and the PartsListManagerIntents target, and I added the Siri capability to the main target. All of this auto-creates some code, including a default IntentHandler.m and an Info.plist in the PartsListManagerIntents target. I have updated the Info.plist as follows. And here is the auto-generated IntentHandler (which I have modified to log activity and to call a specific intent handler that resides in the W_P_rKit shared framework):

#import "IntentHandler.h"
#import <Intents/Intents.h>
#import <W_P_rKit/W_P_rKit.h>
#import "CreatePartsListIntentHandler.h"
#import "P__tHandler.h"
#import <os/log.h>

@interface IntentHandler () /* <CreatePartsListIntentHandling, P__tHandling> */
@end

@implementation IntentHandler

- (id)handlerForIntent:(INIntent *)intent {
    os_log_with_type(OS_LOG_DEFAULT, OS_LOG_TYPE_DEBUG, "handlerForIntent: Reached IntentHandler.");
    if ([intent.identifier isEqualToString:@"P__rIntent"]) {
        NSLog(@"P__rIntent");
        return [[P__rIntentHandler alloc] init];
    } else if ([intent.identifier isEqualToString:@"CreatePartsListIntent"]) {
        NSLog(@"CreatePartsListIntent");
        os_log_with_type(OS_LOG_DEFAULT, OS_LOG_TYPE_DEBUG, "handlerForIntent: IntentHandler Received CreatePartsListIntent.");
        return [[CreatePartsListIntentHandler alloc] init];
    }
    return self;
}

@end

Note that CreatePartsListIntentHandler is a class that implements the CreatePartsListIntentHandling protocol (the resolve, confirm, and handle methods of the intent handler).

Now here is the relevant implementation that should trigger iOS to call the IntentHandler. In my app, at the point where the user fills in the name of a new project, I make a call to donate the interaction as follows:

CreatePartsListIntent *data = [[CreatePartsListIntent alloc] init];
data.projectName = [projectPlistName copy];
data.quantity = [NSNumber numberWithInteger:currentProjectQuantity];
[[W_P_rDonationManager sharedInstance] donateCreatePartsListIntent:data];

The call to donateCreatePartsListIntent does the following:

data.suggestedInvocationPhrase = @"Create Parts List";
INInteraction *interaction = [[INInteraction alloc] initWithIntent:data response:nil];
[interaction donateInteractionWithCompletion:^(NSError * _Nullable error) {
    ...
}];

Once the user has created the empty parts list (forcing the above interaction donation to occur), the view controller presents an "Add Siri Shortcut" button. Tapping the button calls the following method to create a shortcut:

- (void)addCreatePartsListShortcutWasTapped {
    NSUserActivity *userActivity = [[W_P_rDonationManager sharedInstance] CreatePartsListShortcut];
    INShortcut *shortcut = [[INShortcut alloc] initWithUserActivity:userActivity];
    INUIAddVoiceShortcutViewController *addSiri = [[INUIAddVoiceShortcutViewController alloc] initWithShortcut:shortcut];
    addSiri.delegate = self;
    [self presentViewController:addSiri animated:YES completion:nil];
}

The call to CreatePartsListShortcut does the following:

NSUserActivity *newActivity = [[NSUserActivity alloc] initWithActivityType:kW_P_rCreatePartsListActivityType];
newActivity.persistentIdentifier = kW_P_rCreatePartsListActivityType;
newActivity.eligibleForSearch = TRUE;
newActivity.eligibleForPrediction = TRUE;
CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithContentType:UTTypeImage];
newActivity.title = @"Create Parts List";
attributeSet.contentDescription = @"Create a parts list for a new project";
newActivity.suggestedInvocationPhrase = @"Create Parts List";
UIImage *image = [UIImage imageNamed:@"W_P_r_Icon.jpg"];
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
attributeSet.thumbnailData = imageData;
newActivity.contentAttributeSet = attributeSet;
return newActivity;

This does create a shortcut that is visible in the Shortcuts app. Clicking the shortcut or saying the invocation phrase takes you directly to the app delegate's application:continueUserActivity:restorationHandler:, but it does not call the IntentHandler. I know this because I have implemented logs that would tell me if the execution thread got there. Why is my IntentHandler not being called? Why is iOS logging the message _invokeIntentHandlerMethodForIntent:intentHandler:parameterNamed:keyForSelectors:executionHandler:unimplementedHandler:? I have actually been struggling with this for weeks. Any help or hints would be much appreciated.
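Two details in the post above are worth comparing against Apple's SoupChef sample, sketched here in Swift for brevity (CreatePartsListIntent and CreatePartsListIntentHandler stand for the poster's generated and handler classes, so this won't compile outside that project). SoupChef dispatches on the intent's class rather than on a string identifier, and it builds the INShortcut from the intent itself; a shortcut built from an NSUserActivity launches the app (hence the application:continueUserActivity:restorationHandler: call) instead of invoking the Intents extension.

```swift
import Intents
import IntentsUI

// In the Intents extension: dispatch on the intent's class,
// as the SoupChef sample does.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        if intent is CreatePartsListIntent {
            return CreatePartsListIntentHandler()
        }
        return self
    }
}

// In the app: offer the voice shortcut from the intent, not from an
// NSUserActivity, so that invoking it runs the extension's handler.
func makeAddShortcutController() -> INUIAddVoiceShortcutViewController? {
    let intent = CreatePartsListIntent()
    intent.suggestedInvocationPhrase = "Create Parts List"
    guard let shortcut = INShortcut(intent: intent) else { return nil }
    return INUIAddVoiceShortcutViewController(shortcut: shortcut)
}
```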
Post not yet marked as solved
0 Replies
261 Views
I have a SiriKit extension in my iOS app that uses INSearchForAccountsIntent to display account balances via the Siri interface. When the user taps one of the accounts in the list, Siri automatically shows the account detail screen. Can we block the user from going into the detail screen in code?
Post not yet marked as solved
2 Replies
1k Views
Issue Summary
Hi all, I'm working on an Intents extension for my app; however, when I try to run an intent, Xcode pops up the following error:

Could not attach to pid: "965"
attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)

An image of the error: This only happens when I try debugging the Intents extension. Running the main app target or another extension target (e.g. notifications) doesn't produce this error.

Build Setup
Here are the details of my build setup:
Mac mini M1
Xcode 13
Building to iPhone 11 Pro Max, iOS 15.0.2. I've also tried building to my iPad Pro 12.9 with iOS 15.1 and hit the same issue.

Things I've tried:
Make sure "Debug executable" is unchecked in the scheme
Changing the Launch setting to "Automatic" and "Wait for the executable to be launched"
Running sudo DevToolsSecurity -enable on my Mac
Rebooting the iPhone devices and the Mac mini
Uninstalling/reinstalling the app
Deleting derived data
Removing/reinstalling the development certs in my keychain — this actually seemed to work initially, but then the problem came back and now it doesn't work anymore.

Console Logs
I've looked at the console logs while this error occurs to see if they can shed light on the issue. Here are the ones that seemed notable to me. These logs seem to show that Siri is trying to save/write to a file that it does not have access to, which seems very suspicious:

error 11:42:38.341470-0800 kernel System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference
error 11:42:38.342204-0800 assistantd failed to save contact runtime data. error=Error Domain=NSCocoaErrorDomain Code=512 "The file “com.apple.siri.inference” couldn’t be saved in the folder “Library”." UserInfo={NSFilePath=/var/mobile/Library/com.apple.siri.inference, NSUnderlyingError=0x100fb03a0 {Error Domain=NSPOSIXErrorDomain Code=5 "Input/output error"}}
error 11:42:38.342403-0800 assistantd InferenceError<errorId=crSaveToRunTimeDBFailed file=/Library/Caches/com.apple.xbs/Sources/SiriInference/SiriInference-3100.49.3.1.2/SiriInference/SiriInference/ContactResolver/ContactResolver.swift function=logRunTimeData(runTimeData:config:) line=378 msg=>
error 11:42:38.465702-0800 kernel 1 duplicate report for System Policy: assistantd(31) deny(1) file-read-metadata /private/var/mobile/Library/com.apple.siri.inference

Looking for "debugserver" entries, as the error suggests, shows these logs:

default 11:42:44.814362-0800 debugserver error: [LaunchAttach] MachTask::TaskPortForProcessID task_for_pid(965) failed: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure)
default 11:42:44.814476-0800 debugserver 10 +0.011525 sec [03c6/0103]: error: ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) err = ::task_for_pid ( target_tport = 0x0203, pid = 965, &task ) => err = 0x00000005 ((os/kern) failure) (0x00000005)
default 11:42:44.825704-0800 debugserver error: MachTask::StartExceptionThread (): task invalid, exception thread start failed.
default 11:42:44.825918-0800 debugserver error: [LaunchAttach] END (966) MachProcess::AttachForDebug failed to start exception thread attaching to pid 965: unable to start the exception thread
default 11:42:44.826025-0800 debugserver error: Attach failed
default 11:42:44.828923-0800 debugserver error: Attach failed: "Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.".

I've also attached the full details of the error below via a text file if it helps. Any help with this issue would be great, and I'm happy to provide more information if needed. Thanks in advance!
Xcode Attach Full Error Details
Post marked as Apple Recommended
342 Views
Currently, Apple documentation states that INSearchCallHistory intent is one of the required intents needed for CarPlay to be supported for the app. Sources for that are: This WWDC video ~13:27 timestamp https://developer.apple.com/videos/play/wwdc2017/719 The CarPlay App Programming Guide, page 6 at the top https://developer.apple.com/carplay/documentation/CarPlay-App-Programming-Guide.pdf However, the INSearchCallHistoryIntent is currently deprecated with a notice that it's going to be removed: https://developer.apple.com/documentation/sirikit/insearchcallhistoryintenthandling I don't particularly want to do work that's just going to have to be removed later on. Is this still a requirement for CarPlay? Is there an updated list of required intents needed to support CarPlay?
Post not yet marked as solved
2 Replies
397 Views
I downloaded the wwdc20-10073 Recipe Assistant project to play with it, because I need to solve a similar business problem with Siri, but the program doesn't work. I ran it with the default scheme Siri Intent Query (show directions for a recipe in Recipe Assistant) and get an error from Siri: "I can't get directions using Recipe Assistant. Sorry about that." Can you please help me figure out what I am doing wrong? I am running the project in Xcode 13 on macOS Monterey 12.0.1.