Share intents from within an app to drive system intelligence and show the app's actions in the Shortcuts app.

Posts under Intents tag

49 Posts
Post marked as solved
3 Replies
307 Views
I'm implementing the iOS 16 AppIntents framework and it works fine except when I try to trigger it with Siri, which just pulls up results from the web. Here's a very simple version I made in an empty project:

import Foundation
import AppIntents

@available(iOS 16.0, *)
struct ShowMeBooks: AppIntent {
    static var openAppWhenRun: Bool = false
    static var title: LocalizedStringResource = "Show me my books"

    func perform() async throws -> some IntentPerformResult {
        let x = 1 + 1
        return .finished(dialog: "Here are your books")
    }
}

@available(iOS 16.0, *)
struct SouthwestShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ShowMeBooks(),
            phrases: ["Show me my books on \(.applicationName)"]
        )
    }
}
Posted by F99. Last updated.
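For comparison, a sketch of the same intent against the API names that shipped in later iOS 16 seeds, where the beta's IntentPerformResult and .finished(dialog:) became IntentResult and .result(dialog:). This is an untested sketch, not a confirmed fix for the Siri matching issue:

```swift
import AppIntents

@available(iOS 16.0, *)
struct ShowMeBooks: AppIntent {
    static var openAppWhenRun: Bool = false
    static var title: LocalizedStringResource = "Show me my books"

    // Shipping API: return IntentResult & ProvidesDialog and use .result(dialog:)
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: "Here are your books")
    }
}

@available(iOS 16.0, *)
struct BookShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // Every phrase must include \(.applicationName) for Siri to match it.
        AppShortcut(
            intent: ShowMeBooks(),
            phrases: ["Show me my books on \(.applicationName)"]
        )
    }
}
```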
Post not yet marked as solved
0 Replies
61 Views
I have wondered before how I can find out what messages have been sent, and perhaps even to whom and from whom. What is the underlying technology behind the search feature in iOS when the user swipes right from the first screen of the home screen? Is that part of Siri? Settings groups Siri and Search together. Does the search feature I speak of use Intents, and is that made accessible to developers? I have also noticed that there is an intent property on the extension context object that passes information between a host app and another app's share extension. I'm brainstorming and looking for any ideas. I hope someone out there has good information for me. macOS has Spotlight. Is that available on iOS?
Posted. Last updated.
Post not yet marked as solved
1 Reply
127 Views
Hi there, I am trying to add a simple intent (just a URL parameter) to my application, and it shows up in the Shortcuts app. However, when I try to donate the intent from the app, a weird error shows up that I have been struggling with for a long time. What even is a shortcut type? Isn't a system type considered valid?

2022-07-18 17:29:08.342668-0400 Asobi[25910:394998] [Intents] -[INInteraction donateInteractionWithCompletion:]_block_invoke Cannot donate interaction with LoadUrlIntent that has no valid shortcut types
Interaction donation failed: %@ Error Domain=IntentsErrorDomain Code=1901 "Cannot donate interaction with intent that has no valid shortcut types: <INInteraction: 0x600000fe1680> { intent = <INIntent: 0x6000019ec870> { }; dateInterval = <_NSConcreteDateInterval: 0x600002b4f260> (Start Date) 2022-07-18 21:29:08 +0000 + (Duration) 0.000000 seconds = (End Date) 2022-07-18 21:29:08 +0000; intentResponse = <null>; groupIdentifier = <null>; intentHandlingStatus = Unspecified; identifier = B3A8CA1F-6CCD-4AC3-8A9F-1D8A1B23834F; direction = Unspecified; } for intent <LoadUrlIntent: 0x6000019e81b0> { url = https://google.com; }" UserInfo={NSLocalizedDescription=Cannot donate interaction with intent that has no valid shortcut types: <INInteraction: 0x600000fe1680> { intent = <INIntent: 0x6000019ec870> { }; dateInterval = <_NSConcreteDateInterval: 0x600002b4f260> (Start Date) 2022-07-18 21:29:08 +0000 + (Duration) 0.000000 seconds = (End Date) 2022-07-18 21:29:08 +0000; intentResponse = <null>; groupIdentifier = <null>; intentHandlingStatus = Unspecified; identifier = B3A8CA1F-6CCD-4AC3-8A9F-1D8A1B23834F; direction = Unspecified; } for intent <LoadUrlIntent: 0x6000019e81b0> { url = https://google.com; }}

The intent definition file is provided at this link (since I cannot attach it): https://gist.github.com/bdashore3/c370bf3459ea78180273c9bf00e0c74f. Here is the button code for calling the donation function (which produces the error shown):
Button("Donate intent") {
    let intent = LoadUrlIntent()
    intent.suggestedInvocationPhrase = "Open google"
    intent.url = URL(string: "https://google.com")
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error as NSError? {
            print("Interaction donation failed: \(error)")
        } else {
            print("Successfully donated interaction")
        }
    }
}
Posted by kingbri. Last updated.
Post not yet marked as solved
1 Reply
190 Views
The intent property doesn't even show when I initialize an instance of NSExtensionContext, as in the following code:

let extensionContext = NSExtensionContext()
extensionContext.intent

I get an error saying: Value of type 'NSExtensionContext' has no member 'intent'. Why is this? The documentation doesn't say it is deprecated, and even when something is deprecated, it still shows in Xcode.
Posted. Last updated.
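One likely explanation: the intent property on NSExtensionContext is declared by the Intents framework, not Foundation, so it only becomes visible once that framework is imported. A minimal sketch:

```swift
import Foundation
import Intents  // Without this import, NSExtensionContext has no visible `intent` member.

// Returns the intent the system attached to a share-extension context, if any.
func donatedIntent(from context: NSExtensionContext?) -> INIntent? {
    // With `import Intents` missing, this line fails with
    // "Value of type 'NSExtensionContext' has no member 'intent'".
    return context?.intent
}
```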
Post marked as solved
3 Replies
171 Views
Hi, I have created my own custom SiriKit Intent, and in the handler I am returning the response code of .continueInApp, which opens my app in response. I've worked out that I want to open my app in the background, and .handleInApp seems to be the correct response code to do this. However, .handleInApp is not an option in my intent's enum generated by Xcode. If I look at the auto-generated code, I see .continueInApp, .success, etc., but no .handleInApp. My Deployment Target is set to iOS 15.5 everywhere that I can find, so I really can't figure out why .handleInApp is not included in the auto-generated code. I've even tried creating a brand new workspace and project totally separate from my main one, and still creating a SiriKit Intent Definition does not generate code that includes .handleInApp. Is there something I need to enable to make .handleInApp appear as an enum option?
Posted by moontiger. Last updated.
Post not yet marked as solved
0 Replies
168 Views
I have a custom intent that allows a user to start a meeting - they don’t have to pick up their phone at all to start/join the meeting. This works perfectly on device - the intent is created, confirmed, and goes to the custom intent handle(intent:completion:) function where I call completion(StartMeetingIntentResponse(code: .continueInApp, userActivity: userActivity)). This opens the app and starts the meeting - all hands-free. When my device is connected via CarPlay, this intent can’t be completed. It gets all the way to where I call completion(StartMeetingIntentResponse(code: .continueInApp, userActivity: userActivity)), then Siri responds with “Sorry, I can’t do that while you’re driving.” It never makes it to my app delegate, where I have implemented application(_:userActivity:restorationHandler:) and would expect my app to pick up the activity to start the meeting. Why is this happening? I have implemented the INStartCallIntent similarly - calling completion(INStartCallIntentResponse(code: .continueInApp, userActivity: userActivity)) and that intent works just fine when connected to CarPlay, but my app has a distinction between calling and meeting so I really need both to work.
Posted. Last updated.
Post not yet marked as solved
0 Replies
159 Views
INIntent issue. In my main app, after I send a message (key code):

let inImage = INImage(imageData: imageData)
let person = INPerson(
    personHandle: INPersonHandle(value: personId, type: .emailAddress),
    nameComponents: nil,
    displayName: personName,
    image: inImage,
    contactIdentifier: nil,
    customIdentifier: String(chatid)
)
let smi = INSendMessageIntent(
    recipients: persons,
    outgoingMessageType: .outgoingMessageText,
    content: nil,
    speakableGroupName: groupName,
    conversationIdentifier: "\(chatid)",
    serviceName: nil,
    sender: nil,
    attachments: nil
)
smi.setImage(inImage, forParameterNamed: \.speakableGroupName)
let interaction = INInteraction(intent: smi, response: nil)
interaction.groupIdentifier = String(chatid)
interaction.donate { error in }

I get the INImage instance in my share extension by:

if let intent = extensionContext?.intent as? INSendMessageIntent,
   let avatar = intent.keyImage() {
    // ...
}

When I call the INImage fetchUIImage method:

func loadImageBy(_ inImage: INImage) {
    imgView.image = nil
    loadImageIdentifier = UUID()
    inImage.fetchUIImage { [weak self, loadImageIdentifier] image in
        guard self?.loadImageIdentifier == loadImageIdentifier else { return }
        self?.imgView.image = image
    }
}

I get this error:

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[INRemoteImageProxy fetchUIImageWithCompletion:]: unrecognized selector sent to instance 0x2810969e0'
Posted by lcrystal. Last updated.
Post not yet marked as solved
1 Reply
237 Views
I have implemented a custom intent that has several parameters. One of the parameters is called "dimensions". I would like to give the user a hint on what to say to respond to the Siri Dialog Prompt for the dimensions parameter. For example, if the Siri Dialog Prompt is "What are the dimensions?", I would like to have a one time prompt that says "What are the dimensions? You can say something like 2 x 4 x 35 inches". Thereafter, I would like to fall back to having Siri only say "What are the dimensions?". Is this possible?
Posted by jeffb6688. Last updated.
Post not yet marked as solved
0 Replies
163 Views
Does anybody know how to show or enable the Message button on the CallKit screen when the user receives an incoming call? I have searched multiple sources but couldn't find any information about this topic. It never appears in my CallKit app; I can only see the Remind Me button.
Posted by lockSee. Last updated.
Post not yet marked as solved
0 Replies
185 Views
I've been watching the WWDC videos on the new App Intents framework in iOS. It definitely looks like a nicer API than the Siri Intents framework. However, what's not clear to me is whether there are any user-facing improvements or increased discoverability over the existing Siri Intents framework. I'm already indexing my Shortcuts manually whenever the app is opened, which seems like one of the headline features of App Intents. I've got an app that uses Siri Intents quite extensively, and I don't really want to have two implementations side by side if there's no tangible benefit. I'd rather just leave the code as is until I can drop iOS 15.
Posted. Last updated.
Post not yet marked as solved
0 Replies
125 Views
How do I establish communication between multiple parameters in a single intent with a parent-child relationship, where the output of the parent parameter is the input for the child parameter?
Posted. Last updated.
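One way to express a parent-child dependency in a SiriKit custom intent is through the generated dynamic-options method, which receives the whole intent, so the child's options can be computed from the already-resolved parent value. The sketch below uses a hypothetical OrderIntent with a parent parameter category and a child parameter item; all names are illustrative, not from the original question:

```swift
import Intents

// Hypothetical handler: the generated protocol for a custom intent with a
// dynamic-options parameter `item` would include a method shaped like this.
class OrderIntentHandler: NSObject /* , OrderIntentHandling */ {

    func provideItemOptionsCollection(
        for intent: OrderIntent,  // hypothetical generated intent class
        with completion: @escaping (INObjectCollection<NSString>?, Error?) -> Void
    ) {
        // The intent already carries the resolved parent parameter,
        // so the child parameter's options can be derived from it.
        let items: [NSString]
        switch intent.category {
        case "lumber": items = ["2x4", "4x4"]
        default:       items = []
        }
        completion(INObjectCollection(items: items), nil)
    }
}
```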
Post not yet marked as solved
18 Replies
5k Views
My project has auto-generated Swift code for Intents in my "ProjectName-Swift.h" file. In Xcode 12, this file had zero warnings, but in Xcode 13 it is filled with thousands of warnings. About half of them are Block pointer is missing a nullability type specifier (_Nonnull, _Nullable, or _Null_unspecified); the other half are Multiple declarations of method 'handleIntent:completion:' found and ignored. I have tried cleaning my project and deleting the Derived Data folder so far, but I am still getting the same warnings. Since these are auto-generated files, I can't go in and fix the warnings myself. Has anyone else seen this on the Xcode 13 beta? Or have any idea how to fix it?
Posted. Last updated.
Post not yet marked as solved
0 Replies
290 Views
Hi, I tried to implement the new AppIntents to replace some old shortcuts. I followed the sessions and their examples on how they are implemented. Unfortunately, I have not been able to get an App Intent into the Shortcuts app. I tried it on different apps and new projects, and tried multiple changes to the intents. Is there something that I'm missing, or is this just a bug in Beta 1? Here's my sample code:

import AppIntents

struct TestIntent: AppIntent {
    static var title: LocalizedStringResource = "Test"
    static var description: IntentDescription? = IntentDescription("Test description")

    func perform() async throws -> some IntentPerformResult {
        .finished(value: "Test")
    }
}

I filed a feedback for this issue: FB10102293. All the best, Alex
Posted by AlexSFD. Last updated.
Post not yet marked as solved
0 Replies
168 Views
Hello everyone, we have a project that is two years old, and we are trying to implement SiriKit's payment intents. But every time we ask Siri to send a payment, Siri replies that our app isn't configured yet to do so. We tried the same implementation in a new project and it worked fine; Siri sent payments successfully. What are we doing wrong? We contacted Apple code-level technical support, but they are asking for a simplified version of our code that addresses the issue, and the problem is that we want to know why it doesn't work within our current app. We have seen in forums that many people have experienced the same issue on existing projects, but no solution has been found. Thanks for sending us hints.
Posted by Djipsy6. Last updated.
Post marked as solved
1 Reply
264 Views
According to the WWDC19 video (Introducing Parameters for Shortcuts), the parameters are supposed to be resolved in the order you have placed them in the Intents Definition file in Xcode (see timestamp 13:02 through 13:16). In my Objective-C implementation, this is not happening. I deleted the derived data and cleaned the build folder, but that did not help. Here is a screenshot of my intent definition parameters: In my implementation, it seems to first process the parameters that do not have "Dynamic Options" checked. Then it circles back and works on the ones that have "Dynamic Options". So in my case, it starts with partName, quantity, dimensions, thickness, width, and length. Then it works on partsListName. Furthermore, while the "Disambiguation Prompt" is spoken/written, the "Disambiguation Introduction" is NOT spoken/written. Is this a bug that is causing the parameters to be resolved in the wrong order, or do I need to do something differently to force it to resolve parameters in the order that I need? And is the "Disambiguation Introduction" supposed to work?
Posted by jeffb6688. Last updated.
Post not yet marked as solved
0 Replies
153 Views
Hello, I'm trying to add Siri support to a video conferencing application. To start out, I would just like Siri to respond to a phrase like "Hey Siri, mute my {app's name} video" or "Hey Siri, mute my {app's name} audio", and the intent handler can just kick off an IBAction that mutes the video or audio. Do I need to create custom intents to fulfill this purpose? Or is there a standard intent that may work here?
Posted by Resonance. Last updated.
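SiriKit does not ship a built-in intent for muting an app's audio or video, so a custom intent (or, on iOS 16, an App Intent) is the usual route. A sketch of the App Intents version, where CallManager and the phrase wording are placeholders, not API from the original post:

```swift
import AppIntents

// Sketch of a hands-free mute command as an iOS 16 App Intent.
@available(iOS 16.0, *)
struct MuteAudioIntent: AppIntent {
    static var title: LocalizedStringResource = "Mute Audio"
    static var openAppWhenRun: Bool = false

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // CallManager.shared.muteAudio()  // app-specific, hypothetical
        return .result(dialog: "Audio muted")
    }
}

@available(iOS 16.0, *)
struct ConferenceShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MuteAudioIntent(),
            phrases: ["Mute my \(.applicationName) audio"]
        )
    }
}
```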
Post not yet marked as solved
0 Replies
204 Views
We have a number of shortcuts that build HTML with JS in a Text action. We then base64-encode that Text action and use the URL action to parse it as "data:text/html;base64, Base64 Encoded". We then open the URL with the Open action. This works perfectly in iOS on iPhone and iPad. Now that macOS supports Shortcuts, we tried our automations, and all of them fail to open the URL. The error we get is "Shortcuts could not open the app for the URL scheme "data" because the app is not installed on this device". Any ideas would be greatly appreciated.
Posted by jkoen. Last updated.
Post not yet marked as solved
0 Replies
179 Views
Dear all, I want to create a Siri intent that calls an API, and I would like it to call the API without opening the app. Is that possible? If it is, how?
Posted by Ezziddin. Last updated.
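This is generally possible with a SiriKit custom intent: perform the network call inside the Intents extension's handler and complete with a success response, so the host app never opens. A sketch, where CheckStatusIntent, the URL, and the response class are all hypothetical placeholders:

```swift
import Intents

// Hypothetical handler for a custom "CheckStatusIntent": the web API is
// called from within the Intents extension process itself.
class CheckStatusIntentHandler: NSObject /* , CheckStatusIntentHandling */ {

    func handle(intent: INIntent,
                completion: @escaping (INIntentResponse) -> Void) {
        let url = URL(string: "https://api.example.com/status")!  // placeholder
        URLSession.shared.dataTask(with: url) { data, _, _ in
            // Map the API result onto the generated response class here;
            // INIntentResponse stands in for e.g. CheckStatusIntentResponse
            // with a .success response code, so the app is never launched.
            completion(INIntentResponse())
        }.resume()
    }
}
```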
Post marked as solved
1 Reply
185 Views
Hello guys, my question is quite simple: is it possible to share a singleton class between my app and my app extension, which is actually a Siri extension? I've already found the "app groups" capability, but it doesn't seem to work (probably because the singleton comes from outside?). I use a framework in my app, and I want to use the same framework in my app extension, and of course the same singleton (same instance). For clarification, I would like Siri to know if the user is logged in before performing the intent in question. If not, Siri would ask the user to log in before performing the intent. Another example, if it's not very clear: it is actually necessary to "activate" the framework before using it. The thing is that even if my app has activated the framework, the extension doesn't know it.
Posted by Bbluxe. Last updated.
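A key constraint here: an app and its Siri extension run as separate processes, so an in-memory singleton instance can never be shared directly; only shared state persisted in an App Group container (UserDefaults, files, or the keychain) is visible to both sides. A sketch of the UserDefaults approach, where the group identifier and key names are placeholders:

```swift
import Foundation

// Shared *state* (not a shared instance) via an App Group container.
// "group.com.example.app" is a placeholder for your App Group identifier,
// which must be enabled on both the app and the extension target.
enum SessionStore {
    static let defaults = UserDefaults(suiteName: "group.com.example.app")

    static var isLoggedIn: Bool {
        get { defaults?.bool(forKey: "isLoggedIn") ?? false }
        set { defaults?.set(newValue, forKey: "isLoggedIn") }
    }
}

// App side: set SessionStore.isLoggedIn = true after login.
// Extension side: check SessionStore.isLoggedIn before handling the
// intent, and return a response asking the user to log in if it is false.
```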