- iOS 11.0+
- Xcode 10.1+
SiriKit delivers the user’s spoken search criteria to your app in the form of an INSearchForPhotosIntent object. This object contains search criteria derived from the user’s spoken words, such as the date range during which the photos were taken.
You must apply the provided search criteria to the photos in your app. Because the user can specify multiple search parameters, you must be prepared to handle both basic and compound searches.
Handle Basic Searches
Consider the following command spoken to Siri: “Show me photos with Vignette.”
Siri forwards an INSearchForPhotosIntent object to the Intents app extension of the Vignette app. The app extension returns a handler object to Siri. This handler adopts the INSearchForPhotosIntentHandling protocol and manages all further interactions to resolve the intent parameters and handle the intent.
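Sketched in code, the extension’s principal class might vend the handler as follows. The PhotoSearchIntentHandler name is hypothetical; INExtension, INSearchForPhotosIntent, and INSearchForPhotosIntentHandling are the real API types.

```swift
import Intents

// Principal class of the Intents app extension. Siri asks this object
// for a handler for each incoming intent.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        if intent is INSearchForPhotosIntent {
            // PhotoSearchIntentHandler is a hypothetical name for the
            // sample's handler; it adopts INSearchForPhotosIntentHandling.
            return PhotoSearchIntentHandler()
        }
        return self
    }
}

class PhotoSearchIntentHandler: NSObject, INSearchForPhotosIntentHandling {
    // handle(intent:completion:) is the protocol's one required method;
    // the search itself is elided here.
    func handle(intent: INSearchForPhotosIntent,
                completion: @escaping (INSearchForPhotosIntentResponse) -> Void) {
        completion(INSearchForPhotosIntentResponse(code: .continueInApp,
                                                   userActivity: nil))
    }
}
```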
For all searches, the intent handler performs the search and provides a response containing the number of results and a request that Siri display those results in the app.
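That search-and-respond step might be sketched like this, assuming a hypothetical PhotoSearchIntentHandler class and a stand-in query method:

```swift
import Intents

class PhotoSearchIntentHandler: NSObject, INSearchForPhotosIntentHandling {
    func handle(intent: INSearchForPhotosIntent,
                completion: @escaping (INSearchForPhotosIntentResponse) -> Void) {
        // matchingPhotos(for:) is a hypothetical app-specific query.
        let matches = matchingPhotos(for: intent)

        // .continueInApp asks Siri to launch the app to display results;
        // Siri creates the user activity that carries the interaction to the app.
        let response = INSearchForPhotosIntentResponse(code: .continueInApp,
                                                       userActivity: nil)
        response.searchResultsCount = matches.count
        completion(response)
    }

    private func matchingPhotos(for intent: INSearchForPhotosIntent) -> [String] {
        // Stand-in for the real search; returns matching photo identifiers.
        return []
    }
}
```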
After being launched, the app receives a user activity object containing the INInteraction object with the intent and the app’s response. The app uses the user activity’s interaction property to retrieve the search results stored by the intent extension and then displays the results.
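On the app side, retrieving the interaction might look like the following sketch. The delegate method is the standard UIKit entry point; the print statement stands in for the app’s real result display.

```swift
import Intents
import UIKit

// In your UIApplicationDelegate: Siri launches the app with a user
// activity whose interaction property carries the intent and response.
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    guard let interaction = userActivity.interaction,
          let intent = interaction.intent as? INSearchForPhotosIntent,
          let response = interaction.intentResponse as? INSearchForPhotosIntentResponse
    else { return false }

    // Stand-in for re-running the search and showing the photos.
    print("Found \(response.searchResultsCount) photos for terms: \(intent.searchTerms ?? [])")
    return true
}
```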
Both the app and the Intents app extension use the Photo structure to manage the search criteria. This structure allows users to specify many different types of search criteria, and it can be initialized directly from an intent object by resolving the intent’s optional properties.
Handle Compound Searches
When searching for photos, a user might include multiple search parameters in a spoken command, for instance: “Search Vignette for my favorite mountain photos.”
Because Siri recognizes the word favorite as a photo-related attribute, it adds the .favorite option to the includedAttributes property of the intent. Siri then adds “mountain” to the searchTerms array of the intent. The code that filters the photos must consider both properties.
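A minimal filter that honors both properties might look like this. The Photo model type here is a stand-in, not the sample’s; includedAttributes and searchTerms are the real intent properties.

```swift
import Intents

// Hypothetical model type for the app's photos.
struct Photo {
    let keywords: [String]
    let isFavorite: Bool
}

// Honor both includedAttributes (e.g. .favorite) and searchTerms
// (e.g. "mountain") when filtering the photo library.
func photos(_ photos: [Photo], matching intent: INSearchForPhotosIntent) -> [Photo] {
    let terms = intent.searchTerms ?? []
    let wantsFavorites = intent.includedAttributes.contains(.favorite)
    return photos.filter { photo in
        if wantsFavorites && !photo.isFavorite { return false }
        // Every spoken search term must match one of the photo's keywords.
        return terms.allSatisfy { term in
            photo.keywords.contains { $0.caseInsensitiveCompare(term) == .orderedSame }
        }
    }
}
```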
Users can ask Siri to search for photos that fall within a specific date range, and your app must validate that it can do so. For Vignette, there are three possible outcomes:
It is unable to execute a query with the specified date range.
It contains photos within the specified date range.
It does not contain photos within the specified date range.
First, date resolution checks whether the user provided a valid date range request by inspecting the intent for a non-nil value in its dateCreated property. If the value is nil, the app passes a notRequired() resolution result to the completion handler, which tells Siri that no value is needed for dateCreated in order to search for photos. Siri then begins resolving other intent parameters. If your app does require a value for dateCreated in order to successfully execute a query, you can pass a needsValue() resolution result to the completion handler instead. This result indicates to Siri that your app cannot continue without a value for dateCreated, and Siri prompts the user to provide one.
The app is unable to successfully execute a query for photos if the provided creation date falls outside the known range of creation dates for the app’s photos. In that case, the app passes an unsupported() resolution result to the completion handler.
If the user has provided a supported value for dateCreated, the parameter is successfully resolved. Here, the app passes a success(with:) resolution result to the completion handler. This result indicates to Siri that the app is done resolving dateCreated and can begin resolving the next intent parameter.
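The three outcomes above might be sketched as follows. The bounds check on the app’s photo library is a hypothetical stand-in; resolveDateCreated(for:with:) and the resolution results are the real API.

```swift
import Intents

class PhotoSearchIntentHandler: NSObject, INSearchForPhotosIntentHandling {
    func resolveDateCreated(for intent: INSearchForPhotosIntent,
                            with completion: @escaping (INDateComponentsRangeResolutionResult) -> Void) {
        guard let range = intent.dateCreated else {
            // No date spoken: a date isn't required to search.
            // (Pass .needsValue() instead if your app can't search without one.)
            completion(.notRequired())
            return
        }
        if isOutsideLibraryRange(range) {
            // The spoken range is beyond the creation dates of any photo we have.
            completion(.unsupported())
        } else {
            // The range is usable; Siri moves on to the next parameter.
            completion(.success(with: range))
        }
    }

    private func isOutsideLibraryRange(_ range: INDateComponentsRange) -> Bool {
        // Stand-in for the app's real bounds check.
        return false
    }

    func handle(intent: INSearchForPhotosIntent,
                completion: @escaping (INSearchForPhotosIntentResponse) -> Void) {
        completion(INSearchForPhotosIntentResponse(code: .continueInApp, userActivity: nil))
    }
}
```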
The query can still return zero results if the app is unable to find any photos created on the date the user asked for. If the query returns photos for the requested date, the app displays them to the user. Siri notifies the user if the query does not return any photos.
Normalize Search Terms
“Show my animal photos with Vignette.”
“Show photos of animals using Vignette.”
Depending on the user’s spoken phrase, different forms of a word, such as the singular and plural forms of the word dog, appear in the searchTerms property of INSearchForPhotosIntent. As one solution, this sample augments the provided data set with multiple word forms; as another, it uses an NSLinguisticTagger object to find the word stems for the search terms.
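A sketch of the stemming approach with NSLinguisticTagger, reducing each term to its lemma so that, for example, “dogs” and “dog” compare equal:

```swift
import Foundation

// Return the lemma (dictionary stem) of each word in the given text,
// falling back to the surface form when no lemma is known.
func lemmas(for text: String) -> [String] {
    let tagger = NSLinguisticTagger(tagSchemes: [.lemma], options: 0)
    tagger.string = text
    var results: [String] = []
    let range = NSRange(text.startIndex..., in: text)
    tagger.enumerateTags(in: range,
                         unit: .word,
                         scheme: .lemma,
                         options: [.omitWhitespace, .omitPunctuation]) { tag, tokenRange, _ in
        if let lemma = tag?.rawValue {
            results.append(lemma)
        } else if let swiftRange = Range(tokenRange, in: text) {
            results.append(String(text[swiftRange]))
        }
    }
    return results
}
```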
You may solve this problem in other ways, but every app that uses the searchTerms property needs to handle this scenario.
Provide User-Specific Vocabulary
The Photos domain has two types of vocabulary that can be unique to each user: tags and album names. By providing Siri with the custom values set by your user for these properties, you enable Siri to better recognize these user-specific tags and album names. The vocabulary provided to Siri should be ordered with the most likely terms first.
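Registering these values uses the INVocabulary API; the tag and album values below are illustrative. Note that, as an assumption of this sketch, the call is made from the app itself rather than the Intents extension.

```swift
import Intents

// Register user-specific tags and album names with Siri,
// ordered with the most likely terms first.
func registerVocabulary(tags: [String], albumNames: [String]) {
    let vocabulary = INVocabulary.shared()
    vocabulary.setVocabularyStrings(NSOrderedSet(array: tags), of: .photoTag)
    vocabulary.setVocabularyStrings(NSOrderedSet(array: albumNames), of: .photoAlbumName)
}

// For example, after the user edits their albums:
// registerVocabulary(tags: ["hiking", "birthday"],
//                    albumNames: ["Mountain Trips", "Family"])
```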
Handle Album Names and People Tags
When searching for photos, the user might want to look in a specific album or find people tagged in a photo. The intent handler returns the notRequired() resolution for both of these parameters, indicating that it does not support searches for these items. The resolution of album names involves matching the string provided by Siri to the album names in your app.
Test Siri Integration
This sample includes a UI test that demonstrates how to use the XCUISiriService class in a UI test to send test phrases to Siri and verify the results sent to the app.
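A sketch of such a test follows; the spoken phrase and bundle identifier are illustrative, while XCUIDevice.shared.siriService and activate(voiceRecognitionText:) are the real XCTest API.

```swift
import XCTest

class SiriSearchUITests: XCTestCase {
    func testPhotoSearchPhrase() {
        // Send a test phrase to Siri as if the user had spoken it.
        let siri = XCUIDevice.shared.siriService
        siri.activate(voiceRecognitionText: "Show me photos with Vignette")

        // Verify that Siri brought the app to the foreground with results.
        // The bundle identifier is hypothetical.
        let app = XCUIApplication(bundleIdentifier: "com.example.Vignette")
        XCTAssert(app.wait(for: .runningForeground, timeout: 10))
    }
}
```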