Sample Code

Searching for Photos

Use SiriKit to search for photos managed by your app.



SiriKit delivers the user’s spoken search criteria to your app in the form of an INSearchForPhotosIntent object. This object contains search criteria derived from the user’s spoken words, such as the date range during which the photos were taken.

Image of the PhotoIntent’s UI

You must apply the provided search criteria to the photos in your app. Because the user can specify multiple search parameters, you must be prepared to handle both basic and compound searches.
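As a sketch of what your handler receives, the intent exposes each parsed criterion as a property of INSearchForPhotosIntent; any parameter the user didn’t mention arrives as nil. The function below is illustrative, not code from this sample:

```swift
import Intents

// Illustrative only: inspect the criteria Siri parsed from the user's phrase.
func logCriteria(of intent: INSearchForPhotosIntent) {
    if let dateRange = intent.dateCreated {
        print("Date range:", dateRange)     // e.g. derived from "last summer"
    }
    if let terms = intent.searchTerms {
        print("Search terms:", terms)       // e.g. ["mountain"]
    }
    if let album = intent.albumName {
        print("Album:", album)
    }
    if intent.includedAttributes.contains(.favorite) {
        print("Restricted to favorites")
    }
}
```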

Handle Basic Searches

Consider the following command spoken to Siri: “Show me photos with Vignette.”

Siri forwards an INSearchForPhotosIntent object to the Intents app extension of the Vignette app. The app extension returns a SearchForPhotosIntentHandler object to Siri. This handler adopts the INSearchForPhotosIntentHandling protocol and manages all further interactions: resolving the intent parameters and handling the intent.

For all searches, the SearchForPhotosIntentHandler object performs the search and provides a response containing the number of results and a request that Siri display those results in the app.

let userActivity = NSUserActivity(activityType: ActivityKeys.SearchForPhotosActivity)
userActivity.userInfo = [ActivityKeys.SearchForPhotosActivityResult: foundPhotos.photoIdentifiers]
let response = INSearchForPhotosIntentResponse(code: .continueInApp, userActivity: userActivity)

// The result count allows the system to confirm launch of the app if needed, such as when there are no photos matching the query.
response.searchResultsCount = foundPhotos.photoIdentifiers.count

After being launched, the app receives a user activity object containing the INInteraction object with the intent and the app’s response. The app reads the activity’s userInfo dictionary to retrieve the search results stored by the Intents extension and then displays them.

guard userActivity.activityType == ActivityKeys.SearchForPhotosActivity,
    let photoIdentifiers = userActivity.userInfo?[ActivityKeys.SearchForPhotosActivityResult] as? [String]
    else {
        fatalError("Cannot proceed with `NSUserActivity`. Verify the `NSUserActivity` has been properly configured.")
}

// Reconstruct the query here to show that the content of the Siri intent is present in the user activity.
// This query object is only used to update the UI.
if let searchForPhotosIntent = userActivity.interaction?.intent as? INSearchForPhotosIntent {
    let siriQuery = PhotoQuery(intent: searchForPhotosIntent)
    imagesViewController.updateQueryUI(with: siriQuery)
}

// Take the results from when the Intents extension ran the query and display them.
let photoQueryResult = PhotoQueryResult(identifiers: photoIdentifiers)
imagesViewController.updatePhotos(with: photoQueryResult)

Both the app and the Intents app extension use the PhotoQuery structure to manage the search criteria. This structure allows users to specify many different types of search criteria, and can be initialized directly from an intent object by resolving the optional properties on the intent.
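A sketch of how such an initializer might look. The stored properties here are assumptions that mirror the criteria used in this article’s filter code, not the sample’s exact definition:

```swift
import Intents

// Sketch: a query structure built by resolving the intent's optional properties.
struct PhotoQuery {
    let creationDateRange: DateInterval
    let searchTerms: [String]
    let album: String?
    let favorite: Bool?

    init(intent: INSearchForPhotosIntent) {
        // A criterion the user didn't speak arrives as nil on the intent.
        if let range = intent.dateCreated,
           let start = range.startDateComponents?.date,
           let end = range.endDateComponents?.date {
            creationDateRange = DateInterval(start: start, end: end)
        } else {
            // No date criterion: match any creation date.
            creationDateRange = DateInterval(start: .distantPast, end: .distantFuture)
        }
        searchTerms = intent.searchTerms ?? []
        album = intent.albumName
        favorite = intent.includedAttributes.contains(.favorite) ? true : nil
    }
}
```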

Handle Compound Searches

When searching for photos, a user might include multiple search parameters in a spoken command, for instance: “Search Vignette for my favorite mountain photos.”

Because Siri recognizes the word favorite as a photo-related attribute, it adds the favorite option to the includedAttributes property of the intent. Siri then adds “mountain” to the searchTerms array of the intent. The code that filters the photos must consider both properties.

let queryResults = photos.filter { photo in
    guard query.creationDateRange.contains(photo.created) else { return false }

    // Check whether the user asked for favorite or non-favorite photos.
    if let isFavorite = query.favorite {
        guard photo.favorite == isFavorite else { return false }
    }

    if let albumName = query.album {
        guard photo.albums.contains(albumName.lowercased()) else { return false }
    }

    // Finally, now that the candidates have been narrowed down,
    // check that the query matches based on the photo tags.
    return query.matches(photoTags: photo.tags)
}

Resolve Dates

Users can ask Siri to search for photos taken within a specific date range, and your app must validate that it can handle the requested range. For Vignette, there are three possible outcomes:

  1. It is unable to execute a query with the specified date range.

  2. It contains photos within the specified date range.

  3. It does not contain photos within the specified date range.

First, date resolution checks whether the user provided a valid date range request, by inspecting the intent for a non-nil value for its dateCreated property. If dateCreated is nil, the app passes a notRequired() resolution result to the completion handler, which tells Siri that no value is needed for dateCreated in order to search for photos. Siri then begins resolving other intent parameters. If your app does require a value for dateCreated in order to successfully execute a query, you can pass a needsValue() resolution result to the completion handler. This result indicates to Siri that your app cannot continue without a value for dateCreated, and Siri prompts the user to provide a value.

guard let dateRange = intent.dateCreated,
    let dateResolver = DateResolver(dateRange: dateRange) else {
        // No date range was provided, and the app doesn't require one to search.
        completion(.notRequired())
        return
}

Unsupported Resolution

The app is unable to successfully execute a query for photos if the provided creation date is beyond the known range of creation dates for the app’s photos. If the date is outside of that range, the app passes an unsupported() resolution result to the completion handler.

guard dateResolver.isSupportedDateRange else {
    // The requested dates fall outside the creation dates of the app's photos.
    completion(.unsupported())
    return
}

Successful Resolution

If the user has provided a supported value for dateCreated, the parameter is successfully resolved. Here, the app passes a success(with:) resolution result to the completion handler. This result indicates to Siri that the app is done resolving dateCreated and can begin resolving the next intent parameter.

completion(.success(with: dateRange))

The query can still return zero results if the app is unable to find any photos created on the date the user asked for. If the query returns photos for the requested date, the app displays them to the user. Siri notifies the user if the query does not return any photos.

Normalize Search Terms

Consider two phrasings of the same request:

“Show my animal photos with Vignette.”

“Show photos of animals using Vignette.”

Depending on the user’s spoken phrase, different forms of a word, such as the singular animal and the plural animals, appear in the searchTerms property of INSearchForPhotosIntent. This sample addresses the problem in two ways: it augments its data set with multiple forms of each word, and it uses an NSLinguisticTagger object to find the word stems of the search terms.

var lemmatizedSearchTerms = ""
let options: NSLinguisticTagger.Options = [.joinNames, .omitPunctuation, .omitWhitespace]
searchTermString.enumerateLinguisticTags(in: fullRange, scheme: NSLinguisticTagScheme.lemma, options: options, orthography: nil) { tag, subrange, _, _ in
    let resultString: String
    if let tag = tag {
        resultString = tag.rawValue
    } else {
        // If a word stem could not be identified because there is not enough surrounding
        // grammatical context (such as a single word), keep the original word.
        resultString = searchTermString.substring(with: subrange)
    }
    lemmatizedSearchTerms += resultString + " "
}

You may solve this problem in other ways, but every app using the searchTerms property needs to handle this scenario.
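One alternative, on systems where the NaturalLanguage framework is available, is NLTagger, the successor to NSLinguisticTagger. The following is a sketch, not code from this sample:

```swift
import NaturalLanguage

// Sketch: reduce each word in a search phrase to its stem with NLTagger,
// keeping the original word when no lemma can be determined.
func lemmatize(_ text: String) -> [String] {
    let tagger = NLTagger(tagSchemes: [.lemma])
    tagger.string = text
    var stems: [String] = []
    let options: NLTagger.Options = [.omitPunctuation, .omitWhitespace, .joinNames]
    tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                         unit: .word,
                         scheme: .lemma,
                         options: options) { tag, range in
        stems.append(tag?.rawValue ?? String(text[range]))
        return true
    }
    return stems
}
```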

Provide User-Specific Vocabulary

The Photos domain has two types of vocabulary that can be unique to each user: tags and album names. By providing Siri with the custom values set by your user for these properties, you enable Siri to better recognize these user-specific tags and album names. The vocabulary provided to Siri should be ordered with the most likely terms first.

INVocabulary.shared().setVocabularyStrings(dataSource.rankedTags, of: .photoTag)
INVocabulary.shared().setVocabularyStrings(dataSource.rankedAlbumNames, of: .photoAlbumName)

Handle Album Names and People Tags

When searching for photos, the user might want to look in a specific album or find people tagged in a photo. The SearchForPhotosIntentHandler object returns the notRequired() resolution for both of these parameters, indicating that the sample doesn’t use them to narrow its searches. In an app that does support them, resolving an album name involves matching the string provided by Siri to the album names in your app.
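If your app did support album searches, resolution might look like the following sketch. The knownAlbumNames array is a placeholder for your app’s own data, and the matching logic is illustrative:

```swift
import Intents

let knownAlbumNames = ["Vacation 2017", "Mountains"] // placeholder data

// Sketch: match the album name Siri heard against the app's albums.
func resolveAlbumName(for intent: INSearchForPhotosIntent,
                      with completion: @escaping (INStringResolutionResult) -> Void) {
    guard let spokenName = intent.albumName else {
        // The user didn't mention an album; no value is needed.
        completion(.notRequired())
        return
    }
    let matches = knownAlbumNames.filter {
        $0.caseInsensitiveCompare(spokenName) == .orderedSame
    }
    switch matches.count {
    case 0:  completion(.unsupported())
    case 1:  completion(.success(with: matches[0]))
    default: completion(.disambiguation(with: matches))
    }
}
```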

Test Siri Integration

This sample includes a UI test that demonstrates how to use the XCUISiriService class in a UI test to send test phrases to Siri and verify the results sent to the app.
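A minimal sketch of such a test. The accessibility identifier used in the assertion is a placeholder, not taken from the sample’s UI:

```swift
import XCTest

class SiriSearchUITests: XCTestCase {
    func testSearchForPhotosPhrase() {
        // Send the phrase to Siri as recognized text rather than spoken audio.
        let siri = XCUIDevice.shared.siriService
        siri.activate(voiceRecognitionText: "Show me photos with Vignette")

        // Verifying the outcome depends on the app's UI; "photoResults"
        // below is an assumed accessibility identifier.
        let app = XCUIApplication()
        XCTAssertTrue(app.collectionViews["photoResults"].waitForExistence(timeout: 10))
    }
}
```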

See Also

Search for Photos

protocol INSearchForPhotosIntentHandling

The handler interface for searching the user’s photos.

class INSearchForPhotosIntent

A request for the list of photos that match the specified criteria.

class INSearchForPhotosIntentResponse

Your app’s response to a search for photos intent.