Article

Supporting Continuity Camera in Your Mac App

Incorporate scanned documents and pictures taken with a user's iPhone, iPad, or iPod touch into your Mac app using Continuity Camera.

Overview

With Continuity Camera in macOS 10.14 and later, and iOS 12 and later, you can use your iPhone, iPad, or iPod touch to scan documents or take a picture of something nearby and then access those documents or pictures instantly from your app.

If your app works with images, this feature can be a convenient way to get images into the app. For example, a text-editing app could use this feature to easily incorporate images into a document. It could also be a handy way to bring images into a social media app.

Apps using NSTextView get Continuity Camera support automatically. When the user Control-clicks in an app’s text view, a Continuity Camera menu item appears automatically. The user can then capture a photo or scan a document on their iPhone or iPad. Once the photo or scan is taken, it appears in the text view, and the image is accessible as an attachment in the NSTextView object's text storage.

Screenshot showing a text view in which the user has control-clicked to display the Continuity Camera dialog for taking a photo on an iOS device. The dialog shows the name of the iOS device where the photo will be taken and a cancel button to allow the user to dismiss the dialog without actually taking a photo.
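As a sketch of how you might retrieve such images afterward, the following function (hypothetical, not part of AppKit) enumerates the .attachment attribute in a text view's text storage and collects any attached images:

```swift
import AppKit

// Hypothetical helper: collects the images stored as attachments in a
// text view's text storage (for example, images the user captured with
// Continuity Camera).
func capturedImages(in textView: NSTextView) -> [NSImage] {
    var images: [NSImage] = []
    guard let storage = textView.textStorage else { return images }
    let fullRange = NSRange(location: 0, length: storage.length)
    // Walk every run of the .attachment attribute in the storage.
    storage.enumerateAttribute(.attachment, in: fullRange) { value, _, _ in
        guard let attachment = value as? NSTextAttachment else { return }
        if let image = attachment.image {
            images.append(image)
        } else if let data = attachment.fileWrapper?.regularFileContents,
                  let image = NSImage(data: data) {
            // Some attachments carry their data in a file wrapper instead.
            images.append(image)
        }
    }
    return images
}
```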

If you’re not using NSTextView, you need to add support to your macOS interface to enable Continuity Camera and to merge photos taken from the user’s iOS device.

Enable Support in Your Responder Objects

You must tell AppKit that your app can take advantage of any image data originating from Continuity Camera. You do this in responder objects; for example, in a view controller.

When the Continuity Camera menu item is displayed, AppKit calls the validRequestor(forSendType:returnType:) method of the objects in the current responder chain to find one that can handle image data generated by Continuity Camera. Override this method to let AppKit know that your responder object supports that image data. When the user actually captures a photo or scans a document using Continuity Camera, AppKit places the image data on the pasteboard and calls the designated responder object to handle it.

Your responder’s validRequestor(forSendType:returnType:) implementation must verify that it can receive pasteboard image data of the specified type, and then return the object that receives the image data when AppKit places it on the pasteboard. The responder can return itself as the receiver, or designate another object, as shown later in this section.

Here’s an example implementation:

override func validRequestor(forSendType sendType: NSPasteboard.PasteboardType?, returnType: NSPasteboard.PasteboardType?) -> Any? {
    if let pasteboardType = returnType,
        // Service is image related.
        NSImage.imageTypes.contains(pasteboardType.rawValue) {
        return self  // This object can receive image data.
    } else {
        // Let objects in the responder chain handle the message.
        return super.validRequestor(forSendType: sendType, returnType: returnType)
    }
}

Note that your validRequestor(forSendType:returnType:) method can return a different object to receive the image data. For example, you might implement the method in your view controller and perform the checks there, but return a view object, or a parent or managing object, to actually incorporate the data. To illustrate, the following code implements validRequestor(forSendType:returnType:) in a window controller but returns the active view controller as the target for the pasted image:

override func validRequestor(forSendType sendType: NSPasteboard.PasteboardType?, returnType: NSPasteboard.PasteboardType?) -> Any? {
    if let pasteboardType = returnType,
        // Service is image related.
        NSImage.imageTypes.contains(pasteboardType.rawValue) {
        // Specify the active view controller to receive the image data.
        return self.contentViewController  
    } else {
        // Let objects in the responder chain handle the message.
        return super.validRequestor(forSendType: sendType, returnType: returnType)
    }
}

After you implement validRequestor(forSendType:returnType:) and specify an object to receive the image data, AppKit enables the Continuity Camera menu item for the designated menus in your app, including contextual menus associated with your view.

Add a Continuity Camera Menu Item

The user initiates Continuity Camera from a menu item in one of your app’s menu-bar menus or contextual menus. You can add the menu item to any menu-bar menu yourself, or let AppKit add it automatically to your app’s contextual menus. Menus that contain editing-related commands, such as the File and Insert menus, are a good place for it.

To add a Continuity Camera menu item to one of your app’s menu-bar menus, locate the storyboard file where your menu bar is defined, and follow these steps in Interface Builder:

  1. Add an item to your app’s menu.

  2. Set the title of the item; for example, “Take Picture.” AppKit replaces this title with the actual one at runtime.

  3. In the Identity inspector, set the Identifier property of the menu item to NSMenuItemImportFromDeviceIdentifier (defined in NSMenuItem.h).

Here’s how it looks:

Screenshot of a storyboard for an app's menus, showing the settings described in the preceding steps.
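If you build your menus in code rather than in a storyboard, you can set the same identifier programmatically. The sketch below assumes the identifier's raw string value matches the constant's name, as the Interface Builder setting above suggests; addContinuityCameraItem(to:) is a hypothetical helper:

```swift
import AppKit

// Sketch: add a Continuity Camera menu item to a menu built in code.
// "Take Picture" is a placeholder title; AppKit supplies the actual
// title at runtime.
func addContinuityCameraItem(to menu: NSMenu) {
    let item = NSMenuItem(title: "Take Picture", action: nil, keyEquivalent: "")
    // Assumption: the raw value matches NSMenuItemImportFromDeviceIdentifier
    // as defined in NSMenuItem.h.
    item.identifier = NSUserInterfaceItemIdentifier("NSMenuItemImportFromDeviceIdentifier")
    menu.addItem(item)
}
```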

You don’t add a Continuity Camera menu item directly to your app’s contextual menus. Instead, you enable the appropriate support in your app’s responder objects as described in the previous section, and AppKit adds the menu item for you.

For example, the following code demonstrates how to display a contextual menu in response to a mouse-down event and have AppKit insert the menu item. The code overrides the mouseDown(with:) method and creates a menu. It then calls the NSMenu class method popUpContextMenu(_:with:for:), passing the menu, the event object for the mouse-down event, and the view that owns the contextual menu. AppKit automatically inserts the Continuity Camera menu item into the contextual menu for you.

override func mouseDown(with event: NSEvent) {    
    let theMenu = NSMenu(title: "Contextual Menu")
    /*
     Display a contextual menu over the view for the mouse-down event.
     AppKit automatically inserts the Continuity Camera menu item.
     */
    NSMenu.popUpContextMenu(theMenu, with: event, for: self.view)
}

When the user selects the Continuity Camera menu item, the system automatically launches the Continuity Camera interface on the user's device. After the user captures an image, AppKit places that image on the app's pasteboard.

Incorporate the Image Data from the Pasteboard

You need to incorporate images captured with Continuity Camera into your app. When the user captures an image on their iOS device, AppKit places it on the pasteboard and calls the designated responder object’s readSelection(from:) method, which is declared by the NSServicesMenuRequestor protocol, to read the image data. Use that method to determine whether the pasteboard contains image data in a format your app supports, and then incorporate the data into your app.

Here's an example implementation of the readSelection(from:) method:

func readSelection(from pasteboard: NSPasteboard) -> Bool {
    // Verify that the pasteboard contains image data.
    guard pasteboard.canReadItem(withDataConformingToTypes: NSImage.imageTypes) else {
        return false
    }
    // Load the image.
    guard let image = NSImage(pasteboard: pasteboard) else {
        return false
    }
    // Incorporate the image into the app.
    self.myImageView.image = image
    // This method has successfully read the pasteboard data.
    return true
}
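Putting the pieces together, the receiving responder typically declares conformance to NSServicesMenuRequestor, the protocol that declares readSelection(from:). The following is a minimal sketch with a hypothetical ImageViewController and image view, combining the two methods shown above:

```swift
import AppKit

// Sketch: a view controller that advertises support for Continuity Camera
// image data (validRequestor) and reads it from the pasteboard
// (readSelection, declared by NSServicesMenuRequestor).
class ImageViewController: NSViewController, NSServicesMenuRequestor {
    let myImageView = NSImageView()  // Hypothetical image view.

    override func validRequestor(forSendType sendType: NSPasteboard.PasteboardType?,
                                 returnType: NSPasteboard.PasteboardType?) -> Any? {
        if let pasteboardType = returnType,
           NSImage.imageTypes.contains(pasteboardType.rawValue) {
            return self  // This object can receive image data.
        }
        return super.validRequestor(forSendType: sendType, returnType: returnType)
    }

    func readSelection(from pasteboard: NSPasteboard) -> Bool {
        // Fail if the pasteboard holds no readable image.
        guard let image = NSImage(pasteboard: pasteboard) else { return false }
        myImageView.image = image
        return true
    }
}
```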

See Also

User Interface

Supporting Dark Mode in Your Interface

Update colors, images, and behaviors so that your app adapts automatically when Dark Mode is active.

Views and Controls

Present and define the interactions for your content onscreen.

View Management

Manage your user interface, including the size and position of views in a window.

Menus, Cursors, and the Dock

Implement menus and cursors to facilitate interactions with your app, and use your app's Dock tile to convey updated information.

Windows, Panels, and Screens

Organize your view hierarchies and facilitate their display onscreen.

Touch Bar

Display interactive content and controls in the Touch Bar.

Animation

Animate your views and other content to create a more engaging experience for users.

Sound, Speech, and Haptics

Play sounds and haptic feedback, and incorporate speech recognition and synthesis into your interface.