- iOS 12.0+
- Xcode 11.0+
- Core Image
The TrueDepth camera provides real-time depth data that allows you to segment foreground from background in a video feed.
This sample app leverages depth data to dynamically replace the entire background with a custom image. It then performs Gaussian filtering and other image processing operations to remove holes and smooth the effect.
## Preview the Sample App
To see this sample app in action, build and run the project in Xcode on a device running iOS 12 or later. Because the Simulator doesn't have access to the TrueDepth camera, this sample won't run there.
The sample app begins by removing the background, replacing it with black. Apply your own image from the camera roll by swiping down anywhere on the video feed.
## Create a Binary Foreground Mask
Assume the foreground to be a human face. You could accomplish face detection through the Vision framework's `VNDetectFaceRectanglesRequest`, but this sample doesn't need anything else from Vision, so it's simpler to consult the `AVMetadataFaceObject` delivered by the capture session and locate the face's bounding box and center. Assume there is only one face, and take the first face object in the metadata.
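The face lookup can be sketched as follows; the helper name is illustrative, and the metadata array is the one delivered to an `AVCaptureMetadataOutputObjectsDelegate`:

```swift
import AVFoundation

// Sketch: pull the first detected face out of the metadata objects
// delivered by AVCaptureMetadataOutput.
func faceCenter(in metadataObjects: [AVMetadataObject]) -> CGPoint? {
    // Assume at most one face; take the first face object, if any.
    guard let face = metadataObjects.first(where: { $0 is AVMetadataFaceObject })
        as? AVMetadataFaceObject else { return nil }
    // bounds is the face's bounding box in the metadata object's
    // normalized coordinate space.
    let box = face.bounds
    return CGPoint(x: box.midX, y: box.midY)
}
```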
Depth maps differ from their normal camera image counterparts in resolution; as a result, normal image coordinates differ from depth map coordinates by a scale factor. Compute the scale factor and transform the face’s center to depth map coordinates.
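The coordinate conversion amounts to one scale factor, since the video and depth buffers share an aspect ratio. A minimal sketch, assuming you read the real dimensions from the two pixel buffers with `CVPixelBufferGetWidth`:

```swift
import CoreGraphics

// Sketch: map a point from video-buffer coordinates into depth-map
// coordinates. The widths are assumed to come from
// CVPixelBufferGetWidth on the respective buffers.
func depthCoordinates(ofVideoPoint point: CGPoint,
                      videoWidth: Int, depthWidth: Int) -> CGPoint {
    // Both buffers have the same aspect ratio, so one scale factor suffices.
    let scale = CGFloat(depthWidth) / CGFloat(videoWidth)
    return CGPoint(x: point.x * scale, y: point.y * scale)
}
```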
Once you have the face in depth map coordinates, threshold the depth map to create a binary mask image, where the foreground pixels are 1 and the background pixels are 0.
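The thresholding step can be sketched as an in-place pass over a `Float32` depth buffer. The cutoff value is an assumption here (derived from the depth sampled at the face's center), and the comparison direction flips if the buffer holds disparity rather than depth:

```swift
import CoreVideo

// Sketch: threshold a Float32 depth buffer in place, producing a binary
// mask. `cutoff` is an assumed threshold taken from the face's depth.
func binarize(depthBuffer: CVPixelBuffer, cutoff: Float) {
    CVPixelBufferLockBaseAddress(depthBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(depthBuffer, []) }
    guard let base = CVPixelBufferGetBaseAddress(depthBuffer) else { return }
    let width = CVPixelBufferGetWidth(depthBuffer)
    let height = CVPixelBufferGetHeight(depthBuffer)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthBuffer)
    for row in 0..<height {
        let pixels = (base + row * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        for col in 0..<width {
            // Pixels nearer than the cutoff become foreground (1);
            // everything else becomes background (0).
            pixels[col] = pixels[col] <= cutoff ? 1.0 : 0.0
        }
    }
}
```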
## Smooth the Depth Mask with Core Image Filters
The depth map doesn’t share the RGB image’s sharp resolution, so the mask may contain holes along the interface between foreground and background. Once you have a downsampled mask image, use a Gaussian filter to smooth out the holes, so the interface doesn’t look jagged or pixelated. Clamp your image before filtering it, and crop it afterward, so it retains the proper size when composited with the original image.
The parameters of your `CIGaussianBlur` and `CIGammaAdjust` filters directly affect the smoothness of the edge pixels. You can tune the blur and smoothness by adjusting the Gaussian blur filter’s input radius, as well as the gamma adjustment filter’s input power.
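The whole smoothing pass can be sketched with Core Image's built-in filters. The filter names and parameter keys are real Core Image API; the radius and power values are illustrative tuning knobs, not the sample's exact settings:

```swift
import CoreImage

// Sketch of the smoothing pass: clamp, blur, gamma-adjust, then crop
// back to the original extent.
func smooth(mask: CIImage) -> CIImage {
    // Clamp so the blur doesn't darken the mask along its edges.
    var image = mask.clampedToExtent()

    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(4.0, forKey: kCIInputRadiusKey) // larger radius = softer edge
    image = blur.outputImage!

    let gamma = CIFilter(name: "CIGammaAdjust")!
    gamma.setValue(image, forKey: kCIInputImageKey)
    gamma.setValue(0.5, forKey: "inputPower") // < 1 pushes gray edge pixels toward white
    image = gamma.outputImage!

    // Crop back so the mask lines up with the original frame.
    return image.cropped(to: mask.extent)
}
```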
## Blend Foreground and Background with the Alpha Matte
The final step is applying your filtered smooth binary mask to the input video frame.
Because you’ve performed image processing in Core Image using the `CIGaussianBlur` and `CIGammaAdjust` filters, it’s most computationally efficient to apply the resulting mask in Core Image as well. That means converting your video frames from `CVPixelBuffer` format to `CIImage` format, allowing you to apply the alpha matte to the original image and blend in your custom background image with the `CIBlendWithMask` filter.
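The composite step can be sketched as follows; `CIBlendWithMask` and its parameter keys are real Core Image API, while the function name is illustrative:

```swift
import CoreImage

// Sketch: composite the video frame over a background image, using the
// smoothed binary mask as the alpha matte.
func composite(videoBuffer: CVPixelBuffer,
               mask: CIImage,
               background: CIImage) -> CIImage? {
    // Wrap the camera's CVPixelBuffer as a CIImage.
    let frame = CIImage(cvPixelBuffer: videoBuffer)

    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(frame, forKey: kCIInputImageKey)                // foreground
    blend.setValue(background, forKey: kCIInputBackgroundImageKey) // custom background
    blend.setValue(mask, forKey: kCIInputMaskImageKey)             // white = keep foreground
    return blend.outputImage
}
```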
Update your preview to display the final composited image onscreen.