Instance Property

frameProcessor

A block to be called by Photos for processing each frame of the Live Photo’s visual content.

Declaration

var frameProcessor: PHLivePhotoFrameProcessingBlock? { get set }

Discussion

Use this property to define the image processing to be performed on each frame of the Live Photo. Setting this property does not begin processing; instead, after you call one of the methods listed in Processing an Editing Context’s Live Photo, Photos executes your block repeatedly to process each frame of the Live Photo’s video and still photo content.
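
For instance, the following is a minimal sketch (not part of the original listing) illustrating that assigning the frameProcessor by itself starts no work; frames are processed only once you call a rendering method such as prepareLivePhotoForPlayback(withTargetSize:options:completionHandler:) to produce an in-app preview. The pass-through processor and the previewSize parameter are illustrative placeholders.

import Photos
import CoreGraphics

func previewLivePhoto(input: PHContentEditingInput, previewSize: CGSize) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }

    // Assigning the block does not start processing.
    context.frameProcessor = { frame, _ in
        return frame.image    // pass-through: no edit applied
    }

    // Processing of the Live Photo's frames begins only when a rendering method is called.
    context.prepareLivePhotoForPlayback(withTargetSize: previewSize, options: nil) { livePhoto, error in
        guard let livePhoto = livePhoto else {
            print("can't prepare live photo: \(String(describing: error))")
            return
        }
        // Show `livePhoto` in a PHLivePhotoView (PhotosUI) for preview.
        _ = livePhoto
    }
}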

In your frame processor block, use the image property of the provided PHLivePhotoFrame object to access the image to be processed, and return a CIImage object representing the result of your processing. For example, the following code sets up a processor block to apply a simple sepia-tone filter, then calls the saveLivePhoto(to:options:completionHandler:) method to begin processing the Live Photo for output.

 
import Photos
import CoreImage

func processLivePhoto(input: PHContentEditingInput) {
    // Create a Live Photo editing context from the asset's content editing input.
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input)
        else { fatalError("not a Live Photo editing input") }

    // Apply a sepia-tone filter to every frame: each video frame and the still photo.
    context.frameProcessor = { frame, _ in
        return frame.image.applyingFilter("CISepiaTone", withInputParameters: nil)
    }

    // Processing begins here, when the context renders the edited Live Photo to the output.
    let output = PHContentEditingOutput(contentEditingInput: input)
    context.saveLivePhoto(to: output) { success, error in
        if success {
            // use output with PHAssetChangeRequest or PHContentEditingController
        } else {
            print("can't process live photo: \(String(describing: error))")
        }
    }
}
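
Once saving succeeds, the populated PHContentEditingOutput can be committed to the Photos library. The following is a follow-on sketch, not part of the original listing, assuming asset is the PHAsset being edited; the adjustment-data identifier, version, and payload are hypothetical placeholders your app would define to describe the edit.

import Photos

func commitEdit(of asset: PHAsset, with output: PHContentEditingOutput) {
    // Attach adjustment data describing the edit; Photos requires it when the edit is committed.
    // The identifier, version, and payload below are placeholders.
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.sepia-filter",
                                             formatVersion: "1.0",
                                             data: Data("CISepiaTone".utf8))

    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.contentEditingOutput = output
    }) { success, error in
        if !success {
            print("can't commit live photo edit: \(String(describing: error))")
        }
    }
}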

See Also

Preparing an Editing Context for Processing

typealias PHLivePhotoFrameProcessingBlock

The signature for a block Photos calls to process Live Photo frames.

var audioVolume: Float

The audio gain to apply to the processed Live Photo.