An editing session for modifying the photo, video, and audio content of a Live Photo.
- iOS 10+
- macOS 10.12+
- Mac Catalyst 13.0+
- tvOS 10+
A Live Photo is a picture, captured by a supported iOS device, that includes motion and sound from the moments just before and after it was taken. Editing the content of a Live Photo works much like editing other asset types:
In an app using the Photos framework, fetch a PHAsset object that represents the Live Photo to edit, and use that object's requestContentEditingInput(with:completionHandler:) method to retrieve a PHContentEditingInput object.
In a photo editing extension that runs within the Photos app, your extension's main view controller (which adopts the PHContentEditingController protocol) receives a PHContentEditingInput object when the user chooses to edit a Live Photo with your extension.
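A minimal sketch of the first path, requesting the editing input in an app. The helper function name is illustrative; `asset` is assumed to be a previously fetched PHAsset whose mediaSubtypes contains .photoLive:

```swift
import Photos

// Request editing input for a Live Photo asset fetched earlier.
func requestLivePhotoInput(for asset: PHAsset,
                           completion: @escaping (PHContentEditingInput) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    // Return true here only if your app can reapply its own PHAdjustmentData.
    options.canHandleAdjustmentData = { _ in false }
    asset.requestContentEditingInput(with: options) { input, _ in
        guard let input = input else { return }
        completion(input)
    }
}
```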
Create a Live Photo editing context with the init(livePhotoEditingInput:) initializer.
You can create a Live Photo editing context only from a PHContentEditingInput object that represents a Live Photo. Use the livePhoto property of the editing input to verify that it has Live Photo content.
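Those two checks can be sketched as follows (the helper name is illustrative; the initializer itself is failable and returns nil for non-Live-Photo input):

```swift
import Photos

// Create an editing context only for input that has Live Photo content.
func makeEditingContext(from input: PHContentEditingInput) -> PHLivePhotoEditingContext? {
    guard input.livePhoto != nil else { return nil }
    return PHLivePhotoEditingContext(livePhotoEditingInput: input)
}
```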
Use the frameProcessor property to define a block to be used in processing the Live Photo's visual content. Photos calls this block repeatedly to process each frame of the Live Photo's video and still photo content.
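As a sketch, a frame processor that applies the same Core Image filter to every frame might look like this (the sepia filter and intensity value are illustrative choices, not part of the API):

```swift
import Photos
import CoreImage

// Apply a sepia tone to every frame, still and video alike.
func installFrameProcessor(on context: PHLivePhotoEditingContext) {
    context.frameProcessor = { frame, _ in
        frame.image.applyingFilter("CISepiaTone",
                                   parameters: [kCIInputIntensityKey: 0.8])
    }
}
```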
Create a PHContentEditingOutput object to store the results of your edit, then call the saveLivePhoto(to:options:completionHandler:) method to process the Live Photo and save it to your editing output object. This method applies your frameProcessor block to each frame.
To allow a user to continue working with the edit later (for example, to adjust the parameters of a filter), create a PHAdjustmentData object describing your changes, and store it in the adjustmentData property of your editing output.
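The save step and the adjustment-data step can be sketched together; the format identifier and version strings are placeholders for your own, and `filterSettings` stands in for whatever serialized description of the edit your app produces:

```swift
import Photos

// Save the processed Live Photo to an editing output, recording
// adjustment data so the edit can be resumed later.
func save(context: PHLivePhotoEditingContext,
          input: PHContentEditingInput,
          filterSettings: Data,
          completion: @escaping (PHContentEditingOutput?) -> Void) {
    let output = PHContentEditingOutput(contentEditingInput: input)
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.liveedit",
                                             formatVersion: "1.0",
                                             data: filterSettings)
    context.saveLivePhoto(to: output) { success, error in
        completion(success ? output : nil)
    }
}
```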
In an app using the Photos framework, use a photo library change block to commit the edit. (For details, see PHPhotoLibrary.) In the block, create a PHAssetChangeRequest object and set its contentEditingOutput property to the editing output that you created.
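A minimal sketch of that change block (the function name is illustrative):

```swift
import Photos

// Commit the edit to the user's photo library.
func commit(output: PHContentEditingOutput, to asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.contentEditingOutput = output
    }) { success, error in
        if !success { print("Cannot commit edit: \(String(describing: error))") }
    }
}
```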
When you use either of the methods listed in Processing an Editing Context's Live Photo, Photos calls your frameProcessor block repeatedly to process each frame of the Live Photo's video and still photo content. In that block, a PHLivePhotoFrame object provides the Live Photo's existing content as a CIImage object. Use Core Image to modify the image, then return a CIImage object representing the result of processing the input image.
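Because each PHLivePhotoFrame also reports its type (still photo or video frame) and time, a processor can vary its effect per frame. A sketch, with illustrative exposure values:

```swift
import Photos
import CoreImage

// A frame processor that brightens the still photo slightly more
// than the surrounding video frames.
let processor: PHLivePhotoFrameProcessingBlock = { frame, _ in
    let ev = (frame.type == .photo) ? 0.5 : 0.25
    return frame.image.applyingFilter("CIExposureAdjust",
                                      parameters: [kCIInputEVKey: ev])
}
```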