Processes and saves a full-quality Live Photo as the output of your editing session.
- iOS 10+
- macOS 10.12+
- Mac Catalyst 13.0+
- tvOS 10+
- output: The photo editing output to receive the rendered Live Photo, created from the same PHContentEditingInput object you used to begin this Live Photo editing context.
- options: Options that affect Live Photo rendering. See Live Photo Processing Options.
- handler: A block that Photos calls on the main queue after rendering is complete. The block takes the following parameters:
  - success: YES if rendering succeeds; otherwise NO.
  - error: If rendering succeeds, this parameter is nil. If rendering fails, this parameter contains an error object describing the failure.
Use this method when you have finished an editing session and need to provide rendered output in a
PHContentEditingOutput object. Unlike when rendering output for a photo or video asset, you don't need to provide rendered output using the
renderedContentURL property of the editing output object. Instead, create a
PHContentEditingOutput object using the
init(contentEditingInput:) initializer, passing the same
PHContentEditingInput object you used in the
PHLivePhotoEditingContext initializer to start this Live Photo editing context. Then pass that editing output object to this method, and Photos renders the Live Photo and provides it to the editing output.
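A minimal Swift sketch of this flow, assuming `input` is the PHContentEditingInput used to create the editing context and `context` is that PHLivePhotoEditingContext with its frame processor already configured (the adjustment-data format identifier below is a hypothetical example):

```swift
import Photos

// Renders the edited Live Photo into a content editing output.
// `input` must be the same PHContentEditingInput that started `context`.
func renderLivePhoto(context: PHLivePhotoEditingContext,
                     input: PHContentEditingInput,
                     completion: @escaping (PHContentEditingOutput?) -> Void) {
    // Create the output from the same input that began the editing context.
    let output = PHContentEditingOutput(contentEditingInput: input)

    // Describe the edit so it can be reverted or re-applied later.
    // "com.example.filter" is a hypothetical format identifier.
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.filter",
                                             formatVersion: "1.0",
                                             data: Data())

    // Photos renders the Live Photo into `output` and calls the
    // handler on the main queue when rendering finishes.
    context.saveLivePhoto(to: output) { success, error in
        if success {
            completion(output)
        } else {
            print("Rendering failed: \(String(describing: error))")
            completion(nil)
        }
    }
}
```

Because the handler runs on the main queue, it is safe to update UI or begin the photo library change request directly from the completion block.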
After this method’s completion handler signals successful rendering, you use the content editing output to complete the edit. In an app using the Photos framework, create a
PHAssetChangeRequest object inside a photo library
performChanges block, and set its
contentEditingOutput property to your editing output. In a photo editing extension running in the Photos app, your main view controller provides the content editing output when the Photos app calls its finishContentEditing(completionHandler:) method.
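For the app case, a hedged sketch of the commit, assuming `asset` is the PHAsset being edited and `output` is the rendered PHContentEditingOutput:

```swift
import Photos

// Commits a rendered editing output to the photo library.
// `asset` is the PHAsset being edited; `output` is the rendered
// PHContentEditingOutput from the Live Photo editing context.
func commitEdit(asset: PHAsset, output: PHContentEditingOutput) {
    PHPhotoLibrary.shared().performChanges({
        // The change request must be created inside the changes block.
        let request = PHAssetChangeRequest(for: asset)
        request.contentEditingOutput = output
    }) { success, error in
        if !success {
            print("Could not commit edit: \(String(describing: error))")
        }
    }
}
```

The photo library prompts the user for permission to modify the asset the first time your app commits an edit, so the completion handler should be prepared for a denial as well as a failure.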