Hey,
I have a camera app that captures a ProRAW photo and then runs a few Core Image filters before saving it to the device as a HEIC. However, I'm finding that capturing at 48MP is rather slow. Testing a minimal pipeline on an iPhone 16 Pro:
- Shutter press => photo received from the output: 1.2–1.6s
- CIRawFilter created from the photo's file data representation and rendered through a CIContext, with no filters applied: 0.8–1s
- Saving to device: ~0.15s
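For reference, here's roughly the pipeline I'm timing, trimmed down to the essentials (saveHEIC just stands in for my actual write step):

```swift
import AVFoundation
import CoreImage

final class RawCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    // Reuse one CIContext; creating a context per photo adds overhead.
    private let ciContext = CIContext()

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // ~1.2–1.6 s elapses between the shutter press and this callback.
        guard error == nil, let data = photo.fileDataRepresentation() else { return }

        // ~0.8–1 s: demosaic the ProRAW data; no extra filters applied here.
        guard let rawFilter = CIRawFilter(imageData: data, identifierHint: nil),
              let image = rawFilter.outputImage else { return }

        guard let p3 = CGColorSpace(name: CGColorSpace.displayP3),
              let heic = ciContext.heifRepresentation(of: image,
                                                      format: .RGBA8,
                                                      colorSpace: p3) else { return }

        // ~0.15 s: write the HEIC out (placeholder for my actual save step).
        saveHEIC(heic)
    }

    private func saveHEIC(_ data: Data) {
        // Placeholder: writes to a temporary file in this sketch.
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString + ".heic")
        try? data.write(to: url)
    }
}
```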
Is this the expected time for capture plus processing? The native Camera app seems to save its images within about half a second. I'm using QualityPrioritization.balanced and the highest resolution available, which is 48MP.
Would using CIRawFilter with the pixelBuffer from the photo output be faster? I tried it but couldn't get it to produce an image. Is there anything else I could try to speed this up? Is it possible to capture at 24MP instead?
Thanks, Alex
There is a newish API for deferred photo processing, which makes the whole capture feel much faster: you get a lightweight proxy back almost immediately, and the full-quality processing happens later in the background. That's probably what the Camera app is doing under the hood.
Check out this WWDC session for details.
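In rough terms you opt in on the photo output and then hand the proxy off to the photo library, which swaps in the finished image once processing completes. A minimal sketch (iOS 17+, and assuming photo library authorization is already handled elsewhere):

```swift
import AVFoundation
import Photos

// Opt in during session configuration; only the deferred-delivery
// pieces are shown here, your session setup will differ.
func configureDeferredDelivery(for photoOutput: AVCapturePhotoOutput) {
    if photoOutput.isAutoDeferredPhotoDeliverySupported {
        photoOutput.isAutoDeferredPhotoDeliveryEnabled = true
    }
}

final class DeferredCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    // Called quickly with a lightweight proxy; the system finishes the
    // full-quality processing later in the background.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?,
                     error: Error?) {
        guard error == nil,
              let proxyData = deferredPhotoProxy?.fileDataRepresentation() else { return }

        // Add the proxy to the library; the finished image replaces it
        // automatically once deferred processing completes.
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: .photoProxy, data: proxyData, options: nil)
        }, completionHandler: nil)
    }
}
```

Note that with deferred delivery you can't run your own Core Image filters at capture time on the final image; you'd apply them later, once the processed asset is available.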