I have built a camera application that uses an AVCaptureSession with the AVCaptureDevice set to .builtInDualWideCamera and isVirtualDeviceConstituentPhotoDeliveryEnabled = true, to enable delivery of "simultaneous" photos (one AVCapturePhoto per constituent camera) for a single capture request.
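For reference, the configuration looks roughly like the sketch below (minimal error handling, and the exact support conditions may vary by device and session state; as far as I can tell the supported flag only becomes true once the virtual-device input is attached to the session):

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .hd1920x1080

// Discover the dual-wide virtual device (wide + ultra-wide constituents).
guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                           for: .video,
                                           position: .back),
      let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input) else {
    fatalError("Dual-wide camera unavailable on this device")
}
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
guard session.canAddOutput(photoOutput) else {
    fatalError("Cannot add photo output")
}
session.addOutput(photoOutput)

// Constituent photo delivery can only be enabled after the virtual-device
// input and the photo output are both attached, and only if supported.
if photoOutput.isVirtualDeviceConstituentPhotoDeliverySupported {
    photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled = true
}

session.commitConfiguration()

// Per capture request, ask for one photo from each physical camera.
let settings = AVCapturePhotoSettings()
settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices =
    device.constituentDevices
// photoOutput.capturePhoto(with: settings, delegate: myDelegate)
```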
Ideally, our app would keep the timestamp difference between the photos in a single capture request as short as possible, but we don't have a good idea of the theoretical or practical limits of this difference.
In my testing on an iPhone 12 Pro, with a frame rate of 33 Hz and the session preset set to hd1920x1080, I get a timestamp difference between photos of approximately 0.3 ms, which is smaller than I would expect unless the frames are being synchronised incredibly well under the hood.
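The measurement itself comes from diffing AVCapturePhoto.timestamp across the two delegate callbacks; a minimal sketch of how that could look (the delegate class name and printed format are illustrative, and this assumes exactly two constituent photos per request):

```swift
import AVFoundation

// One didFinishProcessingPhoto callback fires per constituent camera;
// collect both timestamps (CMTime values) and report the skew.
final class PairTimestampDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    private var timestamps: [CMTime] = []

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil else { return }
        timestamps.append(photo.timestamp)
        if timestamps.count == 2 {
            let delta = CMTimeSubtract(timestamps[1], timestamps[0])
            let skewMs = abs(CMTimeGetSeconds(delta)) * 1000.0
            print("Inter-photo timestamp skew: \(skewMs) ms")
        }
    }
}
```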
This leaves the following unanswered questions:
- What sort of ranges of values should we expect to come out of these timestamp differences between photos?
- What factors influence this?
- Is there any way to control these values to ensure they are as small as possible? (This likely follows from the answer to the previous question.)
Hi @nanders, this is only tangentially related: would you be willing to detail the configuration you used to get both images from the .builtInDualWideCamera? I've been playing around with doing the same thing but haven't been able to find any setup where isVirtualDeviceConstituentPhotoDeliverySupported is true, and I don't see much documentation out there on this topic. Are there specific settings you changed to make this possible? Thanks!