This issue affects camera session behavior and UI integration.
I would like to request improved support or clarification regarding the simultaneous use of ARFaceTrackingConfiguration and AVCaptureMultiCamSession.
Currently, when attempting to use both:
Front camera (TrueDepth) for gaze tracking using ARFaceTrackingConfiguration
Rear camera for live preview using AVCaptureMultiCamSession
ARKit face tracking stops updating, or the application becomes unstable (e.g., the camera preview turns white, or the app crashes).
Steps to Reproduce:
Start ARSession using ARFaceTrackingConfiguration (front camera)
Start AVCaptureMultiCamSession using rear camera
Overlay both outputs in a single UI
Observe that ARKit tracking stops updating or the camera preview becomes invalid (a minimal sketch of this setup follows below)
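For reference, here is a minimal sketch of the reproduction path, assuming the ARSession is driven directly (rather than through an ARSCNView) and that the rear wide-angle camera is the only multi-cam input. The class name and structure are illustrative; only the framework APIs (ARFaceTrackingConfiguration, AVCaptureMultiCamSession, etc.) come from the report itself.

```swift
import ARKit
import AVFoundation

final class DualCameraController: NSObject, ARSessionDelegate {
    // Front camera (TrueDepth): ARKit face tracking.
    let arSession = ARSession()
    // Rear camera: live preview via a multi-cam capture session.
    let captureSession = AVCaptureMultiCamSession()

    func start() {
        // 1. Start ARKit face tracking on the front (TrueDepth) camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        arSession.delegate = self
        arSession.run(ARFaceTrackingConfiguration())

        // 2. Start a multi-cam session with the rear wide-angle camera.
        guard AVCaptureMultiCamSession.isMultiCamSupported,
              let rearCamera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                       for: .video,
                                                       position: .back),
              let rearInput = try? AVCaptureDeviceInput(device: rearCamera),
              captureSession.canAddInput(rearInput) else { return }
        captureSession.addInput(rearInput)

        let rearOutput = AVCaptureVideoDataOutput()
        if captureSession.canAddOutput(rearOutput) {
            captureSession.addOutput(rearOutput)
        }
        captureSession.startRunning()

        // 3. Overlay both outputs in a single UI
        //    (e.g., an ARSCNView plus an AVCaptureVideoPreviewLayer).
        // 4. Observe: face anchor updates stop, or the preview turns white / the app crashes.
    }

    // ARKit face tracking callback; in the reported failure it stops firing
    // once the capture session is running.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        print("Face tracked:", face.isTracked)
    }
}
```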
Expected Result:
ARKit face tracking continues updating while the rear camera is active.
Actual Result:
ARKit tracking stops updating, and the camera output becomes unstable or the application crashes.
Use Case:
This functionality is important for accessibility and educational applications.
For example, a user can control the UI via gaze input (front camera) while observing real-world objects through the rear camera.
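To illustrate the gaze-input half of this use case, here is a hypothetical delegate fragment that assumes ARFaceAnchor.lookAtPoint as the raw gaze signal; mapping that vector to screen coordinates (camera transform and projection) is omitted.

```swift
import ARKit

// Hypothetical ARSessionDelegate fragment: read a rough gaze signal from the face anchor.
// lookAtPoint is expressed in the face anchor's coordinate space; converting it to a
// point on screen would additionally require the current camera transform and projection.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors where faceAnchor.isTracked {
        let gaze = faceAnchor.lookAtPoint   // simd_float3 in face coordinate space
        print("Gaze direction (face space):", gaze)
    }
}
```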
Request:
Support simultaneous use of ARFaceTrackingConfiguration and AVCaptureMultiCamSession, or
Improve resource sharing between the TrueDepth and rear cameras, or
Provide clear documentation about current limitations
This feature would significantly enhance accessibility applications on iPad.
Attachment:
A photo taken on a physical iPad is attached showing the issue.
In the image, the camera preview has turned white while the application is running, indicating unstable behavior when ARKit face tracking and rear-camera capture are active at the same time.
Topic: Media Technologies
SubTopic: Photos & Camera