How do we make video calls with the front camera while running an augmented reality session with the rear camera?

We are developing an augmented reality (AR) game for iOS. In the game scene, we need to use both the iPhone's rear camera and its front camera at the same time: the rear camera drives the augmented reality experience, while the front camera handles video calls. In our implementation, we use the rear camera as the AR camera: we create an ARWorldTrackingConfiguration, run it on an ARSession, and display the results with an ARSCNView. Augmented reality with the rear camera works. The problem is that we cannot get the video stream from the front camera at the same time.
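For reference, the rear-camera AR setup described above typically looks like the following. This is a minimal sketch, not our exact code; view sizing and lifecycle handling are omitted.

```swift
import ARKit
import UIKit

// Minimal sketch: an ARSCNView driven by ARWorldTrackingConfiguration,
// which uses the rear (world-facing) camera.
let sceneView = ARSCNView(frame: UIScreen.main.bounds)
let configuration = ARWorldTrackingConfiguration()

// Running the session takes exclusive ownership of the rear camera.
sceneView.session.run(configuration)
```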

The question is: how do we make video calls with the front camera while running an augmented reality session with the rear camera? We would like implementation details.

You cannot access both the front and rear camera streams during an ARSession. You can access both in an AVCaptureMultiCamSession, but then you cannot use ARKit for your AR, so you would need to provide your own AR implementation, which is very complex and not something we can assist you with.
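To illustrate the AVCaptureMultiCamSession route mentioned above, here is a hedged sketch of configuring simultaneous front and rear capture. It assumes a multi-cam-capable device (iPhone XS or later, iOS 13+); error handling and format selection are simplified, and production code usually pairs inputs and outputs explicitly with `addInputWithNoConnections(_:)` and `AVCaptureConnection` rather than relying on automatic connection forming.

```swift
import AVFoundation

// Sketch: build a session that streams from both cameras at once.
// Returns nil if the device does not support multi-cam capture.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // Rear (world-facing) camera input.
    guard let back = AVCaptureDevice.default(.builtInWideAngleCamera,
                                             for: .video, position: .back),
          let backInput = try? AVCaptureDeviceInput(device: back),
          session.canAddInput(backInput) else { return nil }
    session.addInput(backInput)

    // Front (user-facing) camera input, e.g. for the video-call stream.
    guard let front = AVCaptureDevice.default(.builtInWideAngleCamera,
                                              for: .video, position: .front),
          let frontInput = try? AVCaptureDeviceInput(device: front),
          session.canAddInput(frontInput) else { return nil }
    session.addInput(frontInput)

    // One video data output per stream; attach sample buffer delegates
    // to these to receive frames from each camera.
    let backOutput = AVCaptureVideoDataOutput()
    let frontOutput = AVCaptureVideoDataOutput()
    guard session.canAddOutput(backOutput),
          session.canAddOutput(frontOutput) else { return nil }
    session.addOutput(backOutput)
    session.addOutput(frontOutput)

    return session
}
```

Note that choosing this path means rendering and tracking the AR content yourself, since ARKit cannot consume frames from a session it does not own.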

If you'd like access to both the front and rear camera streams during an ARSession, please file an enhancement request using Feedback Assistant.
