Hey,
I don't have an iPhone X yet, so I am trying to understand whether what I want to do is possible.
In my app, I already use the rear camera to display AR content. Now, I would also like to capture data about the user's facial expressions (without showing the front camera output). Have I explained myself clearly enough? Would that be possible? It sounds non-trivial to run sessions with both cameras at the same time.
To sum up:
I need a session for the rear camera, whose output is displayed on screen.
I also need a session for the front camera; I will extract the relevant facial expression info from it, but I won't display the front camera output on screen. Something like the sketch below is what I have in mind.
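I'm not even sure ARKit lets you run a face-tracking session alongside a world-tracking one on different cameras, which is exactly what I'm asking, but here is a minimal sketch of the idea. The class and property names are just placeholders; the face session runs without any view attached, and I only read its blend shapes.

```swift
import UIKit
import ARKit

// Sketch only: assumes two ARSessions can run at once, which I haven't verified.
final class DualCameraViewController: UIViewController, ARSessionDelegate {

    // Rear camera: world tracking, rendered on screen via an ARSCNView.
    private let sceneView = ARSCNView()

    // Front camera: face tracking, no view attached, data only.
    private let faceSession = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        faceSession.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Rear camera session drives the visible AR content.
        sceneView.session.run(ARWorldTrackingConfiguration())

        // Front camera session runs "headless": I only consume its anchors.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        faceSession.run(ARFaceTrackingConfiguration())
    }

    // Pull expression data (blend shapes) from the face anchor updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            print("smile: \(smile), jawOpen: \(jawOpen)")
        }
    }
}
```

Is a setup like this possible, or will starting the second session interrupt the first one?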
Thanks 🙂