Get Ready for ARKit 3

ARKit 3 goes further than ever before, naturally showing AR content in front of or behind people using People Occlusion, tracking up to three faces at a time, supporting collaborative sessions, and more. And now, you can take advantage of ARKit’s new awareness of people to integrate human movement into your app.

People Occlusion

Now AR content realistically passes behind and in front of people in the real world, making AR experiences more immersive while also enabling green screen-style effects in almost any environment.
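Enabling this behavior is a matter of opting in on the session configuration. A minimal sketch, assuming an `ARSession` owned by your AR view elsewhere in the app:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// People Occlusion requires hardware support (A12 chip or later),
// so check before opting in.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Segment people in the camera feed and estimate their depth, so
    // virtual content passes realistically in front of or behind them.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// session.run(configuration)  // `session` is your AR view's session (assumed).
```

Using `.personSegmentation` instead of `.personSegmentationWithDepth` composites people in front of all virtual content without depth ordering, which is the green screen-style effect mentioned above.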

Motion Capture

Capture the motion of a person in real time with a single camera. By understanding body position and movement as a series of joints and bones, you can use motion and poses as an input to the AR experience — placing people at the center of AR.
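In code, this means running a body-tracking configuration and reading joint transforms off the resulting body anchor. A minimal sketch, with delegate wiring assumed to happen in your app setup:

```swift
import ARKit

final class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // The skeleton models the tracked person as joints and bones.
            let skeleton = bodyAnchor.skeleton
            if let headTransform = skeleton.modelTransform(for: .head) {
                // Joint transforms are relative to the body anchor's origin.
                print("Head position:", headTransform.columns.3)
            }
        }
    }
}
```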

Simultaneous Front and Back Camera

Now you can simultaneously use face and world tracking on the front and back cameras, opening up new possibilities. For example, users can interact with AR content in the back camera view using just their face.
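A minimal sketch of the opt-in, assuming a session owned by your AR view: face anchors from the front TrueDepth camera are then delivered alongside world-tracking updates from the back camera.

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Simultaneous face and world tracking is hardware-dependent, so check first.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    // The back camera drives world tracking while the front camera
    // delivers ARFaceAnchor updates for the user's face.
    configuration.userFaceTrackingEnabled = true
}

// session.run(configuration)  // `session` is your AR view's session (assumed).
```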

Multiple Face Tracking

Now ARKit Face Tracking tracks up to three faces at once, using the TrueDepth camera on iPhone X, iPhone XS, iPhone XS Max, iPhone XR, and iPad Pro to power front-facing camera experiences like Memoji and Snapchat.
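Tracking multiple faces is an explicit setting on the face-tracking configuration. A minimal sketch:

```swift
import ARKit

let configuration = ARFaceTrackingConfiguration()

// supportedNumberOfTrackedFaces reports the device's limit
// (up to three on ARKit 3); each tracked face arrives as its
// own ARFaceAnchor in the session's anchor updates.
configuration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces

// session.run(configuration)  // `session` is your AR view's session (assumed).
```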

Collaborative Sessions

With live collaborative sessions between multiple people, you can build a shared world map, making it faster for you to develop multiuser experiences and easier for users to join them, as in multiplayer games.
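Collaboration works by relaying data that each peer's session emits over a transport of your choice. A minimal sketch, with the networking layer (e.g. MultipeerConnectivity) assumed; `sendToPeers` is a hypothetical call into that layer:

```swift
import ARKit

final class CollaborationHandler: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.isCollaborationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit periodically emits collaboration data; archive it and
    // broadcast it to every peer in the session.
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        if let payload = try? NSKeyedArchiver.archivedData(
                withRootObject: data, requiringSecureCoding: true) {
            // sendToPeers(payload)  // hypothetical transport call
            _ = payload
        }
    }

    // When data arrives from a peer, feed it into the local session
    // so both sessions converge on a shared world map.
    func receive(_ payload: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
                ofClass: ARSession.CollaborationData.self, from: payload) {
            session.update(with: data)
        }
    }
}
```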

Additional Improvements

Detect up to 100 images at a time, and get an automatic estimate of the physical size of each image. 3D-object detection is more robust, with objects better recognized in complex environments. And plane detection now uses machine learning to find planes in the environment even faster.
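These improvements are exposed as configuration options. A minimal sketch, where "AR Resources" is an assumed asset-catalog group containing your reference images:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) {
    configuration.detectionImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 4  // tune to your needs
    // Let ARKit refine each image's physical size from observation
    // instead of relying solely on the size set in the asset catalog.
    configuration.automaticImageScaleEstimationEnabled = true
}

// Plane detection benefits from the ML-based detector automatically.
configuration.planeDetection = [.horizontal, .vertical]

// session.run(configuration)  // `session` is your AR view's session (assumed).
```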