Introducing ARKit 4
Depth API
The Depth API uses the advanced scene understanding capabilities built into the LiDAR Scanner to gather per-pixel depth information about the surrounding environment. When combined with the 3D mesh data generated by Scene Geometry, this depth information makes virtual object occlusion even more realistic by enabling instant placement of virtual objects and blending them seamlessly with their physical surroundings. This can drive new capabilities within your apps, like taking more precise measurements and applying effects to a user’s environment.
The Depth API is specific to devices equipped with the LiDAR Scanner (iPad Pro 11-inch (2nd generation), iPad Pro 12.9-inch (4th generation), iPhone 12 Pro, iPhone 12 Pro Max).
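A minimal sketch of opting into scene depth, assuming a LiDAR-equipped device; the `DepthViewController` class name is hypothetical, while the ARKit types and properties shown (`frameSemantics`, `sceneReconstruction`, `sceneDepth`) are the real API:

```swift
import ARKit

// Hypothetical controller showing how scene depth is enabled and read.
final class DepthViewController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Per-pixel depth requires the LiDAR Scanner; guard on support first.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        configuration.frameSemantics.insert(.sceneDepth)
        // Scene Geometry mesh reconstruction, which pairs with depth for occlusion.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // sceneDepth carries a CVPixelBuffer of per-pixel depth values in meters.
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        _ = depthMap // feed into occlusion, measurement, or environment effects
    }
}
```

On unsupported devices the `supportsFrameSemantics(.sceneDepth)` check fails, so the guard keeps the session from being configured with semantics the hardware cannot provide.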
Location Anchoring
Place AR experiences at fixed geographic locations, such as throughout cities or alongside famous landmarks. Location Anchoring lets you anchor your AR creations at specific latitude, longitude, and altitude coordinates. Users can move around virtual objects and see them from different perspectives, exactly as real objects are seen through a camera lens.
Requires iPhone XS, iPhone XS Max, iPhone XR, or later. Available in select cities.
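A minimal sketch of placing a geographic anchor, assuming geotracking is available at the user's location; the sample coordinate is illustrative, while `ARGeoTrackingConfiguration`, `checkAvailability`, and `ARGeoAnchor` are the real API:

```swift
import ARKit
import CoreLocation

// Sketch: anchor AR content at a real-world coordinate with geotracking.
func startGeoTracking(session: ARSession) {
    // Geotracking is limited to select cities; always check availability first.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return } // unsupported device or location
        session.run(ARGeoTrackingConfiguration())

        // Illustrative coordinate (Apple Park). When altitude is omitted,
        // ARKit resolves it from its own elevation data.
        let coordinate = CLLocationCoordinate2D(latitude: 37.3349,
                                                longitude: -122.0090)
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: anchor)
    }
}
```

Once the anchor is tracked, content attached to it stays fixed to that spot in the world rather than to the device's starting position.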
Expanded Face Tracking Support
Support for Face Tracking extends to the front-facing camera on any device with the A12 Bionic chip or later, including iPhone SE, so even more users can delight in AR experiences using the front-facing camera. Track up to three faces at once using the TrueDepth camera to power front-facing camera experiences like Memoji and Snapchat.
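A minimal sketch of configuring multi-face tracking; the function name is hypothetical, while `ARFaceTrackingConfiguration` and its face-count properties are the real API:

```swift
import ARKit

// Sketch: track as many faces as the current device supports (up to three).
func startFaceTracking(session: ARSession) {
    // Face tracking requires supported front-camera hardware.
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    // Ask ARKit for the device's limit rather than hard-coding a count.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```

Querying `supportedNumberOfTrackedFaces` keeps the same code working across devices, since the maximum of three faces applies only to hardware with the TrueDepth camera.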