- iOS 11.3+
- Xcode 10.0+
This sample app uses SceneKit’s node-based audio API to associate environmental sounds with a virtual object that’s placed in the real world. Because audio is 3D positional in SceneKit by default, volume is automatically mixed based on the user’s distance from a node.
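The node-based association described above can be sketched as follows. This is a minimal illustration, not the sample's exact code: the asset name `chime.aiff` and the function name `attachAmbientAudio(to:)` are placeholders.

```swift
import SceneKit

// Attach a looping, positional audio source to a SceneKit node.
// Because the source is positional, SceneKit attenuates its volume
// automatically as the user moves away from the node.
func attachAmbientAudio(to node: SCNNode) {
    // "chime.aiff" is a placeholder asset name; use a mono file so
    // SceneKit spatializes it (stereo sources are not positional).
    guard let source = SCNAudioSource(fileNamed: "chime.aiff") else { return }
    source.loops = true
    source.isPositional = true
    source.shouldStream = false
    source.load()                                  // decode up front for low-latency start
    node.addAudioPlayer(SCNAudioPlayer(source: source))
}
```

Calling `node.removeAllAudioPlayers()` stops playback when the node is removed from the scene.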
- This sample code supports relocalization and therefore requires ARKit 1.5 (iOS 11.3) or later.
- ARKit is not available in the iOS Simulator.
- Building the sample requires Xcode 10.0 or later.
Run an AR Session and Place Virtual Content
Before you can use audio, you need to set up a session and place the object from which to play sound. For simplicity, this sample runs a world tracking configuration and places a virtual object on the first horizontal plane that it detects. For more detail about this kind of session setup, see Tracking and Visualizing Planes. The object placement approach in this sample is similar to the one demonstrated in Placing Objects and Handling 3D Interaction.
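The session setup and placement flow described above can be sketched as follows. This is a hedged outline under stated assumptions: the view-controller shape, the `placedObject` flag, and the sphere geometry standing in for the sample's virtual object are illustrative, not the sample's actual code.

```swift
import ARKit
import SceneKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    private var placedObject = false    // place content only once

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Run a world-tracking session that looks for horizontal planes.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds a node for a newly detected anchor.
    // Place the virtual object on the first horizontal plane found.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard !placedObject, anchor is ARPlaneAnchor else { return }
        placedObject = true
        // A simple sphere stands in for the sample's virtual object here.
        let objectNode = SCNNode(geometry: SCNSphere(radius: 0.1))
        node.addChildNode(objectNode)
    }
}
```

The delegate callback runs on SceneKit's rendering queue, so any further UI work triggered from it should be dispatched to the main queue.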