Framework

PHASE

Create dynamic audio experiences in your game or app that react to events and cues in the environment.

Overview

Use PHASE (Physical Audio Spatialization Engine) to provide complex, dynamic audio experiences in your games and apps. With PHASE, you can control sound layers and adjust audio parameters in real time. As you develop your app, dynamic integration with your app’s visual scene enables audio to react to logic and visual changes automatically. The framework supports various audio hardware, which enables your app to provide a consistent spatial audio experience across platforms and output devices like headphones and speakers.
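Every PHASE app begins by creating and starting an engine object. A minimal sketch, assuming an Apple platform that provides the PHASE framework; `.automatic` lets the engine advance its audio simulation on its own internal clock rather than requiring per-frame update calls:

```swift
import PHASE

// Create the engine. The .automatic update mode tells PHASE to
// advance its simulation on an internal timer.
let engine = PHASEEngine(updateMode: .automatic)

do {
    // Starting the engine acquires audio hardware and begins rendering.
    try engine.start()
} catch {
    // Handle failure to start, such as unavailable audio hardware.
    print("PHASE engine failed to start: \(error)")
}
```

With the engine running, you register sound assets, attach sources and listeners to the engine's scene hierarchy, and play sound events against the current scene state.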

Illustration of in-game scenes that demonstrate PHASE features. At left, a dragon inside a polygon, labeled "Volumetric sound source," emits a sound wave toward a hero; a tree structure extending from the dragon, labeled "Sound event hierarchy," highlights the path from its root to the leaf node that represents the emitted sound. At right, a fireball collides with a rock, and sound waves radiate outward everywhere except behind the rock, labeled "Geometric sound occlusion."

Integrate Audio with Visual Simulation

Apps and games that model a detailed environment involve substantial revision during development. When you provide PHASE with a basic understanding of your app’s scene, audio plays in accordance with the scene’s characteristics. As you modify the scene, such as by adding a game level, the audio follows along by accommodating the level’s visual shape and properties. PHASE couples sound with visuals and minimizes your app’s audio maintenance by:

  • Accepting scene geometry and reducing the volume of obstructed, sound-emitting scene objects. For example, PHASE lowers the volume of an incoming fireball when the player takes cover behind a wall.

  • Offering complex sound events that play in reaction to your app’s runtime state.

  • Adding sound effects that emanate from a shape. When you provide the shape of a scene object to PHASE, the sound’s volume scales based on the player’s distance and orientation relative to the shape.

  • Adding reverberation and timed audio reflection to create environmental effects and simulate indoor scenes.
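The occlusion and volumetric-source behaviors above come from giving PHASE the shapes in your scene. A minimal sketch, assuming a running `PHASEEngine` named `engine`; the box mesh stands in for whatever geometry your scene actually uses:

```swift
import PHASE
import ModelIO
import simd

// Build a simple box mesh for a wall. Any MDLMesh can serve as
// PHASE geometry; real apps typically reuse their render meshes.
let wallMesh = MDLMesh(boxWithExtent: [4, 3, 0.2],
                       segments: [1, 1, 1],
                       inwardNormals: false,
                       geometryType: .triangles,
                       allocator: nil)
let wallShape = PHASEShape(engine: engine, mesh: wallMesh)

// An occluder attenuates sound that passes through its geometry,
// such as a fireball heard from behind a wall.
let wall = PHASEOccluder(engine: engine, shapes: [wallShape])
wall.transform = matrix_identity_float4x4
try engine.rootObject.addChild(wall)

// A volumetric source emits sound from an entire shape rather than
// a single point, so volume scales with the listener's distance
// and orientation relative to the shape.
let dragonShape = PHASEShape(engine: engine, mesh: wallMesh)
let dragonSource = PHASESource(engine: engine, shapes: [dragonShape])
try engine.rootObject.addChild(dragonSource)
```

Because the occluder and source live in the engine's scene hierarchy, updating their `transform` each frame keeps the audio simulation aligned with your visuals.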

Topics

  • Initialize an engine object and prepare your app’s audio data for playback.

  • Lay out objects in 3D space to play PHASE audio at runtime that’s consistent with your app’s visual scene.

  • Create a hierarchy of nodes that tailors playback based on your app’s current state.

  • Choose among the various ways to play sound depending on your app’s unique audio-playback needs.

  • Apply mathematical functions or custom logic to change the properties of in-flight audio or the conditions under which audio plays.

  • Change the characteristics of a group of sounds, such as altering their volume when your app transitions to a menu.
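The topics above come together when you play a sound event. A minimal sketch, assuming a running `PHASEEngine` named `engine` with a `PHASESource` (`source`) and `PHASEListener` (`listener`) already attached to the scene; the file name and identifiers are placeholders:

```swift
import PHASE

// Register the audio file as a resident (fully in-memory) asset.
let url = URL(fileURLWithPath: "fireball.wav")
_ = try engine.assetRegistry.registerSoundAsset(
    url: url, identifier: "fireball-audio", assetType: .resident,
    channelLayout: nil, normalizationMode: .dynamic)

// Route a sampler node through a spatial mixer so distance and
// orientation between source and listener shape the output.
let spatialPipeline = PHASESpatialPipeline(flags: [.directPathTransmission])!
let mixer = PHASESpatialMixerDefinition(spatialPipeline: spatialPipeline)
let sampler = PHASESamplerNodeDefinition(
    soundAssetIdentifier: "fireball-audio", mixerDefinition: mixer)
_ = try engine.assetRegistry.registerSoundEventAsset(
    rootNode: sampler, identifier: "fireball-event")

// Bind the event to a concrete source and listener, then play it.
let params = PHASEMixerParameters()
params.addSpatialMixerParameters(
    identifier: mixer.identifier, source: source, listener: listener)
let event = try PHASESoundEvent(
    engine: engine, assetIdentifier: "fireball-event",
    mixerParameters: params)
event.start()
```

The sampler node here is a single leaf; in practice you compose blend, switch, and random nodes into the sound event hierarchy so playback tailors itself to your app's runtime state.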