Sound Integration In Unity


While there are fantastic middleware tools out there such as Wwise and FMOD, Unity itself lets you do quite a bit with your sound files. For example, you can adjust distance attenuation and pitch (which in Unity is linked to a time stretch), and automate low-pass filters. So for those who prefer not to deal with additional software, Unity has basic sound integration well covered. In fact, for Unity 5, they plan to have FMOD fully integrated into the Unity editor, which will provide even more in-the-box audio control.
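
For instance, a touch of random pitch variation (which also stretches or compresses the sound in time) is an easy way to keep a repeated sound from feeling mechanical. Here's a minimal sketch of that idea as a Unity C# script; the class name, clip, and values are placeholders for illustration, not something from a particular project:

    using UnityEngine;

    // Hypothetical example: vary the pitch slightly on each footstep so the
    // same clip doesn't sound identical every time it plays.
    [RequireComponent(typeof(AudioSource))]
    public class FootstepPlayer : MonoBehaviour
    {
        public void PlayFootstep()
        {
            AudioSource source = GetComponent<AudioSource>();
            // In Unity, pitch is tied to playback speed, so raising it also shortens the sound.
            source.pitch = Random.Range(0.95f, 1.05f);
            source.Play();
        }
    }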

If your game is built in Unity, there are a few components you will need to be familiar with, even if you’re using a sophisticated sound engine like Wwise. First, each scene needs exactly one Audio Listener, which you can think of as the player’s ears. The Listener is generally placed on the camera or a first-person controller. Audio effects can be applied directly to the Listener in cases where you want to affect all the audible sounds in the scene equally.
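
As a quick, hypothetical illustration of that last point (the audio components are real Unity components, but the script and the cutoff value are just a sketch):

    using UnityEngine;

    // Sketch: muffle everything the player hears by putting a low-pass filter
    // on the same GameObject as the Audio Listener (usually the main camera).
    public class MuffleEverything : MonoBehaviour
    {
        void Start()
        {
            AudioListener listener = Camera.main.GetComponent<AudioListener>();

            // A filter on the Listener's GameObject affects every audible sound in the scene.
            AudioLowPassFilter lowPass = listener.gameObject.AddComponent<AudioLowPassFilter>();
            lowPass.cutoffFrequency = 800f; // in Hz; lower values sound more muffled
        }
    }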

From there, you create Audio Source objects to play back your sound files. This is how you place your sounds in the scene or attach them to another game object, like a moving enemy. The Audio Source is also where you set the options that control how that particular sound plays back.
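
If you'd rather wire that up from code than in the Inspector, a rough sketch might look like this (the clip and the distance values are made-up placeholders):

    using UnityEngine;

    // Sketch: give a patrolling enemy a looping growl that follows it around
    // the scene and fades with distance from the Listener.
    public class EnemyGrowl : MonoBehaviour
    {
        public AudioClip growlClip; // assign in the Inspector

        void Start()
        {
            AudioSource source = gameObject.AddComponent<AudioSource>();
            source.clip = growlClip;
            source.loop = true;
            source.rolloffMode = AudioRolloffMode.Logarithmic; // how volume falls off with distance
            source.minDistance = 2f;   // full volume inside this radius
            source.maxDistance = 25f;  // attenuation levels off beyond this radius
            source.Play();
        }
    }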

One of the coolest and most effective things you can do in Unity to affect your sounds is to add Reverb Zones. These objects can be placed in different parts of the scene, like in different rooms of a building. You can then choose a reverb effect that will be applied to all sounds that occur inside that Reverb Zone, such as the player’s footsteps.

Reverb Zone showing the Min and Max distances
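
Reverb Zones are normally placed and tuned right in the editor, but for the scripting-inclined, a minimal sketch of the same idea might look like this (the room and preset are assumptions for illustration):

    using UnityEngine;

    // Sketch: drop this on an empty GameObject positioned in the bathroom so
    // everything heard in that room picks up a small, tiled, reflective reverb.
    public class BathroomAcoustics : MonoBehaviour
    {
        void Start()
        {
            AudioReverbZone zone = gameObject.AddComponent<AudioReverbZone>();
            zone.reverbPreset = AudioReverbPreset.Bathroom;
        }
    }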

You can use reverb to emulate a particular acoustic quality or space, such as a large reflective (echoey) room like a cathedral. This allows you to have contrasting ambient colors across rooms or spaces of differing sizes and materials in your game. For example, the carpeted bedroom could have little or no reverb, while the tile bathroom next door has strong reflections with a short delay between them (because it’s a small space with highly reflective walls and surfaces). Meanwhile, the large marble foyer of the mansion can have even more reflections, with a longer delay between them (because it’s not only a large space, but one with highly reflective marble surfaces). This way an identical yell sound, for instance, will sound very different depending on which room it occurs in, heightening the sense of realism and immersion for the player without needing to create any additional sound assets.

We don’t exactly have any yelling sounds in any of our games, nor any mansions. But reverb can also be used to simulate the acoustics of a forest or a canyon. You can also set separate Min and Max distances on a Reverb Zone: Max is the radius at which sounds begin to have the reverb effect applied, and Min is the radius within which the effect is applied to its full extent. Unity has a number of presets to choose from, such as Cave, Auditorium, and Living Room, but you can also create your own reverb effect by adjusting the various parameters in the Inspector.
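
Continuing the earlier sketch, the same radii and presets can also be set from script; the numbers here are placeholders rather than recommendations:

    using UnityEngine;

    // Sketch: a cave-like reverb that fades in as sounds occur closer to this GameObject.
    public class CaveAcoustics : MonoBehaviour
    {
        void Start()
        {
            AudioReverbZone zone = gameObject.AddComponent<AudioReverbZone>();
            zone.reverbPreset = AudioReverbPreset.Cave;
            zone.maxDistance = 20f; // the effect starts blending in at this radius
            zone.minDistance = 8f;  // the effect is fully applied inside this radius

            // For a custom effect, switch to the User preset and adjust fields
            // such as decayTime and reverbDelay directly, e.g.:
            // zone.reverbPreset = AudioReverbPreset.User;
            // zone.decayTime = 3.5f;
        }
    }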

All of these things can be done without a lick of programming. However, you can gain more sophisticated control over your sound by creating and adding scripts to these different audio objects in your scene. More on that in the future.

-Eric

 
