
AES Section Meeting Reports

Los Angeles - April 29, 2014

Meeting Topic:

Moderator Name:

Speaker Name:

Meeting Location:

Summary

On Tuesday, April 29, 2014, the Los Angeles section of the Audio Engineering Society was visited by Jory Prum, owner and founder of Studio Jory, a Marin County-based recording and audio mixing studio. For the last fifteen years, Mr. Prum's primary work has been sound design for video games, and his talk detailed the particular challenges a sound designer and audio engineer faces when creating content for this medium. Mr. Prum began his career in the Los Angeles area at Jim Henson's Creature Shop, followed by a stint of approximately two years at Disney. In 1999 he moved to the Bay Area to take a position at LucasArts, where he worked on numerous video game titles, including some from the Star Wars franchise. After a few years, he decided to build Studio Jory in Fairfax, where he has done the sound design for such games as Knights of the Old Republic, Telltale Games' The Walking Dead and Tales of Monkey Island episodic games, and EA's The Sims 2.
Mr. Prum began the presentation by asking the audience about their own backgrounds; as it turned out, only a few had any experience working with video games. He explained that one of the main differences between film and video work and video games is what defines the "work unit": "If you're working in film your work unit would generally be the reel, right? You have five double reels or ten single-sized reels and you've got a ProTools session for every reel and maybe you split that up a little further with your foley in one session and your sound design in another and music in another. That ProTools session is keeping track of everything for you." In contrast, in video games, the work unit is the "asset." He continued, "Assets could be things like sound effects. A line of dialogue would be an asset. A music cue could be an asset."
The main reason for this different definition of the "work unit" is that creating audio for video and computer games is quite different from creating the mix for a television program or film. When mixing a piece of filmed entertainment, the engineer is ultimately creating a fixed product, one that will be the same every time it is played and viewed. Different versions are often created to serve foreign markets, most notably foreign-language dubs, but the sound effects and music are usually untouched even when the dialogue tracks are replaced wholesale. The viewer has no control over the audio experience, aside from adjusting the overall volume or perhaps buying a better pair of speakers. In contrast, video games present a dynamic environment. The audio must be able to adapt to video scenes that are constantly evolving and being created on the fly. No run of the game is ever the same as the previous one, partly due to randomization injected by the game engine itself, but primarily because the player is always attempting to learn how to "beat" the program and thus is constantly changing his or her inputs to the game.
Thus, a video game sound designer has two tasks: first, to organize media so that it can be found and accessed quickly, both while the game is being played and while the appropriate sounds for it are being created; and second, to connect the sounds to the actual play of the game, so that the right dialogue, sound effects, and music play when it is appropriate and effective. Not everything is possible, however; each video game platform has inherent limitations in its software and hardware, and understanding and working within them is also one of the sound designer's primary tasks.
Mr. Prum's career has spanned some of the most rapid changes in software and hardware in the industry, and so what used to be severe limitations have relaxed quite a bit: "When I am working on a game they're going to tell me you have X number of megs of RAM. And by the way, that could be incredibly, drastically small. The last game I worked on for the PlayStation 1, which admittedly was fourteen years ago, the RAM allocated for sound was 2.5 Megs. Now that's a 660 Meg disk all formatted and we get 2.5 Megs, and eight bits." Mr. Prum has seen the basic sound capabilities change from 8-bit to 16-bit, and audio memory expand over time, but he has constantly had to use every technique available to squeeze the best possible quality of sound and experience into the space he is allotted by the game designers. Some examples: reducing the sampling rate, sometimes to 5.5 kHz; using MIDI for all music, nothing sampled whatsoever; combining a basic set of sounds on the fly to create new variations; and changing pitch and duration on the library of sounds. He described the process of trying to create the best sound possible while being pushed back on by the other needs of the game: "I tend to use techniques that use more memory than I'm supposed to, because invariably they are going to come to the sound people first and say, 'You're using too much RAM. You need to cut back.' So I always use up more than I should because I can at least pare back for them."
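The arithmetic behind those constraints is simple but unforgiving. A rough back-of-the-envelope sketch in Python, using the PS1-era figures Mr. Prum cites (the specific 5,512 Hz and 22,050 Hz rates here are illustrative, not from the talk):

```python
def seconds_of_audio(budget_bytes, sample_rate_hz, bits_per_sample, channels=1):
    """How many seconds of raw PCM audio fit in a given memory budget."""
    bytes_per_second = sample_rate_hz * (bits_per_sample // 8) * channels
    return budget_bytes / bytes_per_second

BUDGET = int(2.5 * 1024 * 1024)  # the 2.5 MB PS1-era sound allocation

# 8-bit mono with the sample rate cut to roughly 5.5 kHz
print(round(seconds_of_audio(BUDGET, 5512, 8), 1))    # about 475.6 seconds
# 16-bit mono at 22.05 kHz leaves far less room
print(round(seconds_of_audio(BUDGET, 22050, 16), 1))  # about 59.4 seconds
```

Drastic sample-rate and bit-depth reduction is what turns a budget of a few megabytes into minutes of usable audio rather than seconds.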
The choice of platform is a key determinant of what is possible. Games exist for the PC, Mac, iPad, iPhone, Android, Xbox, Xbox 360, Xbox One, Nintendo DS, and Sony PlayStation 3 and 4. All of these platforms have inherent limitations that must be dealt with, and if a game is going to be released for several platforms simultaneously, the audio must often take the lowest common denominator: "One title I worked on decided they were going to ship on the Wii. This is a number of years ago and the Wii has some interesting challenges. If you buy a disk for the Wii you've got the whole disk, no problem, that's easy. But the downloadable titles on the Wii have to fit in 40 Megs. That's everything, that's the whole game. Everything — music, sound, visuals, everything." After that experience, the developer decided never to develop for the Wii again, so as not to limit what was possible on the other platforms.
Another consideration is the intended audio playback device. A game may be playing back through desktop computer speakers, through an iPhone or iPad speaker, through a 5.1 system and a plasma screen, or through a set of headphones, and the sound designer has no control over that. Much like a music engineer who double-checks the mix on a cheap set of speakers after mixing on his best pair, a video game sound designer must create a sound that still works on a pair of earbuds or a phone's microspeaker.
Furthermore, specific audio codecs create their own limitations; not all codecs work everywhere. As Mr. Prum described, unlike a CD, where the format is quite well defined, the playback landscape is fragmented: an OGG file will play in Firefox but not on an iPhone, while the iPhone-compatible AAC codec works in all other browsers except Firefox. MP3s create their own issues: with a standard encode, they will not loop seamlessly, leaving an unpredictably sized gap at the loop point. In that case, the only solution was custom software to play the MP3 loop back correctly.
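The looping gap comes from padding that an MP3 encode/decode round trip adds around the original samples. A toy simulation of the fix (the 576-sample delay and 1,152-sample frame size are typical MPEG-1 Layer III figures; real players read the exact values from an encoder tag such as LAME's, and this sketch is not the custom software from the talk):

```python
FRAME = 1152          # samples per MPEG-1 Layer III frame
ENCODER_DELAY = 576   # typical leading padding; the exact value varies by encoder

def simulated_decode(samples):
    """Mimic an MP3 round trip: decoders emit leading silence plus
    trailing padding out to a whole frame boundary."""
    pad = (-(ENCODER_DELAY + len(samples))) % FRAME
    return [0] * ENCODER_DELAY + list(samples) + [0] * pad

def trim_for_gapless(decoded, delay, original_len):
    """Drop the known delay and padding so playback can loop seamlessly."""
    return decoded[delay:delay + original_len]

loop = [1, 2, 3, 4] * 250            # stand-in for the original PCM loop
decoded = simulated_decode(loop)
assert decoded != loop               # naive playback would include the gap
assert trim_for_gapless(decoded, ENCODER_DELAY, len(loop)) == loop
```

A player that ignores the padding plays the silence too, which is exactly the audible gap Mr. Prum describes.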
Sometimes even an entire audio engine will be removed from a game engine. As Mr. Prum described, "The project that I'm working on, I learned in March that the lead programmer ripped out the entire audio system. I spent weeks implementing 900 sound effects in 85 events, timing them and getting everything to work just right, and it's all been replaced with another game engine, and I get to start over. So things change and you don't actually have control or predictability."
Mr. Prum went on to describe the tools and techniques he regularly uses: "We do use a lot of the same tools: ProTools or Nuendo or Cubase or Reaper, whatever Digital Audio Workstation you're used to. We use things like Peak for editing files for delivery." But he also uses less common tools, like waveform editors: "you have to have very precise output. Destructive is something I do on a constant basis. I use destructive record in ProTools every day and I bet very few people in this room have ever used destructive record." Other tools include Audio Finder and Sample Manager, which are designed to quickly rifle through large sets of files. Tools for DJs are commonly used to find, track, and manipulate files. But again, because audio needs to be output at particular stages of a game, a video game sound designer must work with audio middleware: "FMOD is for doing implementation in the game. It looks very similar to a digital audio workstation in that you are going to lay in all of your tracks, but it actually allows you to set up the events that will be created and called when something occurs inside a game."
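The event model that middleware like FMOD provides can be illustrated with a toy sketch. This is not FMOD's actual API; the class, method, and file names here are invented purely for illustration:

```python
import random

class AudioEventSystem:
    """Toy model of middleware-style audio events: the game posts a named
    event, and the system picks one of several registered variations so
    that repeated triggers don't all sound identical."""

    def __init__(self):
        self._events = {}

    def register(self, event_name, sound_assets):
        """Associate a named event with a pool of candidate assets."""
        self._events[event_name] = list(sound_assets)

    def post(self, event_name):
        """A real engine would start playback here; we just report
        which asset would be played."""
        return random.choice(self._events[event_name])

engine = AudioEventSystem()
engine.register("footstep_wood",
                ["step_wood_01.wav", "step_wood_02.wav", "step_wood_03.wav"])
print(engine.post("footstep_wood"))   # one of the three variations
```

The key point is the indirection: the game code only knows the event name, while the sound designer controls, outside the game code, which assets the event actually resolves to.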
In some cases Mr. Prum has developed his own tools to make the process easier, or even possible, as he described while showing his screen: "This is the script format that we use for recording The Wolf Among Us. It's actually a script format done completely in HTML5 and it's got the Web Audio API audio engine, which is a part of web standards now, where we can play the audio. We can play audio directly in the script and we can actually use this to master the entire voice set. It will export all the changes that you have chosen and then allow you to batch process that against just the files that you made changes to. And this is something I wrote over the course of the weekend, because it just doesn't exist." Ultimately, to integrate the sounds being created into the full functionality of the game, an enormous amount of coordination must occur: the files must be organized, named exactly per the agreed-upon specifications, and timed and edited precisely, and there is no all-in-one environment in which to do this.
The most commonly used game engine right now is a program called Unity, and Mr. Prum showed how a recent project called Ur Warlords works within this environment. He showed how he captures gameplay using a program called "Screenflow" first to determine the overall flow and logic of the scenes before he starts modifying them. For this game, he had to create a vast array of sounds: "Rooms barricade, crack sounds, magic sounds, projectile sounds, rocket sounds, rubble." All of these are organized into subcategories such as ambiences, characters, footsteps, other rooms, UI, and weapons, then accessed through Unity. Once within the game environment, other choices must be made: "What is the point of view of the hearer? Should the listener be attached to the camera, attached to the player character, or somewhere in between? You kind of have to play with it until you get the right feel." Here is where the creation of variety from the basic audio stems can occur — with different weighting, pitch changes, combinations of sounds, and so on.
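One cheap way to get that variety from a single base sound is to randomize playback speed, which shifts pitch and duration together. A crude nearest-sample resampling sketch (illustrative only; a real engine would resample with proper filtering, and the ±10% range is an assumption, not a figure from the talk):

```python
import random

def vary_playback(samples, low=0.9, high=1.1):
    """Create a variation of one base sound by playing it back at a
    random speed: faster raises the pitch and shortens the sound,
    slower does the opposite. Nearest-sample resampling for clarity."""
    speed = random.uniform(low, high)
    n_out = int(len(samples) / speed)
    return [samples[min(int(i * speed), len(samples) - 1)]
            for i in range(n_out)]

base = [0.0] * 1000        # stand-in for a mono PCM buffer
variation = vary_playback(base)
print(len(variation))      # somewhere between roughly 909 and 1111 samples
```

Layering a few such randomized variations of a handful of base assets is how a small memory budget can still produce sound that never repeats exactly.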
Mr. Prum went on to talk about how he obtains his sounds in the first place, and in general, his answer was "Everywhere." While he uses commercial libraries such as Sound Ideas and Hollywood Edge, he also carries a Zoom H2 recorder in his backpack with him. He is constantly looking for and recording interesting sounds, which he uploads to his constantly growing 400 GB sound effect library. In some cases the developers themselves provide certain sounds to be used, but those particular sources often come with restrictions and/or do not always work for the intended application. Foley is also an option, and while budgets tend to be small, certain applications can make foley worthwhile: "If I were working on a game where characters were walking around, running around with specific outfits, I would probably have somebody foley all the movement for those characters because the sound designing is just stupid."
Discussing the role of coding, Mr. Prum said, "I am not a programmer, but one of the most important things in games is knowing how to be able to do some coding, because it is just so valuable to be able to go in and get stuff done." Mr. Prum has done some work in PHP and JavaScript, and noted that while C# was a core language, it was not one he was yet conversant in. As an example of where being able to code makes the work easier, he then showed the group some sample code that he wrote for a routine called "Play Sound Trigger." This routine is a bit of code Mr. Prum added to have the program display information about an asset upon playback when the program is in debug mode. Being able to get that immediate feedback on whether something is working is key to producing work quickly and accurately, and since the development environment lacked that functionality, being able to quickly code a routine to handle it saved a lot of time.
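The idea behind such a routine is straightforward and worth sketching. The names below are hypothetical, not Mr. Prum's actual code: the point is simply that every playback call, in a debug build, reports which asset just fired for which event.

```python
DEBUG = True  # would normally come from the build configuration

def play_sound_trigger(asset_name, event=None):
    """Hypothetical sketch of a debug-mode playback wrapper: in a debug
    build, report which asset fired for which event, then hand the
    asset off to the audio engine."""
    message = None
    if DEBUG:
        message = f"[audio] event={event!r} -> asset={asset_name!r}"
        print(message)
    # ...the actual playback call to the audio engine would go here...
    return message

play_sound_trigger("sfx_barricade_crack_02.wav", event="RoomBarricade")
# prints: [audio] event='RoomBarricade' -> asset='sfx_barricade_crack_02.wav'
```

Even a one-line log like this tells the designer immediately whether the event fired at all, and whether it resolved to the asset they expected.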
Mr. Prum also reflected on what has gotten better in video game sound design: "I would say the sound quality has gotten spectacularly better. When I started it was disgraceful. When I came out of school, I didn't want to work in video games. I actually turned down a video game job because I did not want to work in a fishbowl with headphones making 8-bit sounds. I wanted to work in film where we could make good quality sounds." But the opportunity to work at Lucas, even in video games, convinced him to give it a try.
Oddly enough, a game must be playable WITHOUT sound, but statistics show that on mobile phones, for instance, 70% of users will play with the sound on. Sometimes the user is allowed to turn off the music separately from the sound effects and dialogue, and other times not. As Mr. Prum sees it, his job is to create an audio environment engaging enough that the user will want to leave the sound on.

Written By:
