Chapter 13

Musical Programming

Obsidian's artists and designers knew how to make their Infinity Engine-inspired game look like the real deal. The audio team's job was to make it sound just as authentic.

Tools of the Trade

Justin Bell’s audio booth is a musical wonderland. All the tools needed to write an epic soundtrack can be found between its insulated walls. A digital audio workstation, or DAW. Sound samples for every conceivable instrument. Sketches, illustrations, character models, and pre-rendered backgrounds to inspire leitmotifs.

Those tools are robust, but they come later in Bell’s process. Each of the game’s songs began miles from Obsidian and his soundproof booth during his daily commute.

Bell discovered years ago that listening to classic and videogame soundtracks while he trundles along in Los Angeles’ morning traffic puts him into a Zen-like trance. Entering that trance transports him out of his car and into his head. “Ideas just spring to mind. I think it's because there's no other distraction other than getting to work,” Bell explained.

He paused. “Other than not crashing into other cars and dying.”

Before he knows it, his fingers drum against the steering wheel. Then he’s humming a tune he plucked out of the ether. When he hits on something special, his hand fumbles for his iPhone and opens his recorder app. “I'll record myself singing. A lot of the music I wrote for Pillars was based on me singing horribly into my phone, then going into work and transposing that into piano music. I did that on Pillars 1 and Pillars II quite frequently,” he said.

Sound is Bell’s domain. He is Obsidian’s audio director, which means every soundbite, from the ka-chunk of a sword biting into a wooden shield to the themes that stream through players’ speakers or headphones, falls under his purview. If he didn’t create them, he scheduled and tested them for quality.

“That suits my personality, in a way, because I just like making games and making sound for games. In my position, I'm able to be involved in everything one does when making sound for a game. I really do love the whole process: sound design, voiceover, music, it's all very gratifying to me,” Bell said.

Bell’s duties on Pillars of Eternity began in the summer of 2012, when executive producer and lead programmer Adam Brennecke and lead character artist Dimitri Berman recruited him to the Kickstarter team. Brennecke blocked out a script for the crowdfunding campaign’s trailer and gathered materials for filming. The script and materials went to Berman and Bell, who reworked them into a narrative complete with a stirring score that offered prospective backers an aural peek into the soundtrack.

“Dimitri and I worked really closely to choreograph every transition, every moment of the Kickstarter video to make it really feel like you're witnessing this narrative,” Bell explained. “It's not just a trailer. It's more the story of Obsidian and this product we're trying to make.”

Justin Bell. (Photo courtesy of Obsidian Entertainment.)

As soon as it became apparent that the game’s Kickstarter was on target to surpass its funding goal in record time, Bell immersed himself in mapping out a full score. He worked from materials such as illustrations and write-ups that described regions, which were made up of smaller areas such as wilderness, towns, and dungeons.

“We'd scope that way, but once you start executing and putting those tracks into the game, you realize that some pieces may wear out their welcome, or don't feel appropriate at times, or your scope was too small,” he admitted.

His estimates usually fell short as development continued and the particulars of regions became clearer. “We added additional tracks as needed in order to flesh that out, and also to support what players are doing rather than doing it at the macro level of regions,” he said.

Bell knew that Pillars of Eternity would call back to Infinity Engine games in several ways. One, suggested by lead area designer Bobby Null, was to write combat tracks—motifs that play during encounters—that evoked Infinity Engine games. Bell latched on to the concept. One combat track was reminiscent of Baldur’s Gate, another called back to Icewind Dale, and so on. “That was a deliberate decision, because we wanted people to immediately recognize [classic games] when engaging in combat,” Bell said.

The rest of the game’s music was matched to particular areas or characters as a way to assist Pillars of Eternity in establishing an identity all its own. Obsidian’s artists supplied Bell with plenty of reference material. “For me, as a composer, the thing that really resonates with me is the game itself, and the world, and the artwork. I could hear a story or see evocative artwork, and I immediately get a feeling about that. And I just go with it.”

Bell conferred with lead narrative designer Eric Fenstermaker to tie his music into narrative themes at junctures in the story. “We talked through the plot and what the mood of each particular area would be,” Fenstermaker recalled, “so it was basically just me giving context, going over what had been requested in the area design docs, maybe sharing some general thoughts on what instruments or instrument sections might work the best, but nothing too constraining. We'd discuss, then Justin would run with it.”

Justin Bell's audio booth at Obsidian, circa development of Pillars of Eternity. (Photo credit: Obsidian Entertainment.)

The more Bell learned of the Hollowborn crisis—children being born without souls—the more it resonated with him as a father of five. “I was able to harness my emotional reaction to that narrative and tried to represent that in the music. The way I like to describe the score of Pillars of Eternity is it reflects the state of humanity in the world of Eora. It's less about the moment-to-moment of quests, and more about the people of Eora and the tragedy they’re enduring.”

As the dust of production settled, Bell focused on composing tracks for different areas and themes. “Obviously we had way more areas than we could afford music tracks, so often there'd be a primary area that Justin was writing for—which the track title usually referenced—and then the piece would get used in other areas where we thought the mood was appropriate,” Fenstermaker said.

Dyrford, named after the region of the same name, was the first piece of music Bell wrote for the game. It was brainstormed and written during work on Pillars of Eternity’s vertical slice in the summer of 2013. His approach was to capture the downtrodden state of a rural people who had seen better days. “I don't know if I would characterize Dyrford Village as a backwoods town, but it's just hanging on. The Hollowborn crisis is hitting them hard,” Bell said.

Bell consulted materials illustrating Dyrford Village to get a grasp of its layout and the pace at which players might move through it. “That will determine a lot of things, like the music's tempo and meter, and how melodic things should be, what the harmonic rhythm of a given area should be. Our game is a slow-paced game, and it was important to match that feel that you have while clicking around and exploring the environment,” he continued.

No matter the theme, Bell worked and reworked it in Obsidian’s audio booth.

Fine-Tuning

“A good audio booth is one where, ideally, you don't hear the people outside your office and they don't hear you,” Bell said.

Obsidian’s audio booth meets that criterion. Its walls are double or triple the thickness of every other wall in the office. The booth’s acoustics are neutral so that he hears only the sounds he recorded: no echo, no feedback. “You want the actual sound, as it is, to come through your speakers and into your ears, unmodified by the environment, so you can make critical decisions about what you're listening to.”

After several mornings’ worth of humming and singing into his recorder—the app like a digital vault Bell refuses to open for any ears except his own—he closed himself in the booth and got to work on Dyrford. He started the way he starts every track he writes—with the piano, the first instrument he learned to play growing up. His piano was digital, a library of notes stored in his DAW. Notes fall into place with keystrokes and mouse clicks.

Bell was a leaf on the wind during this phase, content to go where the music carried him. After several days, he stopped and sifted through his first draft. “Frankly, a lot of it's garbage. It's me hunting, pecking, listening to my inner ear, hearing where it leads and trying different things. I'll stumble upon something I like, and I'll riff on that and see where that leads.”

Like writing the first draft of a novel, getting material down on paper is the most arduous part of Bell’s process. The next step is sifting through his material to find something usable. “It's still a lot of work, but the germs of those ideas, the seeds, are established,” he explained. “Then it's a matter of taking those components, getting rid of the garbage—I'd say maybe seventy percent of it, which I would never want anyone to hear—and then cleaning that up.”

The result is hours of formless sound, still composed in piano notes. He breaks it into bits and pieces and rearranges them to hit on a structure. “Sometimes this thing I did on day two, hour nine, was really great, so I'll pull that into these other ideas,” he said. “You're mixing and matching these motifs you've developed, and you try to piece them together. That process informs what needs to happen to transform that into music.”

Days later, he winds up with seven to eight minutes of piano music. Next, he determines where to keep the piano, or where another instrument might serve better. Bell refers to instrumental choices as orchestral colors, the palette from which he paints. He chooses orchestral colors using three computers. The first runs his DAW, the software that lets him compose music; the other two brim with samples of violins, flutes, clarinets, trumpets, tubas, saxophones, and every flavor of percussion.

Every sample consists of a single instrument recorded at a particular articulation. “They record all these and create a library of samples, and you can use it to get an approximation of live musicians. I effectively have a fake orchestra at my disposal with this three-computer setup,” Bell said. “Samples can be very convincing. There are some people who are, if you listen to their work, you would not be able to tell they're using sample libraries. It's an art, a black magic, to doing that kind of thing. We call it music programming.”

Adding orchestral colors to a track.

A finished piece of music programming is known as a mockup. Once Bell had a mockup for Pillars of Eternity, such as Dyrford, he had to decide if it should be recorded live or rendered through samples. “On Pillars 1, we did have a budget, and we decided we would record a good portion of it with a live orchestra,” he said.

Bell was honored and excited to partner with East Connection Music Recording Company for Pillars of Eternity. Spread across three studios in Budapest, Hungary, East Connection offers recording services ranging from full symphonies to specific sections such as strings or choirs.

Before East Connection could record, Bell had to put his music through orchestration. The process entails finalizing his orchestral colors: this melody should be played by woodwinds, that harmony by low brass. His composition was sent to an orchestrator, who prepped it for the live orchestra so that their recording would match how Bell imagined the piece sounding in his head.

“The drawback to having virtual orchestras is you can cheat,” he explained. “You can do things instruments don't normally do very well: You can play outside their range, or play something too fast, or do something really difficult, or make combinations that don't sound that good. Maybe they sound great on a computer, but it doesn't sound great where live musicians are playing together. An orchestrator’s job is to say, ‘What's the composer trying to achieve with the decisions they made in the mock-ups?’”

His orchestrator produced a conductor’s score, a single document containing every part of a track, and then divided the document into separate parts for the musicians at East Connection to perform. Obsidian’s budget did not permit Bell to have his entire score orchestrated and recorded live. He decided to have them record wilderness, primary, and main areas, including Dyrford.

Bell was thrilled with East Connection's work. Just as exciting, Obsidian producer Katrina Schnell was able to contribute her musical talent by playing all solo violin parts.

Second Verse, Different Than the First

Tracks such as Dyrwood were arranged to draw players deeper into the melancholy of Pillars of Eternity’s narrative. Others were written as reprieves from the crushing sorrow of the Hollowborn crisis.

Defiance Bay is the first of the game’s two large cities. Players will feel inclined to explore its streets and shops, and meet new characters. Bell’s track of the same name played to that spirit of adventure and discovery. “The idea was to lighten the tone a little bit, because this is a center of humanity, and there's boundless possibility in terms of the quest givers you'll find and where the game opens up,” Bell remembered. “We lifted the tone slightly so it wasn't as dark as what you'd hear in Gilded Vale or the Dyrwood.”

Other tracks informed players that a certain character, such as the rogue animancer Thaos, was either nearby, or had a hand in influencing the events unfolding on their screens. “I'm a big fan of doing character themes, so we talked a bit about that,” Fenstermaker said of his conversations with Bell. “We were limited in our resources that we could apply there, but Justin did one for Thaos that I thought was just a fantastic encapsulation of the character. There's a version of it that's this very raw violin solo and I think it's my favorite bit of music in the game. I ended up doing a lot of critical path gameplay scripting myself, and I made damn sure to bust out that Thaos theme in its various forms as often as I could to help some of those big moments sink in emotionally and make them all feel tied together.”

Bell worked for years to achieve results like Fenstermaker’s reaction to the Thaos track. One of the biggest challenges he faced was the budget Obsidian dedicated to music and audio. “They say necessity is the mother of invention. I'm a big believer in that. As it pertained to Pillars and its score, and writing music in general, knowing constraints helps me to define what that blank page needs to turn into.”

Budget affected every facet of musical composition, duration and repetition among them. Baldur’s Gate and other Infinity Engine RPGs had looped their tracks in such a way that players would—hopefully—fail to notice when a song circled back to its beginning. “The way things were implemented in the Infinity Engine games were often decided by limitations of the hardware systems available to those developers,” said Bell. “It was more acceptable to have highly repetitive music. I think that's always been a challenge for video game music anyway, but the constraints of the hardware meant that's what players were presented with and accustomed to at that time, so they accepted it.”

Each track for Pillars of Eternity was composed organically, going as long or ending as quickly as needed. Bell favored longer durations so that players would stay immersed in a moment rather than be yanked out of it by a riff that repeated too often.

“What ended up happening was a lot of very chunky, five-, seven-minute-long pieces of music that are pretty durable because they're so long,” he explained. “It's not repeating frequently enough to identify when repeats happen. It sort of glosses over the fact that it's just one piece of music.”

Spending more than a week or two on a single track was atypical. Eora, the game’s title theme, received the most attention at nearly one month of production. It would be the first track players heard when they booted up the game, and needed to set the stage. Combat themes tripped Bell up, too. “We didn't have endless amounts of engineering support to help us create music systems that would be more complex than just triggering music when you're in combat, or when you're not in combat. I struggled with that a lot with combat music because I found that while I was trying to avoid repetition with exploration music, I was unable to avoid it with combat music.”
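The system Bell describes is about as simple as interactive music gets: one looping track for exploration, another for combat, swapped whenever the player's combat state flips. Below is a minimal Python sketch of that kind of two-state trigger; it is purely illustrative, with hypothetical names, and is not drawn from Obsidian's code.

    class SimpleMusicSystem:
        """Two-state music trigger of the sort Bell describes: exploration
        music by default, combat music whenever a fight starts. All names
        here are hypothetical; this is an illustration, not Obsidian's code."""

        def __init__(self, exploration_track, combat_track):
            self.exploration_track = exploration_track
            self.combat_track = combat_track
            self.in_combat = False
            self.current = exploration_track

        def update(self, player_in_combat):
            # Only act when the combat state actually changes.
            if player_in_combat == self.in_combat:
                return
            self.in_combat = player_in_combat
            target = self.combat_track if player_in_combat else self.exploration_track
            self.crossfade(self.current, target)
            self.current = target

        def crossfade(self, old, new, seconds=2.0):
            # Stand-in for an engine call that fades volumes over 'seconds'.
            print(f"fading out {old}, fading in {new} over {seconds}s")

    music = SimpleMusicSystem("dyrford_exploration.ogg", "combat_track_a.ogg")
    music.update(player_in_combat=True)   # fight begins: swap to combat track
    music.update(player_in_combat=False)  # fight ends: back to exploration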

Most combat themes began with the same intro riff and then segued into passages inspired by one Infinity Engine game or another. Recycling the intro drove Bell crazy, but it was the best he could do given his constraints. “When you're making a game, you have to weigh the priorities of things. If the game is crashing but we have repetitive combat music, getting the game stable will always win that argument, and deservedly so. It's funny because it really bugs me, but there are a lot of people who really love it. And I'm like, ‘Oh. Really? Really? Oh. That's cool. I love that you love that, but I don't.’”

Fenstermaker, however, genuinely appreciated one combat track in particular. “Combat Music D. Holy hell, Combat Music D. Goddamn,” Fenstermaker gushed, saying that the track competed with Thaos for his favorite.

The last track Bell composed was as important to players forming a positive first impression as the title theme: Encampment, which plays while players explore the game’s first area. In a way, Encampment was even more critical to grabbing players’ attention. Players who navigated the game’s main menu quickly would barely hear the title song. But the Encampment area was where they would take their first steps, and where they would spend more time.

“That first step, that first impression, is important not just for me, but for everyone working on the game,” Bell added. “That's the moment where games live or die. If you're unable to give players a good reason to continue playing, a lot of them will stop after that first experience.”

Encampment settles over players like a blanket. It’s warm and whimsical, inviting them to settle in for an epic story—and then yanking that blanket off. Players receive their first quest and go off to explore. As they’re hunting for berries and learning their way around the game’s controls, the fog of war peels back to reveal brutish tribesmen. Later, players experience their first Biawac, a storm that rips and tears souls from bodies.

“That whole sequence of events is carefully choreographed,” Bell explained. “We had a sequence of music that ramps up in intensity until you escape into the dungeon away from the Biawac. The focus really was making sure that tone would suit each one of those states as it ramped up in intensity.”

Bell reflected on the making of Pillars of Eternity’s soundtrack with pride and candor. It was a difficult process, but a rewarding one due to the opportunities to work with so many talented designers and musicians.

“In terms of expediency, it's way faster to just give music to a musician, and it's way more gratifying, as far as I'm concerned,” he said. “I love doing mockups. It's a nice challenge, a puzzle, and gratifying in its own way when you can pull it off. But there's nothing like collaborating with another musician. It's spine-tinglingly amazing.”

Emotional Enhancers

There’s a pecking order in game reviews. Graphics and story vie for the most attention. Then, in any order, come controls, art style—if it wasn’t tied into discussion of graphical fidelity—and mood, tension, and atmosphere.

Justin Bell's audio booth at Obsidian, circa development of Pillars of Eternity. (Photo credit: Obsidian Entertainment.)

Sound design rarely gets its due. Justin Bell considers that a glaring omission.

Most players and critics notice music. It’s a spice that brings out the flavor of visual elements such as setting and gameplay. Sound design is too often framed as a soundtrack’s bastard offspring. “Only in very exceptional cases where the audio is just phenomenal will it be brought up,” Bell said. “I think that sort of mirrors the state of the industry and, to some degree, entertainment in general, where sound effects are a thing that you feel more than you intellectualize.”

Sound design is all-encompassing. Every ding, whiff, pow, bang, bam, scorch, sizzle, footstep, and grunt of pain falls under its umbrella. Most players don’t notice it either. But without it, visual mediums just aren’t the same.

Soundtracks and sound effects feed off each other. Bell submits the finale of Star Wars: Episode IV – A New Hope for consideration. A stoic crowd of soldiers stares straight ahead as heroes Luke Skywalker, Han Solo, Chewbacca, and their robotic pals C-3PO and R2-D2 cross the throne room to Princess Leia, who awards them medals of commendation for their acts of bravery against the Empire. The soundtrack that plays, Throne Room, was composed by John Williams. Trumpets blare and drums pound in a swelling fanfare that has pumped up audiences since May 25, 1977.

One of Bell’s favorite YouTube videos shows the legendary finale without Williams’ theme. Actors walk through a too-quiet hall, shift their weight from foot to foot, cough. Then Chewbacca screams. It’s supposed to be a roar of triumph. Devoid of the adrenaline and heroism the theme adds, the hairy companion’s roar comes out of nowhere. It, and the scene as a whole, comes across as awkward and ridiculous.

“The reason I bring that up is because audio isn't something you notice unless it's gone,” Bell explained.

During production of Pillars of Eternity, designers would test areas where audio production was incomplete or had not begun. Sound effects were missing. Voiceovers had not been recorded. The silence left a void that swallowed beautifully rendered characters and backgrounds.

“The moment you add sound, it feels like a game,” Bell continued. “It's ironic that it takes sound to make a game feel like a game, yet we don't perceive it that much. Maybe it's because—and I'm not trying to be generalist—but we as human beings are visual. We, as sound designers, train ourselves to use our ears and actively listen. It's something you have to train yourself to do, and we, for one reason or another as a species, only do it for survival. When you hear a baby crying, that brings out something in your instincts that makes you react to that. Art is another thing entirely. That's why some people can listen to music and write an essay. They're able to put the music in their subconscious and actively think about something else.”

Moreover, audio adds a tactile quality to games. The satisfying crunch of a mace smashing through a skeleton’s brittle bones. The ding of a headshot in first-person shooters. The snap and crackle of a fireball as it burns a path through the air and engulfs an enemy on the far side of a battlefield. “It suddenly takes this thing that feels unbelievable to something you can respond to emotionally. Sound accomplishes a few things. It frames and enhances and reinforces emotional subtext,” Bell continued.

Bell proposed that swapping Williams’ fanfare from the end of Star Wars with a circus theme would completely change the scene’s emotional impact. Games are no different. “If I switch the music used in any scene from our games, I can manipulate the way people feel about what they're experiencing. It has that power.”

Bell is comfortable classifying audio as an additive element. “Really, what I think audio does is it brings this creation to life. It's dotting the i's and crossing the t's, making the player feel something.”

As with the soundtrack, Bell and the audio team viewed Infinity Engine RPGs as their benchmark for Pillars of Eternity’s sound design. They did not want to reinvent the wheel, but rather sand it down for a smoother ride. BioWare had developed Baldur’s Gate and its famous engine in the mid-1990s, when most PCs had fewer than 100 megabytes of memory and processor speeds were still measured in megahertz rather than gigahertz.

“On Pillars, we didn't have those restrictions. We certainly had limits, but nothing like what they dealt with on IE games,” Bell said. “What that translates to in terms of end result is we could make more bespoke content, and make that content more robust.”

Examples of Pillars of Eternity’s robust audio design can be found in virtually any wilderness scene. Ambient noises wash over players: burbling brooks, roaring waterfalls, the mutter and song of birds and insects. Those effects were added by placing emitters, audio cues that can be attached to objects such as trees and rocks. Emitters can loop, so players believe they’re hearing the continuous roar of a waterfall when they’re really hearing one short effect repeated end to end.

“That isn't exactly new technology, but we can do that and make assets high quality without needing to compress them very much,” Bell said. “We could have more variations of them, and make them longer. We can stuff that whole scene full of as many details as we can imagine: frogs croaking, loons in the distance. Every Obsidian game has had loons, for some reason.”
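As a rough illustration of the idea, the Python sketch below models an ambient emitter: a short looping clip pinned to a world position whose volume falls off as the listener moves away. The linear falloff curve and every name in it are assumptions for the sake of example, not a description of Obsidian's implementation.

    import math

    class AmbientEmitter:
        """Illustrative sketch of a looping ambient emitter (say, a waterfall)
        attached to a world position. The linear falloff and all names are
        assumptions, not Obsidian's implementation."""

        def __init__(self, clip, position, max_distance=30.0, loop=True):
            self.clip = clip              # a few seconds of recorded audio
            self.position = position      # (x, y) world coordinates
            self.max_distance = max_distance
            self.loop = loop              # loop so one short asset sounds continuous

        def volume_at(self, listener_position):
            # Full volume at the emitter, silent at max_distance and beyond.
            dx = self.position[0] - listener_position[0]
            dy = self.position[1] - listener_position[1]
            distance = math.hypot(dx, dy)
            return max(0.0, 1.0 - distance / self.max_distance)

    waterfall = AmbientEmitter("waterfall_loop.ogg", position=(12.0, 4.0))
    print(waterfall.volume_at((10.0, 4.0)))   # close by: nearly full volume
    print(waterfall.volume_at((50.0, 4.0)))   # out of earshot: 0.0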

Mixologists

Like everyone else at Obsidian, Bell and his team operated under restrictions. The audio department consisted of three sound designers: Mikey Dowling, Adam Lehwald, and Zachary “Zac” Simon. Bell was an inconstant fourth member since his plate was filled with tasks such as the soundtrack. “If the other folks on the team were busy doing their thing and I saw something that needed to be there, I'd just add it. Given all the things on my plate at any given time, it is tricky to carve out time to make sounds, but I do make an effort to do that,” he admitted. “Yes, I write music, and yes, I'm an audio director. But I'm also a sound designer, and it's gratifying to be able to do that.”

Mikey Dowling. (Image courtesy of Obsidian.)

The team organized Pillars of Eternity’s sounds as a hierarchy. High-level categories sat at the top: environments, characters, user interface, voiceovers, objects, music. “Those are the most important and distinct ‘food groups’ of sounds,” Bell said.

From there, the sound chart became more granular. The characters category had sub-listings such as vocal, movement, and abilities. Vocals for a monster-character might consist of growls and screams of attack, pain, and death, as well as taunts, idle babbling or chit-chat, and cries of victory. “For all of those, they need to have impact, and for those impacts, you need to determine to what level of granularity you want to represent them,” Bell explained.
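Written down as data, the hierarchy Bell describes might resemble the nested structure below. The top-level “food groups” and the character sub-listings come straight from his description; the remaining entries are illustrative guesses.

    # Sketch of the sound hierarchy Bell describes: broad "food groups" at the
    # top, growing more granular underneath. Entries not named in the text are
    # illustrative guesses rather than Obsidian's actual categories.
    SOUND_HIERARCHY = {
        "environments": ["ambience", "weather", "emitters"],
        "characters": {
            "vocal": ["attack", "pain", "death", "taunt", "idle chatter", "victory"],
            "movement": ["footsteps", "armor rustle"],
            "abilities": ["cast", "impact"],
        },
        "user interface": ["button click", "inventory", "notification"],
        "voiceovers": ["companions", "narration"],
        "objects": ["doors", "chests", "weapons"],
        "music": ["exploration", "combat", "themes"],
    }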

He rattled off weapon sounds as a collective example. “Does an axe hitting a shield sound different than a sword? It probably should, so you should consider that, if you have time. As you can imagine, if you take twenty different weapons, suddenly that's twenty times three, you've got sixty categories of sounds to make. And you can't just make one version of a sound for each category. You need at least five or six—ideally, you'd go for twelve or thirteen—so it doesn't sound repetitive.”
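Bell’s arithmetic compounds quickly. As a hedged sketch—assuming the “three” in his example refers to three impact surfaces, something like shield, armor, and flesh, which his shield example suggests but the text does not spell out—the totals and a simple non-repeating variation picker look like this:

    import random

    WEAPON_TYPES = 20
    IMPACT_SURFACES = 3        # assumption: e.g., shield, armor, flesh
    VARIATIONS_PER_BUCKET = 6  # Bell's "at least five or six"

    buckets = WEAPON_TYPES * IMPACT_SURFACES          # 60 categories of sounds
    total_assets = buckets * VARIATIONS_PER_BUCKET    # 360 individual files
    print(buckets, total_assets)

    def pick_variation(variations, last_played=None):
        """Choose a random variation while avoiding an immediate repeat,
        so back-to-back hits don't sound mechanical."""
        candidates = [v for v in variations if v != last_played] or variations
        return random.choice(candidates)

    axe_on_shield = ["axe_shield_01.wav", "axe_shield_02.wav", "axe_shield_03.wav"]
    last = None
    for _ in range(5):
        last = pick_variation(axe_on_shield, last)
        print(last)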

Not all sounds should be multiplied for variety. A universal sound for a special effect can carry enormous weight. “This is a long-winded way of saying, when planning a game, start at the high-level. The trickier part is, how do you make those sounds so they enhance the game as it's being designed, and are meaningful for players, and how will that shape their experience? That takes a lot of iteration: trying things out, playing the game, talking with designers and artists and the game director, trying to figure out what the vision is for the audio and how it should enhance the game.”

Mikey Dowling was the first designer on Bell’s team. That gave him the privilege of creating the game’s first sound effect. Dowling was eager to dig in. He had designed post-processing effects for vocal tracks in previous titles such as Fallout: New Vegas and Alpha Protocol, and helped connect finished sounds to animations and other triggers in South Park: The Stick of Truth. But Pillars of Eternity was the first project where he was given the responsibility of designing audio across a large swathe of a game.

The first sound effect Dowling produced, and the first sound effect for Pillars of Eternity, was not epic or dramatic: not the clatter of steel on steel, not the roar of one of the game’s ten types of dragons. In fact, it was as innocuous as it was essential to any high-fantasy RPG.

“We use torches to set mood for pretty much everything. They're prolific,” Bell said.

Dowling sorted through Obsidian’s sound library to find the perfect hiss and snap of flame. “It was that tired, old tradition of making it not sound like frying bacon, or a raging campfire,” he said.

To the uninitiated, creating or assigning the perfect sound to a torch may seem like a chore. Not so, argued Dowling. “It was a delicate balance depending on the size of the fire you were making.”

The sound for a torch a human hand might carry could be reused for the sort of torches that hang in castles and fortresses. Larger fires such as the huge braziers in Defiance Bay called for a bigger noise, as well as other technical considerations. “We had to figure out how to make sure that sound didn't play during the daytime, because those braziers are only lit at night.”

Dowling handled environmental audio such as torches. Other designers took charge of characters, abilities, and other categories. “You have X amount of time to get sounds in, so either you go out and record a campfire, or light a stick on fire and run around for a little bit,” Dowling explained. “That wasn't possible, so I went through library sounds, listened to some stuff we'd done in previous games to see if it could work. Every game gives you new advantages, so I was able to redo a lot of stuff and go from there.”

Creating new assets by mashing together pre-existing assets is called kitbashing, a technique used by sound designers, artists, and every other discipline of game development. “Explosion sounds, for example,” Bell said. “You can get them on your own, but it's, uh, dangerous. Libraries are great because you can just go online and search for something you need: ‘I need a rock slide,’ and you'll find someone out there who's recorded all the different ways rocks can slide. It's a time-saving measure, but it often results in very cool things, and frees you up to do more bespoke content without having to make a bunch of new stuff.”

One such piece of bespoke content could be a growl. A sound designer could growl into a microphone, then load the sound in a DAW and manipulate it. “That's often more direct and leads to more fruitful discoveries than picking through a library,” Bell admitted.

“I also feel that if you see something that makes you think, I need to do the sound for that, it's not selfishness,” Dowling added. “You just want to do it because it's cool. He's not going to say, but Justin really wanted to do all the dragons in Pillars.”

“I was selfish and stated early on that I wanted the dragons,” Bell confessed with a laugh.

Dragons have weight in Pillars of Eternity. Their ponderous footsteps shake the screen. Their roars and screeches freeze blood in veins.

Dowling took as much pride in sporelings, gigantic mushrooms able to stand upright thanks to thick coils of roots around their legs, as Bell did in his dragons. “I really just wanted to do those things because I loved their animations and had an idea of what those things should sound like,” Dowling said.

“Actually, the sporelings are some of my favorite creatures in the game,” Bell chimed in. Sporeling lore dictates that they hunger for souls. They feed by binding to hosts and absorbing their souls like parasites. Their gait is unsteady, speaking to their large size and, perhaps, to their hosts’ futile attempts to take back control. “The way they turned out, they have so much character. They're awesome and adorable at the same time,” Bell continued.

Bell and the audio team had so much fun designing sounds that they had the opposite problem as developers of classic RPGs. “The cheesy way of putting it is, ‘With great power comes great responsibility,’” he said. “It's easy to overdo your work, and if you do, you will have a programmer in your office saying, ‘Can you make this thing smaller, or can you do less? You're blowing up our memory footprint.’ That happened on Pillars 1. Time and resources are always going to be your primary motivating factors when it comes to level of detail.”

Instead of compressing effects, the team would dial back audio so that it was still effective and evocative without being too long or complex. As Pillars of Eternity took shape, they shifted from creating and polishing audio to closing down areas—the process of finalizing content. The checklist for shutting down any region was as daunting as the chart for creating audio: making sure every creature had all its sounds, emitters were in place, doors of every size and configuration—big, small, big double doors, small double doors, wood, stone—sounded the way they were supposed to sound.

Mixing is part of shutting down audio. “That's when you're featuring sounds and making sure the player's attention is focused on the right thing,” Bell said.

“You’re getting everything in, making sure all the sounds sound really good together, that nothing is getting overpowered,” Dowling added. “You don't want to open a door and somehow it be louder than the voiceover that's playing, or the music.”

“It's something that generally has to wait until the very end. You can make mix decisions on the fly and get rough mixes in place,” Bell continued. “You should do that, because otherwise the game will sound like a mess. But you also need to get to a point where everyone's hands-off, and the audio isn't a moving target so you can get stuff to gel. Games are chaotic, and the way sound is triggered is very chaotic. If you don't control that, it sounds like a cacophony.”

Upstream

Photo credit: East Connection Music Recording Co.

At times during production of Pillars of Eternity, Bell and the audio team grew frustrated with their place in the hierarchy of development. They often had to wait until visual components were in place before they could do their part. For instance, characters needed rigged models and animations so audio designers could study those animations and stick sounds to the right frames.

“There's a lot of times when it makes sense for us to be downstream, as I call it,” Bell conceded.

Still, he continued, audio should fall into step with every other discipline when possible. “You can get yourself in trouble by waiting too long because you run out of time. Then you don't have time to iterate and try new things. As a creative person, I don't think that's a great way of doing something. You don't want to be stuck with something that you as a creator aren't happy with.”

In an ideal world, Bell and the audio team would have jumped into Pillars of Eternity’s stream at the earliest opportunity. Instead, they filled in gaps as soon as it was possible to begin working. As soon as environment artists submitted a gray box of an area—say, a desert—the audio team created or assigned samples for elements that the region would contain: dust storms, footsteps on sand dunes, footsteps on stone surfaces covered in sand. Their goal was to get something in place. “The moment a character is functional in a game—moving around, has AI, and is kind of doing what designers imagine it to be doing—we should be putting sounds on it right away, with the expectation that once the character's animations are finalized, we're going to have to go back and finalize our sounds.”

After all, the sooner audio is integrated, the sooner a game feels like a game.

“I think it's inherited to some degree from the grandfather of game audio, which is the film industry,” he said. “Not all films are made the same, but audio is what's called postproduction, because you can't really add it until there's an edit of the film. In its least ideal form, audio is brought on toward the end of a process. You want audio to be in lockstep, making creative decisions throughout the project, so you can reap the advantages of thinking about a thing over and over again. That's what makes stuff good: When you're able to be involved from day one. It makes sound more meaningful.”
