Hurry Up and Wait
It wasn’t the eleventh hour, but it was close enough. Too close for comfort.
In November 2014, Obsidian’s audio team had four months to finalize, record, and touch-up voiceovers before Pillars of Eternity was due for release. “If we weren't confident, and the story wasn't locked down, or quests weren't flowing, or things weren't making sense—if we would have prematurely started hiring actors, we wouldn't have been able to revisit voiceovers. It wouldn't have been financially feasible,” explained Justin Bell, the studio’s audio director.
Some VO had already been recorded. The majority was a work in progress. “We had a vertical slice that was effectively the intro, and we did record some actors for that. That was more a proof of concept: How is the game actually going to sound?” recalled Bell.
Bell and his team could not record the bulk of the dialogue until the narrative designers had nailed down their scripts. For Eric Fenstermaker and his narrative team, putting their stamp of approval on scripts was not as simple as running spellcheck and circulating documents. “For the base game, our budget was such that we didn't have enough money to voice everything, but we had enough to voice the bulk of our more important dialogues. It put us in a strange position,” Fenstermaker said.
“Since we had a low VO budget, our designers had to go in and call out which lines specifically in the script they wanted recorded,” Mikey Dowling added. “That meant another pass through everything to decide, ‘Well, Edér and Grieving Mother will say these things, but after that the player will just have to read, but the characters may speak up again here.’ That process of elimination across lines was its own task.”
That fall, every narrative designer read and played through their quests, calling out which lines of dialogue should be spoken and which ones players would have to read. “It starts to fall in this uncanny valley where players are hearing enough lines in a row that they start to expect the next line in the series to be voiced, and when it isn't, that can be jarring,” Fenstermaker admitted. “Not really understanding this yet, the initial heuristic we used for choosing lines was to sort of pick out the most important ones in the dialogue and make sure they got voiced.”
The near-complete absence of recorded dialogue had been one of the reasons Adam Brennecke and Josh Sawyer had implored Feargus Urquhart to delay Pillars of Eternity earlier that summer. Even then, the delay had barely left Bell and his three-person audio team enough time to book, record, and integrate voiceovers. “The problem with recording over November and December is every actor goes on vacation,” Dowling said. “From Thanksgiving through the New Year, you lose access to a lot of your talent. That's why recording stretched into January.”
The actors Obsidian had hired were doing a great job, but Bell had worried the game’s audio might come down to the wire. “It's tricky because when you're making a game in those final months, everyone's trying to finish their content and make it stable. Whenever you add anything new, you risk destabilizing the game and adding new issues, which equates to more work. We'd done some testing, but you never really know, until those final assets are in, how a game is going to react to suddenly having to do something it's never done before.”
By the time Pillars of Eternity II: Deadfire’s Fig campaign went live in January of 2017, the audio team were among the many developers benefitting from improvements that the programming team were making to OEI Tools, the proprietary toolset every department used to create, integrate, test, and polish content. “Making RPGs has a lot of implementation implications,” Bell said. “We were able to get a lot done on Pillars I, but it was frequently not highly optimized. There were a lot of limitations that we had to work around in order to support the vision of the game.”
The problem Bell and his team had faced on Pillars of Eternity was a lack of optimization. Unity was a powerful engine that had fallen short of the unique needs of Obsidian’s audio engineers. One of those needs was connecting sound to hooks, in-game events that trigger a designated sound. “We had some basic hooks, but they were pretty rudimentary and sometimes arduous to work with. The reason why was, similar to VO, we couldn't afford the programming bandwidth to make it more complicated than, ‘This hook triggers this sound,’” Bell explained. “You're spending so much time trying to get sound to trigger that you can't spend much time making it sound as good as you want it to.”
Another issue was that audio engineering was spread across multiple interfaces. Designers had had to create a sound on their DAW, export it to a folder, open the folder, grab the data, open the Unity engine’s interface, and connect it to a hook. It was hardly a one-and-done task: An ability, for instance, calls for five to six categories of sounds, each category breaking down to a variety of sound effects of the same type—say, a dozen grunts or clashes of steel against steel—so players would not tire of hearing the same effects over forty or more hours of play.
“It could take up to six hours to get everything working the way you wanted—for one ability, of which there were hundreds,” Bell said.
Bell desired more of the programming team’s time during preproduction of Pillars of Eternity II, and Nick Kinkade stepped up to grant that wish. One of Obsidian’s newer coders, Kinkade sat down with Bell to understand the audio team’s tasks. Migrating game data from Unity to OEI Tools helped tremendously. “One thing we built internally was, basically, a database of all the data in the game that you can easily manipulate and work with,” Brennecke explained. “We always found that for making huge RPGs, that was a good system. When we were working with Unity, it has some of those features, but we found that for making RPGs, specifically, having our own tool to handle that stuff was a lot better.”
So did the use of Wwise, a software package for audio creators in the games industry. Wwise streamlined the team’s tasks, collapsing the multi-step export-and-hookup process that Unity had forced audio engineers to wade through.
“You can add it to Wwise and it pre-wires all these things,” said Bell. “No more opening folders and windows. That entire process, from exporting to having it working in the game, went from five to six hours, to as fast as ten minutes or even less.”
Cutting implementation from hours to minutes gave Deadfire’s audio team a wealth of time during the sequel’s development. They used that windfall of weeks and months to create thousands of new pieces of audio. “The environments went from forests to a jungle paradise in an archipelago. The setting was so vastly different that in the end, we had to make a lot of unique content,” Bell said.
Some audio from the first game made the jump to Deadfire, such as the sound effects Mikey Dowling—who had transitioned to handling the studio’s PR efforts full-time—had created for torches. However, much of the audio stayed behind. Deadfire’s animations and environments were so different that retooling Pillars of Eternity’s sounds for the sequel would have entailed more work than making new ones.
Before, Unity had dictated audio design because of its time-consuming and complex process. On Deadfire, OEI Tools enabled the team to get more creative. The game’s dynamic weather, for example, featured audio for wind and rain that could be adjusted on the fly in-game depending on the severity of weather at any given moment. “For every weather effect—rain, for example—we would have very light rain all the way to torrential rain,” said Bell. “It was a sliding scale. You could smoothly cross-fade between states, and you could have varying degrees of weather. If light rain is one percent, and torrential rain is 100 percent, and the current rain was fifty percent, it would sound appropriate. It was reactive. Same thing for wind.”
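The sliding scale Bell describes can be sketched as a simple crossfade driven by a single intensity parameter. This is an illustrative reconstruction, not Obsidian's code: the layer names, the 0-to-100 scale, and the linear crossfade curve are all assumptions.

```python
def rain_gains(intensity: float) -> tuple[float, float]:
    """Return (light_gain, torrential_gain) for a 0-100 rain intensity.

    A plain linear crossfade: at 0 the rain is silent, around 50 the two
    looping layers blend equally, and at 100 only the torrential loop
    is heard.
    """
    t = max(0.0, min(100.0, intensity)) / 100.0
    light = (1.0 - t) * min(1.0, 2.0 * t)  # silent at 0, peaks mid-range
    torrential = t                          # ramps steadily toward 100
    return light, torrential
```

Because the gains are a continuous function of the intensity, the engine can nudge the parameter each frame and the rain sound smoothly cross-fades between states rather than snapping, which is the reactive behavior Bell describes.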
Hearing storms pick up or die down reinforced the tension or calm of environments. When rain drizzles, characters may chuckle or grumble about getting wet. If it picks up, those characters will cover their heads or run for shelter.
“We also changed the sound of footsteps so that when you're walking outside and it's raining, footsteps will sound damp,” continued Bell. “We tried to key in to the weather and make it feel like the world not only changed visually, but in other ways you would expect.”
Whereas making Pillars of Eternity II had demanded a radical overhaul of the audio design process, the two games’ scores called for the opposite: Justin Bell saw a need for connective tissue between the soundtracks.
Deadfire hinted at motifs from the first game, but differed radically in other ways. The first game’s story had centered on a renegade animancer who had harvested souls from children to further his own ends. While the sequel had serious stories and characters, its themes of sailing the open seas and swashbuckling with pirates made it intrinsically more lighthearted. “It's so vibrant, and it's not as dark,” Bell said. “Melancholy really isn't there at all, even though the circumstances surrounding the story could be dire. It's a tropical paradise full of lush, blue water. I think the artwork influenced me as much as the narrative of the game.”
This time around, Bell was less concerned with paying homage to Infinity Engine titles. Deadfire’s content informed his composition. “I didn't pay attention to other music. I literally listened to no other music while I was writing the score for Deadfire. I only wanted to see the content, understand the narrative, and react to that as an artist. It's this sense of adventure. You're sailing around on a ship, and there are pirates, and there's this sense of boundless possibilities. You're sailing the open seas, and there's this airy freedom to traversing the map, and there are expansive cities and hours and hours of content.”
Bell went all-out for Neketaka, an ancient, pyramidal city made up of seven districts. “There was a lot of effort placed on making the various districts of Neketaka feel distinct. One thing I wanted to change on this game was to make the music feel a lot more colorful and a lot more vibrant in the same way that the visuals are more colorful and vibrant in Deadfire.”
While Bell took pleasure in composing leitmotifs for encounters and sundry areas, another item on his wish list was fun and immersive music for inns and taverns. He wanted players to walk into establishments like the Hole, a tavern located in Neketaka’s Gullet district, and hear music played by live musicians instead of instrumental samples. “It adds a lot to the world, makes it feel more alive. I can do my best to fake what it sounds like to play ancient, traditional, European folk music, but there are musicians out there who specialize in this. I could spend a bunch of time at my computer trying to fake it, or I could work with musicians who are super talented, get inspired that way, and have something that's a million times better than anything I could make on my computer.”
Bobby Null came to Bell with a suggestion. He had been playing Mount & Blade, a fantasy-themed RPG featuring music played by a German orchestra called Frölich Geschray, and mentioned to Bell that they might be a good fit for Deadfire. Bell reached out to Frölich Geschray and worked out a deal for the group to record five pieces of music. “I gave them the sheet music for those pieces and said, ‘Just have fun with this. See where it leads.’ They took that music, went into the recording studio, and sent me back the recordings,” Bell remembered.
Frölich Geschray captured Deadfire’s more upbeat tone perfectly. One piece, “Farmer and the Fox,” is a playful tune that evokes a raucous atmosphere of flowing spirits and rowdy patrons. “Skudrinka” is equally lively, but with heavier percussion, demonstrating the group’s instrumental range.
Bell liked Frölich Geschray’s work so much that he worked with Obsidian to license tracks from their album “So Talks To Life” for use in the game. “They were really good and sounded incredible, and worked really well in the setting of Deadfire. They were very enthusiastic about it, and it was really great working with them,” Bell said.
All of the upgrades and perks of OEI Tools would have carried over to voiceovers, if not for the audio team’s ambition. One of the stretch goals for Deadfire’s crowdfunding campaign was to double the amount of VO recorded for the first game. Eager supporters hit the stretch goal on February 8, 2017.
Feargus Urquhart approved the budget for two times the amount of recorded dialogue, but the audio team had other ideas. On April 13, 2018, more than a month after Pillars of Eternity II had been delayed, Obsidian announced that all dialogue had been recorded. No more listening attentively only to have to stop and read the rest of a character’s lines.
That was the upside. The downside of the bold move to record all VO was that the massive volume of extra work scuttled the audio department’s plan to record further in advance than they had on Pillars of Eternity. “Because of that, we recorded way later than initially planned. So, really, we did the exact same thing,” Mikey Dowling said.
“To top it all off, we went from 5,000 lines in Pillars 1 to 33,000 lines, and we didn't start recording VO until February of 2018,” added Bell. “We were recording all the way up until April.”
Although the team ended up recording at the last minute again, the game’s delay in March had nothing to do with VO sessions or implementation. The audio team’s improved pipeline enabled them to integrate dialogue as it was recorded. Despite that, recording thousands of extra lines did take time.
Normally, a line of dialogue recorded for movies or TV shows consists of a few words: Hi, how are you? or How was your day? RPGs, however, tend to be wordy, and Obsidian’s are among the wordiest. “A Pillars line is literally paragraphs, so our friends at this recording studio call them ‘Obsidian lines’ when they're teaching their class, because from a voice actor perspective, you're given this huge chunk that, on your page, looks like you're about to read a monologue in a movie,” Dowling said.
According to Dowling, who had worked on audio teams for Obsidian projects developed between 2008 and 2015, the developer had recorded 5,000 lines for Pillars of Eternity, 11,000 in South Park: The Stick of Truth, 12,000 lines in Dungeon Siege III and Alpha Protocol, and a staggering 62,000 lines for Fallout: New Vegas.
Deadfire’s line count was approximately half the size of New Vegas’s—and yet equal at the same time. Deadfire’s 33,000 lines required actors to pronounce, with the correct inflection, unique words that existed only in the Pillars world. “So,” Dowling explained, “when I say we recorded 33,000 lines for Pillars II, I think it's probably fair to say we probably recorded the same amount of dialogue we recorded for Fallout: New Vegas, even though there's a line disparity in the final count.”
That was around the time Bell and his audio engineers stopped counting dialogue in lines and focused on words, which was more accurate. “Line—what does that really mean?” Bell reasoned. “A line could be, ‘Hey,’ or a monologue, like an Obsidian line. What we started focusing on was words. I think we recorded between 300,000 and 400,000 words for Pillars II. It was a challenge for everyone working on the VO effort.”
The challenge posed by VO extended beyond the audio department. Narrative designers attended recording sessions to make sure actors hit all the right notes. Pitstop, the production company Obsidian worked with during recording, had to edit and remaster audio and VO against the game developer’s looming April deadline—later bumped to early May for reasons Bell and Dowling stated were unrelated to the tight timetable for recording VO. There was no way one studio could handle such a workload, so it was split across five.
“The one benefit of doing mass VO that way—we did the same thing on Fallout—we started recording Fallout in June of 2010, and we shipped in October,” Dowling explained. “So, VO can come late, as long as you know that and mark every line that needs a post-processing effect.”
By April 3, days before the studio surprised fans with the announcement that every line of dialogue would be voiced, the effort was nearly complete. All that was missing were a few creature effects and a few lines—or rather, words—of narration.
“It was quite a lot of effort, a lot of late nights for a lot of folks to pull that off at the last minute,” Bell said. “I think it was hard on everyone, but I think everyone would agree that we should do it better next time, and make these decisions earlier, but we all feel it was a worthwhile thing.”
Bell and Dowling praised the programming team for their across-the-board upgrades to OEI Tools. “If we'd had to use the tools from Pillars 1 on Pillars II, it would have been impossible without hiring five to six more people,” said Bell. “The volume of audio content in Pillars II is significantly more. It would have been impossible to make the game in the time we did. I know that sounds like I'm exaggerating, but it's completely true.”
But the real heroes, they explained, were the employees who worked tirelessly for years to develop and fine-tune every sound effect, musical note, and word of dialogue in Pillars of Eternity II: Deadfire. “Something I think is pretty awesome for Deadfire, from an outside perspective looking in, is it's the biggest audio effort we've ever had at Obsidian,” Dowling said.
Approximately thirteen engineers across three outsourced groups contributed to sound design. Another studio, Technicolor, worked with Obsidian to record footsteps for every surface—and surface condition, such as dampness—in the game. Canadian studio Game On recorded voice reactions such as exclamations over rain and drunkenness in taverns. Half a dozen singers, including Bell and Josh Sawyer, collaborated with a singing group to record sea shanties. The Budapest Art Orchestra recorded roughly two hours of orchestral music over five days in Hungary, compared to the forty-five minutes of orchestral music in the first title; long-time collaborator and orchestrator Ryan Humphrey helped prepare musical scores for recording. Frölich Geschray wrote and recorded inn music as well as tense tracks for ship-to-ship battles. Sixty cast members, including the entire cast of Critical Role, lent their voices to the VO effort. Pitstop coordinated cast availability and edited recordings. And Justin Bell and his ace team of audio producers—Andrew Dearing, Dylan Hairston, Scott Gilmore, Zachary Simon, and Adam Lehwald—put in long nights and weekends.
“All told, we must have spent hundreds and hundreds, if not thousands of hours, on audio for this game,” said Bell, who came away enormously proud of the effort put forth by every individual he had worked with on Deadfire. “That's not to say we haven't put a lot of effort into any other game we've done, because all of them have been a great effort, but in terms of the number of people working on it and the intensity of the effort for the project's duration, it's unmatched.”
Kaz Aruga considered himself the luckiest game developer in the industry. He had been promoted to lead artist on Pillars of Eternity II: Deadfire, and the sweeping improvements to OEI Tools were shaping up to deliver a work environment where artists would get to concentrate on creating beautiful artwork instead of tearing their hair out over uncooperative tools and systems.
But that wasn’t the best part. “I like helping people work together and come together, and I knew I could do that in this position, so I said yes, and I've been doing it ever since,” Aruga said of his promotion. “I feel like I'm lucky because the team I inherited is great. They're autonomous most of the time, and people need very little oversight. Without them, I would probably be screwed.”
Aruga’s purview extended over every type of art in Deadfire, from characters and settings to concept portraits and the game’s user interface. Upgrading company tools was the first step to helping his team top the effort they had put forth on Pillars of Eternity. “When we started Pillars II, we identified what areas of the pipeline needed upgrading. I was able to contribute a lot on the rendering side of things for the environment artists,” he said.
The computers Aruga and other artists had used on Pillars of Eternity hadn’t exactly pushed technological envelopes. Their rendering farm, an assortment of computers dedicated to converting 3D art into pre-rendered backgrounds, had been prone to running out of memory and crashing in the middle of working through rendering queues.
Aruga examined Maya, Obsidian’s software of choice, and identified a bottleneck. Polygonal set pieces such as a cliff face had been duplicated to save time, but duplicating all those polygons had bloated scenes and caused render farms to buckle. “These are, by the way, eighteen-hour renders,” he explained. “One crash sets you back two workdays, and that's just to get a render. That was not acceptable.”
On Deadfire, Aruga worked with technical artist Antonio Govela to eliminate or at least smooth over potholes. Their solution was to save reference points for where geometry should be inserted, cutting down on the weight of each scene and speeding up render times.
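The fix Aruga and Govela describe—storing a reference point plus a transform instead of duplicated polygons—is the classic instancing pattern. A minimal sketch, with names and structures invented for illustration rather than taken from OEI Tools or Maya:

```python
from dataclasses import dataclass, field

@dataclass
class Mesh:
    # One authoritative copy of the heavy vertex data.
    name: str
    vertices: list[tuple[float, float, float]]

@dataclass
class Instance:
    # A lightweight reference point: which mesh, and where it sits.
    mesh: Mesh
    position: tuple[float, float, float]

@dataclass
class Scene:
    meshes: dict[str, Mesh] = field(default_factory=dict)
    instances: list[Instance] = field(default_factory=list)

    def place(self, mesh: Mesh, position: tuple[float, float, float]) -> None:
        # Register the mesh once; every further placement costs only a
        # reference and a transform, not another copy of its polygons.
        self.meshes.setdefault(mesh.name, mesh)
        self.instances.append(Instance(mesh, position))

    def unique_vertex_count(self) -> int:
        # Scene weight is driven by unique geometry, not placements.
        return sum(len(m.vertices) for m in self.meshes.values())
```

Placing a 10,000-vertex cliff fifty times by duplication would store 500,000 vertices; instanced, the scene stores 10,000 vertices plus fifty tiny transforms, which is why the change lightened scenes and sped up render times.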
Such fixes occurred under the hood of OEI Tools and the Pillars of Eternity engine, a hodgepodge of adjustments and enhancements to Unity that Obsidian’s programmers had made over the course of building the first game and revamping their tools for the sequel. “On Pillars II, we had a lot more support to do the things we wanted to do, which has been great,” said John Lewis, lead visual effects artist on Deadfire. “But one drawback of that was because we had a much larger toolbox, there was a larger margin for error. We had to be very careful as far as performance went, and as far as getting carried away.”
Adam Brennecke wanted artists to get carried away, within reason. Development on Deadfire began roughly eight months before the Fig campaign launched in January 2017. That was the period during which the game’s programmers widened engine bottlenecks across every division. One of Brennecke’s jobs was to lead construction of a unified lighting pipeline that would elevate Deadfire’s visuals. “Pillars II was upgraded to Unity 5, and in Unity 5 they have a new rendering pipeline that uses PBR shaders, which offers better and more photorealistic shading and lighting,” he said.
A new feature in engines such as Unity 5, PBR is short for physically based rendering, a system of displaying a game’s environments and actors—such as characters, items, and other objects—under a unified visual style independent of the type of lighting in a scene. “I also had this grand idea of doing pre-baked lighting in Maya, rendering out lighting in Maya using Unity's light probes,” Brennecke continued. Light probes function as lamps, introducing high-quality lighting on moving objects in a scene. “Unity does that with their lighting system, and I wanted to see if we could figure out how to take that out of Unity and put it into Maya. That was crazy, and somehow we were able to do it.”
PBR was one of many new features that modernized Deadfire’s graphical fidelity. Although engines such as Unity and Unreal supported PBR, making the transition was an enormous effort. “A choice had to be made,” recalled Dimitri Berman, lead character artist. “We could stick with the style we'd made [Pillars 1] in, or we could update our whole engine pipeline to this new lighting and graphics system.”
Berman and other artists were on board with any jump that would streamline development of art for Deadfire. Making such a technological leap would require a lot of work, but the benefits would be readily apparent to their players as well as Obsidian’s team. “The downside of that was we basically had to rebuild stuff from scratch. We couldn't just bring stuff over from Eternity as-is, which we wanted to do originally,” Berman added. “Knowing that, and knowing we still had to make a huge new game that had to be beautiful and awesome, we doubled down on making our tools more efficient.”
Eight months and another smash-hit crowdfunding campaign later, the art team had hit their groove. Although Berman recalled most processes on Deadfire taking approximately the same amount of time as processes on Pillars of Eternity, the fine-tuning enabled the artists to create more content—the same benefit the audio team had enjoyed. “In many ways it's easier to make stuff now than before. It looks better and is easier to make,” Berman said. “That's one of the reasons we wanted to switch up our pipeline. It's also faster to iterate and make sure stuff you put in the game looks the way you want it to look.”
Shoring up Pillars II’s art pipeline cleared the way for Aruga and other artists, such as tools expert Antonio Govela, to implement standards that facilitated better and faster art development. “Because I'm an artist and also a lazy artist, I know that when these standards roll out, an email is sent, or maybe it's a team meeting, and they're mentioned once, and then leadership forgets about them, and the whole team forgets about them. So, I tried to meet the team in the middle,” Aruga said.
Standards smoothed over deceptively complex speed bumps. Crafting small props such as chairs, for instance, had aggravated Bobby Null and other designers on Pillars 1. Chairs were 3D objects that had to interact with other 3D objects and fit on a 2D backdrop: Anytime a scene included a character who could sit on a chair, the chair had to be set within the 2D background. “We have an automated system for that, now. That was one of our biggest constraints: Not knowing where something lived in 3D space,” said Lewis.
Toward the goal of removing guesswork and complexity from art development, the artists built a single chair according to an agreed-upon height and width. Those measurements could be copy-and-pasted, after a fashion, onto any type of chair in the game. Additionally, the artists rendered a single sitting animation that came with a guideline: No matter the type of chair an artist created, it had to be compatible with that animation.
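The standard amounts to a compatibility contract: any chair must present its seat at the agreed-upon dimensions so the single shared sitting animation lines up on it. A hypothetical sketch of such a check—the dimension values and tolerance are invented for illustration, not Obsidian's actual numbers:

```python
# Agreed-upon seat dimensions (illustrative values, not Obsidian's).
STANDARD_SEAT_HEIGHT = 0.45  # meters
STANDARD_SEAT_WIDTH = 0.50   # meters
TOLERANCE = 0.01             # allowable deviation before the animation misaligns

def fits_sit_animation(seat_height: float, seat_width: float) -> bool:
    """True if a chair model's seat matches the shared standard closely
    enough for the one sitting animation to line up on it."""
    return (abs(seat_height - STANDARD_SEAT_HEIGHT) <= TOLERANCE
            and abs(seat_width - STANDARD_SEAT_WIDTH) <= TOLERANCE)
```

Automating a check like this is the kind of "weeding out busy work" Aruga describes: the standard enforces itself instead of living in a forgotten email.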
“That's something I learned along the way that ended up being really helpful,” Aruga continued. “When you roll out standards, and because they can be tedious, managers should invest time weeding out busy work. Otherwise I wouldn't expect anyone to follow the standards.”
Codifying small details such as chair dimensions and animations gave Lewis time to enhance the sense of verisimilitude in each scene. Rays of light shine through windows. A haze of heat shimmers over vents in forges. “If you're in a dungeon, I make sure there's dust flowing through the air so when you walk by a torch, you can see a thickness to the air,” Lewis said. “Things like that add subtle movement you may not consciously notice, but your subconscious tells you, ‘This scene is alive.’”
Instead of animating an effect, waiting for it to show up in the game, observing it, and then going back into his art program, Lewis took advantage of pipeline advances by testing and refining effects on the fly. “You can run the game while you're in an editor, and I'll build an effect while the game is running. If I make changes, I can paste them into the preview of the game. That's a good way to see how an effect is supposed to look with all of its post-processing effects and everything, while you're building it in real-time,” he said.
“We made a lot of pipeline upgrades and streamlined processes so that a lot of things could be automated, as it were,” said Aruga. “That way artists aren't doing any error-prone and tedious work.”
Reuse and Remake
Dimitri Berman knew going back to the drawing board had its advantages. For one thing, he and the character art team—still small and scrappy at just three artists, James Chea and Ian Randall in addition to Berman—weren’t stuck with most of the skeleton rigs and character models they had made for Pillars of Eternity.
“The only assets reused from Eternity I were some of our creatures. We still updated them, but at least we didn't have to remake them from scratch,” Berman said. “We had to remake everything to make it much higher visual fidelity, because in Deadfire you get a lot closer to your characters.”
Higher fidelity accommodated greater detail. Some headgear looked weird in Pillars of Eternity because hair had remained fixed in the same shape even when players equipped helmets, as if adventurers used an entire bottle of medieval hairspray before entering battle. “On Deadfire, they swap your hair out with a partial hairstyle for helms, which took a little while for us to make and implement. We didn't have an option for that on Eternity I. We just didn't have the resources,” Berman said.
Berman revisited the godlike with enthusiasm. He and his team shared a common goal of reimagining them without losing core features such as the head growths that distinguished fire, death, and other sub-types. “I like to go back and revisit all our races, and see how much more unique we can make them, lore-wise and culturally, just to make them feel richer.”
While working on the godlike, Berman was reminded that fine details which looked great up close in the paper-doll view did not hold up when seen from above via the game’s isometric camera. “The first nature godlike I made, the males had flowers in their beards. But when we looked at them in-game, it kind of looked like they had bird poop on their heads,” he said. “The hard part is when we're designing characters, how they're going to look, we take into account that you'll see them from two really different viewing angles: close-up, and from the top-down, faraway camera.”
With the upgraded tools, Berman could do anything he wanted, but the development timetable simply would not allow him to experiment to his heart’s content. One of his initial goals was to add scars, tattoos, and other physical characteristics to sub-races to set them apart. His team of three lacked the bandwidth for such detail-oriented work. “It's on the back burner for now, but some dwarves in Deadfire have piercings on their faces. We think that's a cool thing. I introduced stuff very gradually to separate all the [races] culturally, and I would like to explore that even more in whatever we do next in Eternity,” he said.
Berman explained that an important consideration when designing characters for Deadfire was to be wary of adding too many details. Not necessarily because the engine couldn’t crunch all the numbers, but because there’s a fine line between subtle detail and visual noise. Refining and removing on-screen clutter was a top priority. “Even looking at a single character can look too noisy, so we tried to [simplify details] before you see characters together in a party. Characters must read nicely on an individual basis,” said Berman.
Sawyer had even made the controversial decision to drop the player’s party count from six characters to a total of five toward the goal of cleaning up visual noise. “That was based on a variety of concerns,” Sawyer explained. “The concern was not, 'Six party members is too many,' but there was a larger concern of keeping track of everyone. The clarity of combat could be very hard to follow. There are a number of things we did to make the pacing of combat better, to make it easier for you to use all of your party members. One of those decisions was to go from six to five party members.”
Berman hit on other ways to diversify Eora’s races without cluttering up the screen or individual character models. “We redid cloth physics for a lot of capes, but that doesn't really change anything. We really polished things, the way they look and function. We added scabbards to characters and other small stuff like that.”
Deadfire expanded the first game’s set of cloth-based outfits from around nine to over thirty. While every character still resembled a humanoid, changes such as modifying character proportions and generating more styles of clothing and items further differentiated races. “They use completely different templates. If you cycle through all the sub-races in Deadfire, they shouldn't look anything alike, and you shouldn't confuse them visually when you're looking at them up close.”
Every upgrade and experiment performed on Deadfire’s characters amounted to valuable learning experiences for Berman and his fellow artists. “Even if we were to make a different style of game, like a first- or third-person title, we would take what we learned on Eternity I and II and apply it to that,” he said.