From the outset of the project we knew that the main plot would keep the lore high level, so as not to overwhelm players with the minutiae of fictional history and culture. At the same time, we also wanted to give players who wanted to dive deeper into Aveum the opportunity to do so. To do this, Michael Kirkbride (lead writer) wrote optional conversations with our main cast, as well as dialog you could overhear between them, journal entries told from Jak's point of view, in-world texts, and my favorite: the "Palathon NPCs," or Pals for short. The Pals are characters who are located in our main base (the Palathon, and later the Glaivegate) and are entirely optional vehicles to enrich the player's understanding of the world and the story.
The process of creating the Pals started with a series of brainstorming sessions between Michael, Shawn Lucas (lead level designer), Steve Devaney (writer), and myself (acting both as a producer and a narrative designer). Over a series of meetings we brainstormed the number of Pals, their identities, and the general structure of the feature. How many of them were there, and who would they be? How often would they offer new conversation topics? How many different topics and greetings would they have? What would they reveal about the world, and what would their personal biases be? Would it be possible to lose out on information if you did not talk to them every chapter? Were they solely for additional world background, or would they give you gear or side quests? All of this was hammered out by some combination of the four of us. In the end, we settled on:
After this was all decided, Michael wrote up the basics of who each character was and what they would talk about, including rough stubs of their topics for every chapter. At the same time, I made a chart to explain this flow to everyone. (Production note: I’m simplifying things here a bit. We actually decided this stuff a few different times, re-evaluating the amount and structure of the content along the way.)
Following a thumbs up from other stakeholders, Michael wrote the scripts and it was time to get the Pals into the game.
Articy forms the backbone of our narrative pipeline. Every single bit of text is entered as a node in Articy, where it’s stored alongside its metadata and then exported into JSON and C++ for use in Unreal. Representing the dialog flow and its conditions in Articy, and entering each individual line of the script, was therefore the first step in bringing the Pals from page to screen. Our version of Articy is heavily modified by our amazing tools lead engineer, Karen Petersen, with each type of node carrying different information. I laid out the structure of the conversations and our narrative coordinator, Richard Heath, did the work of inputting each individual spoken line. Here’s the dialog flow for Silas Mede as seen in Articy, with each type of node labeled for explanation below.
At the very left, in green, we have the Start Nodes (A). These nodes are what gets triggered in Unreal, but they contain no content of their own. The idea behind them is to make things as seamless as possible for level design (they only need to know and maintain one node, not an entire network), and to allow the most changes with the least disruption (the entire script could be rewritten and the whole flow completely changed, but as long as it all starts at this one node, design doesn’t need to lift a finger). The Pals each get two start nodes, one for the Palathon and one for later on in the Glaivegate. I could have used only one Start Node here, but since the content for the Palathon and Glaivegate is nearly wholly separate, I opted to have two and save myself from needing to add more conditional tracking.
The magenta nodes (B) denote conditional statements that the game needs to check. The very first thing I needed to check was which greeting to play. The start node narrows this down a bit (the Palathon start node can ONLY play the Palathon greeting OR the repeat hello), but we still needed to figure out whether the player was talking to the character for the first time (play the greeting) or if they'd spoken previously (play the repeat hello).
There are a number of ways that this could be done, all hinging on the interaction between Articy and Unreal. Articy has no concept of anything happening in Unreal, and Unreal only has a concept of Articy as json data. Early on, Karen and I made the choice to have Articy drive everything related to narrative and text for a number of reasons:
Therefore, everything related to narrative, including conditional content, lives in Articy. Or rather, there’s a visual representation in Articy that is then converted via a Jenkins build process into a C++ file or a .uasset, which is then read by Unreal to know what to do.
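As a rough illustration of what the Unreal-facing side of a pipeline like this consumes, here's a toy Python sketch. The field names and export shape are entirely made up for the example; the real export format is internal to the project.

```python
import json

# Hypothetical, simplified shape of an exported dialog graph: each node
# has an ID, a type, optional text or a condition expression, and the IDs
# of the nodes its output pins connect to.
EXPORT = """
[
  {"id": "0x0100002000012FCC", "type": "DialogueFragment",
   "speaker": "Silas", "text": "Welcome to the Palathon.", "next": []},
  {"id": "0x0100002000012FCD", "type": "Condition",
   "expression": "HasNodeExecuted(getObj(0x0100002000012FCC))",
   "next": ["0x0100002000012FCC"]}
]
"""

def load_nodes(raw):
    """Index exported nodes by ID so a runtime can walk the flow."""
    return {node["id"]: node for node in json.loads(raw)}

nodes = load_nodes(EXPORT)
print(nodes["0x0100002000012FCD"]["type"])  # -> Condition
```

The point of the indexing step is that design only ever holds a node ID (the Start Node), and everything downstream of it can be rewritten in Articy without touching the level.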
All that said, the flow here is actually really simple. First, the start node is triggered by a designer. It progresses to the conditional node, which asks the system to check if the first line in Silas' greeting has already played: HasNodeExecuted(getObj(0x0100002000012FCC)). If it hasn't, the flow moves through the red pip and plays the greeting. If it has, it moves through the green pip, and plays the return hello.
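The greeting check above can be modeled in a few lines. This is a toy sketch, not engine code: the real check is the Articy expression evaluated by Unreal, but the logic reduces to "has this node's record been written to the save?"

```python
# Stand-in for the per-save record of which dialog nodes have executed.
executed_nodes = set()

def has_node_executed(node_id):
    """Toy equivalent of HasNodeExecuted(getObj(...)) in Articy."""
    return node_id in executed_nodes

def pick_greeting():
    # Red pip (condition false) -> play the full greeting and record it.
    # Green pip (condition true) -> play the repeat hello.
    if has_node_executed("0x0100002000012FCC"):
        return "repeat_hello"
    executed_nodes.add("0x0100002000012FCC")
    return "greeting"

print(pick_greeting())  # -> greeting      (first conversation)
print(pick_greeting())  # -> repeat_hello  (every later conversation)
```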
The Conditional Nodes on the right of the diagram (E) control when each dialog option unlocks. In the scripts and in the original design, the different options unlocked based on mission progression: finish Chapter 4, new topic becomes available. Initially, I implemented it the same way in Articy, but that immediately ran me into several problems. One, where each chapter begins and ends in the writing scripts versus the game implementation is slightly different. Two, while mission objectives are straightforward as far as players are concerned, their setup is tricky internally. There are many objectives that are invisible to the player (for example, in Chapter 1, the player only sees one objective, but there are actually 36 separate ones), and the exact procedure that the designers use for starting/completing the objectives isn’t always consistent. Therefore, what the HUD says just isn’t a reliable way to tell whether the under-the-hood tracking I’d hooked into for the topic unlocks had actually fired. That meant that once I tested my implementation and a topic didn’t unlock, I couldn’t easily tell whether the problem lay in my setup or in the way the designer had scripted it.
Instead, I chose a method that’s a lot more straightforward for testing: checking that a line of dialog had played. If I heard a line of VO play in game, the topic should open up. If I heard the line but no topic unlocked there was only one place it could have failed, eliminating the need to check the logs or dig through the various setups in the level to figure out what was going on.
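The scheme above boils down to one flag per topic. Here's a toy model of it; the topic names and node IDs are made up for the example, since the real ones are internal to the project.

```python
# Each topic is gated on a single "this VO line has played" flag,
# rather than on mission-objective state. IDs are illustrative only.
TOPIC_UNLOCK_LINES = {
    "chapter_4_topic": "0x0100002000045ABC",
    "epilogue_topic": "0x0100002000099DEF",
}

# VO lines the player has heard so far on this save.
played_lines = {"0x0100002000045ABC"}

def unlocked_topics():
    """A topic is available exactly when its trigger line has played."""
    return [topic for topic, line_id in TOPIC_UNLOCK_LINES.items()
            if line_id in played_lines]

print(unlocked_topics())  # -> ['chapter_4_topic']
```

The testing win is exactly what the paragraph describes: if you heard the line but the topic is missing, there is only one check that could have failed.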
We’ve talked about how the game knows what to play; now let’s talk about how it actually plays things. The dark purple node is a Choice Node, which tells the game that the nodes after it should appear in the conversation UI as selectable choices. They come in two flavors. The standard Choice Node (C1) requires the player to pick one of two options, and has a timer (Telltale style). Once one option is picked, the player cannot return to this choice. C2 is an Exhaustible Choice, where the player is allowed to pick as many options to talk about as they wish. Once an option is chosen its dialog plays out, and then we cycle back through C3 to the exhaustible choice hub to pick another option.
The light blue (D1), light purple (D2), and gray nodes (D3) are each a type of Dialog Sequence Node. All of them have individual Dialogue Fragments inside, one per dialog spoken by each character. These are run through an automated process that generates Unreal Sequence files for each one, inside which we combine audio, automated lipsync, and hand-placed animations via Animation Blueprint events to create the acting seen in game. (We also have the option of running only audio, for rapid iteration prior to the completion of the animation sequences.)
The different colors of dialog nodes indicate their behavior in the dialog system.
Light blue nodes (D1) are regular dialog sequences: they play one after the other with no special behavior and no choice option displaying on screen. The intro and hello dialog at the beginning of the conversation are dialog sequence nodes. The dialog sequence after the dark purple choice node is what we call an “idle” node. It is dialog that plays as the player is picking what choice they want to select.
Light purple nodes (D2) are choice sequences: the text in their body will display as a choice option for the player to pick, and the dialog will play out only if the choice has been selected.
Gray nodes (D3) are dialog that plays if the player lets the choice timer run out without picking a choice.
In our chart, we see that right after the intro/hello dialog we hit a binary choice with a secret silent option. After an option from that tree plays, we hit an exhaustible choice with somewhere between 1 and 5 choices, depending on how far through the game the player has advanced. There is also a “bye” option that the player can choose to exit at any time.
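The exhaustible hub described above is a simple loop: pick a topic, play it, remove it, return to the hub, with "bye" as an exit at any point. Here's a toy sketch of that flow (names are invented for the example):

```python
# Toy model of the exhaustible choice hub (C2/C3): picked topics are
# consumed, the hub re-presents the rest, and "bye" exits at any time.
def run_hub(topics, picks):
    """topics: currently unlocked options; picks: the player's selections."""
    remaining = list(topics)
    played = []
    for pick in picks:
        if pick == "bye" or not remaining:
            break
        if pick in remaining:
            played.append(pick)      # play that option's dialog sequence...
            remaining.remove(pick)   # ...then cycle back to the hub
    return played, remaining

played, remaining = run_hub(["the Order", "Lucium", "magic"],
                            ["Lucium", "magic", "bye"])
print(played)     # -> ['Lucium', 'magic']
print(remaining)  # -> ['the Order']
```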
Here is how it all works in game. This is at the very start of Chapter 6, so only one dialog option is unlocked here.
With the content all in Articy, the work moved into Unreal. The Pals got models, materials, tech art. We cast actors for them (Silas actually featured in a cinematic early in the game, so his actor reprised the role) and recorded their lines. Then we ran everything through automation to create the dialog sequences, timed the lines, and added acting. Meanwhile, designers placed the Pals in the levels and ensured that they were available (or not) at appropriate times, set up the interacts, and set up the space so they weren’t just standing there doing nothing until the player walked up. Finally, it came time for an absolutely essential, but often overlooked, part of game development: testing.
Surprisingly, given how complicated this whole thing is, everything mostly worked as intended. Unfortunately, there were a few caveats for the testing itself.
One: Cinematics aren’t part of our narrative system, so they can’t be used to trigger these dialog updates. Most of the chapter changes already had associated Conversations and Farcalls that I could hook into; a few didn’t, so I had to hook into lines further from the chapter change than I would have liked. For example, for the topic that unlocks during the epilogue, I had to hook into a Farcall in the boss fight level, which meant that if you wanted to test the epilogue topics in game you had to play through 10-15 minutes of game instead of jumping right to it. In Editor, that time would be even longer, since you had to fully load two totally separate levels. Not a huge problem, but annoying when you’ve got to do it a bunch. These sorts of problems were mitigated by creating a conversation test level, which included all of the conversations and Farcalls in the game in a single lightweight environment. Not a full replacement for testing in game, but it did help speed things up.
Two: our save system. What gets saved (or doesn’t get saved), and when, is in my opinion the least thought about, but most important, feature of a game. In our case, we only save at the start and the end of a Farcall or Conversation. 98% of the time this is enough – if you quit in the middle of a Farcall, you just have to hear part of it again. The other 2% was what happened if you interrupted the Farcall through a level load, such as fast traveling, and the Farcall wasn’t set up to resume afterward. Since the “I’ve finished a Farcall” save never fired, the game did not remember that you’d ever heard the VO line that was coded to unlock the topic, so the topic never unlocked. Unfortunately, this case was not discovered until late in development, and reworking how our saves function (or the layout of our levels) was not an option at that time. Instead, we extended a system we already had to lock portals while Farcalls were ongoing, and had designers manually lock any other doors that would lead to a transition during a Farcall. While it’s not an ideal player experience to wait for a Farcall to finish before you can get out of a level, it does prevent people from accidentally missing out on content (or achievements!) if they’re rushing.
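The failure mode is easiest to see as a tiny model: the "line played" record only persists when the end-of-Farcall save fires, so a level load mid-Farcall loses the unlock. This is a toy sketch with invented names, not our actual save code.

```python
# Toy model of the save-timing bug: lines heard mid-Farcall live only
# in memory until the end-of-Farcall save commits them to the save file.
def play_farcall(save, line_id, interrupted):
    heard = {line_id}        # heard in-memory while the Farcall plays
    if not interrupted:
        save |= heard        # "I've finished a Farcall" save commits it
    return save              # on a mid-Farcall level load, nothing commits

save_file = set()
save_file = play_farcall(save_file, "0x01_EPILOGUE_LINE", interrupted=True)
print("0x01_EPILOGUE_LINE" in save_file)  # -> False: topic never unlocks
```

Locking portals during Farcalls removes the `interrupted=True` path entirely, which is why it fixed the bug without touching the save system.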
Three: explaining how all of this worked actually ended up being pretty difficult, both because it’s a pretty complicated system (what unlocks where and how) and because it doesn’t work with the way we would usually test things. The majority of progression in game is set up through a system called DLDA (Dynamic Listener Dynamic Action). Something triggers an event, something else is listening for that event and responds with an action. When we make fake debug saves for testing progression throughout the game, we can artificially manipulate these DLDA events to have all of the proper progression (combats, dialogs, cinematics, puzzles) be in their proper state without needing to play the game. You can obtain a piece of gear in Chapter 4, debug forward into Chapter 10, and both your progress and the gear you found will work just fine. The Pals bypass that system entirely, however, and rely on having saved specific lines of dialog in the player’s save file. So talking to them in Chapter 4 and debugging into Chapter 10 to test them like any regular content just didn’t work (since you’re not carrying over your save at all). Instead, you had to play all the way through the content for real, or use real saves created by other players.
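The listener/action pattern behind DLDA is a standard event-bus shape. Here's a toy sketch of that pattern; the event names and API are invented for illustration, since the real DLDA system is internal to the game.

```python
from collections import defaultdict

# Toy event bus: listeners register against named events, and firing
# an event runs every registered action in order.
listeners = defaultdict(list)

def listen(event, action):
    listeners[event].append(action)

def fire(event):
    for action in listeners[event]:
        action()

log = []
listen("combat_3_won", lambda: log.append("unlock_door"))
listen("combat_3_won", lambda: log.append("start_cinematic"))
fire("combat_3_won")
print(log)  # -> ['unlock_door', 'start_cinematic']
```

Because debug saves can replay these events artificially, anything built on them can be fast-forwarded; the Pals, keyed on "has this exact VO line played" in the save file, can't be.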
Immortals of Aveum is billed as a high octane, cinematic first person shooter. That’s not the sort of game genre where you expect players to pause to talk to NPCs that have nothing to offer them except some great dialog and interesting worldbuilding, but we were able to execute on this feature with minimal staffing and support and have it work really dang well. Players loved them internally and now that the game has launched, I’ve also seen reviewers calling out details from these conversations in their praise of the writing, which just feels real nice. I hope that in future titles we get to do more of this sort of thing.