Devblog 2: Node-Controlled Team AI in Blueprints & Music Design

1. Node-Controlled Team AI

For the friendly BLISTER team AI we needed officers that would follow exact orders set by the player. The amount of “thinking” they do is quite minimal, since they need to follow the set plan and not deviate from it. The challenge is getting this to work in real time while the insurgents are essentially fully dynamic AI.

The heart of most AIs in Unreal Engine lies within a structure called a “Behaviour Tree”. It’s not unique to UE in any way but when I say behaviour tree here I’ll be referring to UE4’s specific implementation.

The BLISTER Team AI behaviour tree

Just above is the full behaviour tree for the BLISTER officers. It looks daunting, but it’s quite simple once you break it down into its constituent pieces. It certainly looks nicer than the material from last week. Let’s go through it assuming that you have no idea about behaviour tree functionality itself but have some reasonable knowledge of blueprints and programming principles.

As always, I’m not any kind of brilliant blueprints/AI master, so if you have any ideas, please tell me how you would improve this behaviour tree yourself.

The behaviour tree basically tries to find a path until it reaches a leaf (a node with no children) and then runs that leaf. There are exceptions, and we’ll see them as we go along.
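
If behaviour trees are totally new to you, here’s a very rough plain C++ sketch of what the building blocks boil down to, just to help you read the diagram. This is a conceptual model only, not how UE4 actually implements them (the real nodes tick over multiple frames, can abort, and so on):

```cpp
#include <functional>
#include <vector>

// Conceptual model only; UE4's real behaviour tree classes are far richer.
enum class NodeResult { Succeeded, Failed };

struct Node {
    // A leaf's run() does actual work (move, shoot, open a door) and reports
    // success or failure; a composite's run() walks its children.
    std::function<NodeResult()> run;
};

// Selector: try children left to right, stop at the first one that succeeds.
Node MakeSelector(std::vector<Node> children) {
    return { [children]() {
        for (const Node& child : children)
            if (child.run() == NodeResult::Succeeded)
                return NodeResult::Succeeded;
        return NodeResult::Failed;
    } };
}

// Sequence: run children left to right, stop at the first one that fails.
Node MakeSequence(std::vector<Node> children) {
    return { [children]() {
        for (const Node& child : children)
            if (child.run() == NodeResult::Failed)
                return NodeResult::Failed;
        return NodeResult::Succeeded;
    } };
}
```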

First, we start at the root before moving to the “Simple Parallel”. Ignore the big green thing near the top of the image for now; that’s a BTS (Blueprint Task Service), and we’ll talk about it in a moment.

The Simple Parallel just runs the leaf connected on the left while at the same time running whatever it’s connected to on the right. Inside the “attack” node (just like every other purple node) is a blueprint with some code in it.

In this case the Simple Parallel takes the target actor and, if it’s able, will try to shoot at the target. Since we want the agents to be able to shoot at almost any time, this is always running alongside whatever task they are doing.

There are also situations where the officers cannot shoot: if an officer is opening a door at a door node, the attack node is still running in parallel but is 'blocked' by the code that is written within the node itself. 
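
As a very rough sketch (in plain C++ rather than blueprints, and with made-up names that aren’t in our actual project), the parallel attack leaf boils down to something like this each tick:

```cpp
// Hypothetical sketch of what the parallel "attack" leaf does each tick;
// none of these names come from the actual BLISTER blueprints.
struct Target { bool valid = false; };

struct Officer {
    bool busyBlockingAction = false;   // e.g. opening a door at a door node
    bool hasLineOfSight     = false;   // filled in by the vision service
};

void TickAttackLeaf(Officer& officer, const Target& target)
{
    // The Simple Parallel keeps this running alongside whatever task the
    // officer is doing, so they can return fire at almost any time...
    if (!target.valid || !officer.hasLineOfSight)
        return;

    // ...unless the code inside the node blocks it (door opening, etc.).
    if (officer.busyBlockingAction)
        return;

    // Fire at the target here.
}
```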

The Blueprint Task Service for the friendly AI officers

Now let’s talk about the big green thing: the BTS.

This special type of node is executed when the execution path hits the node it is attached to. The BTS then runs every tick of the behaviour tree until the officer dies. This service handles the officer’s vision and fills out all of the behaviour tree variables that the leaf nodes use.

How this whole blueprint works could be (and might be!) its own blog post, so all you really need to know is that it handles the officer’s cone of vision: checking what the officer can see, whether the person it sees is an enemy or not, and so on.
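
To give a flavour of it, here’s a heavily simplified, hypothetical C++ sketch of what a vision service like this does each tick: sweep a cone, decide if anything hostile is inside it, and write the result into the variables (the blackboard, in UE4 terms) that the leaf nodes read. The real BTS is a blueprint and does considerably more than this:

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch of a per-tick vision service; names and data layout
// are invented for the example.
struct Vec2 { float x = 0, y = 0; };

struct Person {
    Vec2 position;
    bool isEnemy = false;
};

// The variables the leaf nodes read, i.e. the blackboard.
struct Blackboard {
    const Person* targetActor = nullptr;
    bool canSeeEnemy = false;
};

static float Dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }
static Vec2  Normalised(Vec2 v)  { float l = std::sqrt(Dot(v, v)); return { v.x / l, v.y / l }; }

void TickVisionService(Vec2 officerPos, Vec2 officerForward,
                       const std::vector<Person>& people, Blackboard& bb)
{
    const float halfConeCos = std::cos(45.0f * 3.14159265f / 180.0f); // 90° cone, assumed
    bb.targetActor = nullptr;
    bb.canSeeEnemy = false;

    for (const Person& p : people) {
        Vec2 toPerson = Normalised({ p.position.x - officerPos.x,
                                     p.position.y - officerPos.y });
        // Inside the vision cone and hostile -> becomes the attack node's target.
        if (Dot(Normalised(officerForward), toPerson) > halfConeCos && p.isEnemy) {
            bb.targetActor = &p;
            bb.canSeeEnemy = true;
            break;
        }
    }
}
```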

Here is the most important part of the behaviour tree:

The first node in the image above is the Selector.

The Selector goes through each possible path from left to right and restarts when one of the leaves returns “successful”.

Say it goes down the first path on the left: it’ll hit this “sequence” node with this blue part attached. The blue part is a blackboard decorator. Don’t ask me why it’s called that. 

What this blackboard decorator does is take a variable, in this case the current task (an enum), and check whether it’s equal to “MoveToNextWaypoint”. If it is, the sequence can execute; otherwise it returns failed and the Selector moves on to the next path.
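
In code terms, the decorator’s check is barely more than an equality test against a blackboard value. A tiny sketch, assuming a hypothetical task enum (only the task names mentioned in this post are real):

```cpp
// Hypothetical; the real check is a built-in blackboard decorator configured
// in the behaviour tree editor, not hand-written code.
enum class ETask { MoveToNextWaypoint /*, FollowPlayer, door/breach tasks, ... */ };

// If this returns false the decorator fails, the sequence below it never runs,
// and the Selector moves on to the next path.
bool DecoratorPasses(ETask currentTask)
{
    return currentTask == ETask::MoveToNextWaypoint;
}
```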

If it is successful then we get to the next blueprint, which handles making the agent move to the next waypoint. If this returns successful (which happens when the officers reach their destination) then it’ll move on to the next node, because a sequence within the behaviour tree runs through all of its paths until one fails.

If the movement fails, the next leaf node isn’t executed and we return to the sequence from before and start again. But let’s say the movement is successful: we then reach the “NextTask” node. This is the second node of every path from the sequence (except FollowPlayer, but that’s special and won’t be used during a breach). The “NextTask” blueprint finds the next task in the waypoint that the agent is currently attached to, and if there is none it advances to the next waypoint. The whole tree then starts again.
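
Roughly, and with a completely hypothetical data layout (the real waypoints are actors the player places with the drone, and the task names other than MoveToNextWaypoint are invented here), the NextTask logic amounts to:

```cpp
#include <cstddef>
#include <vector>

enum class ETask { MoveToNextWaypoint, OpenDoor, HoldPosition };  // OpenDoor/HoldPosition invented

struct Waypoint {
    std::vector<ETask> tasks;       // the tasks the player attached to this waypoint
};

struct Agent {
    std::size_t waypointIndex = 0;
    std::size_t taskIndex = 0;
    ETask currentTask = ETask::MoveToNextWaypoint;
};

void NextTask(Agent& agent, const std::vector<Waypoint>& plan)
{
    if (agent.waypointIndex >= plan.size())
        return;                                     // plan finished

    const Waypoint& wp = plan[agent.waypointIndex];

    if (agent.taskIndex < wp.tasks.size()) {
        // There is another task at this waypoint: make it the current one.
        agent.currentTask = wp.tasks[agent.taskIndex++];
    } else {
        // No tasks left here: advance to the next waypoint and let the tree
        // restart from the top with MoveToNextWaypoint.
        agent.waypointIndex++;
        agent.taskIndex = 0;
        agent.currentTask = ETask::MoveToNextWaypoint;
    }
}
```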

So, the “task” enum controls the behaviour of the officer, and the task is set by waypoints placed by the player during the planning phase before a breach. The point is that you get to make plans using a sick drone, and the player never sees this behaviour tree; the officers just appear to understand the strict instructions the player sets for them.

To sum up: at a high level the behaviour tree is simply fed a task enumeration, and based on that enumeration it chooses a path that ends in a leaf. Those leaves each contain specific code that directly affects the officer's behaviour.

There is plenty of work to do to improve the system but hopefully you learnt something from this explanation of how team AI works in BLISTER. 

Plotting the node points in action using the drone.

*A note: the team AI officers are called 'agents' in the blueprints, in case the difference between what is written here and what is within the blueprints was confusing you.

2. Music Design

Since the FPS rig isn’t completely set up yet, I thought I’d talk about something less related to game development but still important to the overall experience of a game: music.

It’s probably fallacious to say that people often overlook the music of a game because in reality it’s one of the first things people seem to talk about. There are plenty of games where music is not particularly important (Arma and other realistic simulators) but the vast majority of games have a decent score, even the little ones.

BLISTER’s music is not particularly developed right now, but this is one of the main areas I am personally working on and we’ve got some embedded examples of previous prototype music that we’ve created. I want to talk about that before discussing what’s next for the music of BLISTER.

We decided to make all the music and much of the important audio in-house, so while we’re in development a lot of the sound design remains unfinished, but in the end we’ll have a product with its own unique sound signature. Chief among this is the music, which is more than just filler: the music of BLISTER is designed to create tension and then resolve it in a ‘drop’ at the moment the player kicks down the breach door and lets all hell loose.

We’re all metalheads at Item_42 so our aim is to blend that cool, tactical, epic, electronic score endemic to action games and movies like GRAW and Mission Impossible with a heavy, djent-like drop at the breach point.

The tension in BLISTER is less straightforwardly consistent than something like Rainbow Six, so our music is designed to clearly demarcate walking around a creepy building, setting up a plan with a drone in a building crawling with insurgents, and executing that plan in a blister of bullets.

Below is the first example of the music I created with this plan / breach music style in mind. It’s somewhat primitive and straightforward but you get the idea.

 

This is the second example I created, very similar to the first. You can hear how the tension builds and climaxes into the heavy riff. Again, it’s not fantastic but it gets the job done.

 

The final example below is music created by Joe Campbell-Murray, whom we have enlisted to help us create more music (we’re also in a mad band together called ZILF, check us out). It has less of a production punch than the other songs, but this is the direction the music will be taken in, with a more electronic vibe and better production. The chaotic drop is, to my ears, absolutely perfect for some zany FPS action.

 

When I can mix and master at my own workstation I find it easier to get the production results I want, so much of the music will be mixed/mastered here to avoid some of the production issues in the last example.

Many game studios and indies outsource their music development, and that is a sensible thing to do. It saves time, and there are very talented people out there who are much better than me at this sort of thing. However, we have a very specific goal in mind for BLISTER; we are musicians, and we know exactly what we need.

When the demo drops on Steam hopefully before Christmas, the basic ‘final’ music will have been created and it will add to the gameplay experience. BLISTER is all about being tactical, making clever plans and being immersed in an action movie that you have personally directed, so music is an integral part of that experience.

With that in mind, here are the specifications for our final demo-ready music:

1. Hallway ambience - Each breach wing (where the action takes place) is surrounded by corridors where the story of the game unfolds. This music just needs to sound ambient and dark. Length: 4 minutes

2. Setting up at a breach door - This is where the player throws a drone through a vent and starts setting up a plan in a specific wing of the level. Here, the music needs to pick up the pace and sound tactic00l. Length: 4 minutes

3. Moment of breach - This is where the player kicks the door in and all hell breaks loose. It needs to be aggressive, adrenaline-pumping and heavy as hell. Length: 30 seconds

4. During the breach - This music plays after the 'moment of breach' music, while the player still has most of the insurgents to mop up. It should remain energetic but less full-on. Length: 4 minutes

5. Wing cleared - This is when every insurgent has been mopped up and the hostages have been taken care of. It should sound victorious; it’s essentially there to notify the player that they can wind down and that they’ve done a great job. Length: 5-10 seconds

When this music is ready to be shown off, I'll probably write another devblog on it and how it also technically works in-game. Next week we'll talk about the FPS rig and some more blueprint tutorials. Until then, cheers!

Regan & Bret

Devblog 1: Weapon Materials & Greyboxing

1. Weapon Materials

In BLISTER we want to have a huge array of guns. There are quite a few already (21 guns!) and we began to notice that the way we handled the materials for the guns was starting to take up a lot of storage space. Each gun had a single, monolithic material made up of a bunch of very high resolution textures.

In-game this looks fine, but it’s really inefficient storage-wise as well as taking up a lot of space in the texture pool, and not only because we were stupid enough to not pack the reflection maps. More than this, however, we wanted to be able to control the look of the guns more easily and iterate on them.

With the old system, we’d have to open up the material again in Quixel, make the changes we wanted and then re-export and re-import in order to preview them in-engine. This was time-consuming as shit.

Figure 1 The old material "system" aka: Bollocks

We decided to go about creating a new system for texturing our weapons, using the KRISS Vector as a testbed for this. The aim was to have a material system that combines baked normal and ambient occlusion with tiling materials.

The normal and AO are applied to the model as normal, but the rest of the texture comes from detailed tiling materials with many exposed parameters. It can’t just be a single tiled material though, since we’d have no interesting detail like wear on edges and in cavities, so the system uses two tiling materials, with one masked out to appear only in places where wear is appropriate.

Here’s the completed master material we ended up with; as always with materials, it’s far less complicated than it appears.

 Figure 2 New material system, definitely still needs improvement

Let's break it down:

The first thing you notice is that there are a lot of parameters, both scalar and 2D. This block is for the main tiling material; for a gun it’s usually some kind of painted metal or plastic, so we have the albedo in its own texture, “MainMaterialAlbedo”.

This is then multiplied by “MainMaterialColour”, which is set in the material instance so we can alter the colour of the main tiling material. The TexCoord/Multiply is for setting the scale of the main material. “MainMaterialRM” (RM stands for Roughness/Metallic) packs the two maps into the red and green channels respectively.

These both use a Blend_Screen node to control the brightness, and the roughness also goes through a CheapContrast node so we can change how the material reacts to light. So we can choose to make it less glossy, or have more contrast between the glossy and the matte parts, for example.
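
For anyone curious about the maths behind those two nodes, here they are written out as C++. This is my understanding of them rather than the engine’s exact code, and the parameter names in the usage comment are just illustrative:

```cpp
#include <algorithm>

// Rough C++ equivalents of the two nodes as I understand them; check the
// engine's material functions for the exact behaviour.

// Blend_Screen: brightens the base by the blend value and never darkens it.
float BlendScreen(float base, float blend)
{
    return 1.0f - (1.0f - base) * (1.0f - blend);
}

// CheapContrast: remap the 0-1 range outwards by 'contrast' (a lerp from
// -contrast to 1+contrast), then clamp back into 0-1.
float CheapContrast(float value, float contrast)
{
    float remapped = (1.0f - value) * (0.0f - contrast) + value * (contrast + 1.0f);
    return std::clamp(remapped, 0.0f, 1.0f);
}

// Packed RM texture: roughness in red, metallic in green.
// roughness = CheapContrast(BlendScreen(rm.r, roughnessBrightness), roughnessContrast);
// metallic  = BlendScreen(rm.g, metallicBrightness);
```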

All this is put into a MaterialAttributes node, which almost allows you to make a material within a material--this is useful for when we blend it all together later.

You can see the wear material is far less complex, but basically does the same stuff. We don’t really need a contrast for the roughness in this case just because the wear is never really visible enough for it to make a difference. Notice that nothing has been masked out yet, that’ll happen when we blend the materials together.

Here is where we blend the tiling materials together. The Base Material input is the MaterialAttributes of the Main Material, and the Top Material is the Wear Material’s MaterialAttributes.

The alpha is a mask we create in Quixel specific to the weapon, so the wear is only applied to the areas we want it to be. We also use a contrast node and a multiply node so that we can have some in-engine control over the strength of the mask.
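
As a sketch, shaping the mask before it drives the blend looks something like this, assuming the contrast node behaves like the CheapContrast remap above (the parameter names are ours, not the actual material’s):

```cpp
#include <algorithm>

// Illustrative only; parameter names are invented for the example.
float ShapeWearMask(float mask, float maskContrast, float maskStrength)
{
    // Contrast node: push the painted Quixel mask towards 0/1 so the wear
    // reads as crisp edge/cavity damage rather than a soft gradient.
    float contrasted = std::clamp(
        (1.0f - mask) * (0.0f - maskContrast) + mask * (maskContrast + 1.0f),
        0.0f, 1.0f);

    // Multiply node: a single in-engine dial for how strong the wear is overall.
    return std::clamp(contrasted * maskStrength, 0.0f, 1.0f);
}

// The shaped mask then becomes the alpha of the attribute blend:
// blended = blend(mainMaterialAttributes, wearMaterialAttributes, ShapeWearMask(...));
```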

Notice that we then have to break the resulting blended material so that we can add in the normal map and ambient occlusion, since we blend those separately so as not to lose any detail; you’ll see in the next section how we do that.

This one is a bit crazy, and it's the part that gave me the biggest headache. I’m sure there’s a better way, but every other solution I found gave me different troubles (including blend angle corrected normals). So here we have the normal map from the main tiling material and the normal from the wear tiling material.

The problem is that if we just blend them together we lose a bunch of detail from mashing the blue channels together. So to strip out the blue channel of the wear material, we make a (1,1,0) vector using append nodes. Multiplying the texture map by this basically clears the blue channel, and then when we lerp between the two maps the blue channel just stays as it is.

The mask we use for the alpha is exactly the same as the one we use for blending the two materials. Now that we’ve got the tiling material normals blended together, we need to do the same thing for the mesh normal: we strip the blue channel of the normals we just blended, as before, and then just add the mesh normal and the blended normals together. This gives our final normal map, which is then plugged into the final material output.
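
Written out as plain maths (and ignoring the 0-1 to -1..1 unpacking that normal map samples go through), the blending described above comes out to roughly this; in the material it’s all multiply/lerp/append/add nodes:

```cpp
// Plain-maths sketch of the normal blending described above.
struct Float3 { float x, y, z; };

static float Lerp(float a, float b, float t) { return a + (b - a) * t; }

Float3 BlendTilingAndMeshNormals(Float3 mainN, Float3 wearN, Float3 meshN, float wearMask)
{
    // Wear normal multiplied by (1,1,0): its blue channel is cleared so the
    // lerp only mixes in the wear's red/green detail.
    Float3 wearFlat = { wearN.x, wearN.y, 0.0f };
    Float3 tiling   = { Lerp(mainN.x, wearFlat.x, wearMask),
                        Lerp(mainN.y, wearFlat.y, wearMask),
                        mainN.z };                  // blue channel stays as it is

    // The blended tiling normal then has its blue stripped the same way and is
    // added on top of the baked mesh normal to give the final normal map.
    return { meshN.x + tiling.x, meshN.y + tiling.y, meshN.z };
}
```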

There are many ways of making this system, loads of stuff that I’m sure could be done better, and features I’d like to add (like more tiling materials in a single material for better variation), but so far it’s treated us very well and given us some really nice, quick iteration and control when texturing our weapons.

Most of the weapons have about 2-4 material instances of this master material, for example one for plastic, one for metal, different colours etc. It also allows us to really quickly create new skins with almost no effort. Here you can see the final material for the KRISS Vector, using 4 material instances:

KRISS using 4 material instances

FN FAL using the same system with a new material, varnished wood

2. Blocking Out the Demo Level

Our intention is to drop the demo level on Steam before Christmas so that people can try it out, tell us what they like and what they don’t like. It will be the first time BLISTER is properly in the hands of people to play (and break), so it’s important for me in both a level design and environment design capacity to make the first level as fun and graphically arresting as possible.

It’s important to note that the first demo level is actually only a third of the entire level as it will be when the game finally goes gold. This is because these levels are huge and we want to get as much feedback as possible before designing the rest of the level. Each level in BLISTER will have 3 separate wings to develop and carry out a plan in. The demo level contains 1 wing, but each wing is the size of an ordinary level in Rainbow Six: Athena Shield, so if you’ve played a decent RS game before you’ll get an idea of just how big a finished level in our game will be.

BLISTER is heavy on the English Civil War references, so I thought it fitting to name our first demo level ‘Marston Moor Power Plant’, after the Battle of Marston Moor pitting King Charles’ forces against Oliver Cromwell’s.

Instead of marauding armies, you command 3 well-trained, heavily-armed specialist firearms officers versus a modern English terror cell hellbent on capturing critical infrastructure from the State. Your job is to eliminate the insurgent forces from within the power plant and rescue as many hostages as possible.

Before I entered UE4 to begin the greyboxing phase, I planned out the level on paper and then recreated that plan in 3ds Max. This allowed me to visualise the flow of the level before I began constructing it. Planning it all out properly first meant the greybox phase only took a day to complete.

Before blocking out the level proper I had to set out a sizing standard for all shared properties:

- Walls: 4m height / 20cm depth with a 1m gap for piping, venting and drone access
- Doors: 2m height / 1m width to give AI enough room to manoeuvre
- Drone vents: 50cm height / 1m width to give the player’s drone an Elite Dangerous/letterbox-style opening

Corridor width, window height and so on are varied, but the minimum corridor width that allows the AI to properly manoeuvre according to a drone plan is 2m. If I wanted, I could build a couple of 1m-wide floor-level vent access tunnels that only the player could fit through, either for secrets or gameplay decisions.

I had two considerations while developing both the 3D plan and the greyboxed level: to make each segment of the wing area look and play differently to its neighbour, and to make sure each room in the wing has at least two access points.

The result is an arrangement that a power plant designer might baulk at, but there is a lot of satisfaction in the varied geometry and navigational flow of each room to the next. It might be hard to visualise at the greybox phase but the screenshots below demonstrate the variation in the level design.

A turbine hall

A pipe flow area

A spent fuel pool

A control centre overlooking the waste fuel operation

During the greybox phase we tested the placement of doors, cover and obstacles to see what worked and what got in the way of gameplay.

The greybox is finished so now begins the tedious task of whiteboxing--that is, replacing the sparse BSP geometry with actual meshes that represent the final level without having to think about texturing just yet.

Next week’s devblog will probably discuss some changes to our FPS rig and an explanation of how our drone system can save time by loading in previous plans and tweaking them.

Cheers!

Regan & Bret
Item_42