Kenneth Chen

Digital Media PhD Student

Kenneth is a doctoral student in the Digital Media PhD program. In his series “Kenneth on Games” he writes about his passion for games and game design.

The nature of procedural level generation makes it hard for designers to implement a smooth difficulty curve. Instead, each level is treated as a static plateau of difficulty, with large jumps from level to level. With old-fashioned hand-placed levels, designers can create scenarios that have a much smoother flow of difficulty, even within the same level. But Dead Cells manages to get the best of both worlds: randomly generated levels with self-contained difficulty curves.

WARNING: Spoilers for late-game enemy types.

Discrete vs Continuous

In the realm of game development, the terms “discrete” and “continuous” come from programming, but they can be useful for design as well. A discrete value comes from a set of distinct possibilities, whereas a continuous value is not predefined and can be anything within a certain range. In the context of design, I will use the terms “discrete difficulty” and “continuous difficulty.” Rather than referring to values, these definitions lean towards the player experience: discrete difficulty means large, noticeable shifts, while continuous difficulty is more of a gradual curve.

Discrete and continuous difficulty come from a variety of sources, which often overlap to create the whole experience. Whether you are facing a normal enemy or an elite version of that enemy is discrete difficulty. But the amount of health you have going into the encounter is continuous: the same battle can be more difficult at lower health. What about the number of enemies, or the environment you are fighting in? All of these considerations come together to form the end result.


Level design itself can be a form of continuous difficulty. Take, for example, the Lothric Knights in the High Wall of Lothric in Dark Souls 3. The first time you meet one, it’s framed as a difficult encounter because another enemy lurking in the background will join him against you mid-battle. For this reason, the game gives you a bonfire right afterwards. Later, you meet another Lothric Knight in an isolated room, where you can get a sense for how he moves and when to strike. Towards the end of the area, you must pass through a section with three Lothric Knights, which is very difficult if you haven’t yet learned the tricks for dealing with them. Finally, there is a lone red-eyed knight whom you can choose to fight.

Through enemy placement, the designers can create a whole difficulty curve from start to end within a single area. The beginning is a harsh greeting, the middle is a training period, and the end is a final test of your skills. Level designers spend their entire careers learning how to craft such meticulously balanced encounters. How could a procedural algorithm create a comparable experience when it’s difficult enough for a human to do it?

Although procedural generation techniques might never reach the quality bar of Dark Souls level design, they can still come close. It’s just that most of the time, they don’t try. Most roguelikes use discrete difficulty through separate levels. In Crawl, this is made obvious by the fact that monsters only level up in between levels. Binding of Isaac and Crypt of the NecroDancer introduce completely different sets of enemies, or upgraded versions of previous ones, as you progress through each new area.

But within a single level, there is often no real continuous difficulty curve. The beginning of the level is, by design, just as difficult as the middle or the end. Instead, continuous difficulty comes from resource management or other elements of design outside of level design. There is nothing wrong with this approach, but it leaves a potential avenue of design unexplored.

One of the best implementations of continuous difficulty in roguelike level design is in Invisible, Inc., where a timer ramps up the threat level and introduces new hazards within a mission. But this system is very explicit and conveys a lot of information to the player. That isn’t a bad thing, but it’s different from the subtle, player-driven, exploratory style of Soulsian level design.

So now that we have discussed the differences between discrete difficulty and continuous difficulty, what does that tell us about Dead Cells? How does Dead Cells reach for continuous difficulty through procedurally generated level design?

Enemy Tier Lists

In several of the areas in Dead Cells, you can notice that stronger enemy types only start to spawn in the second half. For the Fog Fjord, it’s the pirate grenadier. For the Ossuary, it’s the vat that spawns slimes. For the Promenade, it’s the shielding voodoo doll. Not all the areas seem to be set up this way, which is a shame, because I think it works quite well.

It’s simple enough to write a new rule in the algorithm that says, “after the halfway point, start spawning in this new enemy type.” That would just create two discrete blocks of difficulty within the level. Granted, two is better than one, but Dead Cells goes further.
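To illustrate, here is a minimal sketch of what that naive rule might look like. The enemy names and the `pick_enemy_naive` function are hypothetical stand-ins, not anything from the actual game:

```python
# A naive "two discrete blocks" rule: the spawn table changes exactly once,
# at the halfway point. `progress` is how far through the level the spawn
# point sits, from 0.0 (start) to 1.0 (end).
def pick_enemy_naive(progress: float) -> str:
    if progress < 0.5:
        return "zombie"           # first half: only the baseline enemy
    return "pirate_grenadier"     # second half: abruptly switch to the new type

print(pick_enemy_naive(0.49))  # zombie
print(pick_enemy_naive(0.51))  # pirate_grenadier
```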

When a difficult enemy is introduced, the player needs to be eased into the experience. Dark Souls 3 threw a cheap trick at you during the first Lothric Knight encounter, but later gave you an opportunity to fight one-on-one before throwing more of them into the mix. The first time you run into a pirate grenadier, they should be alone, and they should slowly become more and more common over the course of the level.

But this doesn’t just happen with pirate grenadiers. It also happens with the axe throwers in the Fog Fjord, which are mid-tier enemies. The axe throwers appear from the very start by themselves, but soon they are accompanied by groups of weaker zombies, and eventually they form groups of their own, until you reach the end, where they are often seen alongside pirate grenadiers. There isn’t just a single “halfway point enemy” explicitly defined for each area.

There are so many possible configurations of enemies that you can encounter. Single zombies, zombies in small groups, single axe throwers, axe throwers with zombies, multiple axe throwers, zombies in large groups, pirate grenadiers, pirate grenadiers with zombies, axe throwers with many zombies… it just keeps going. The variety of configurations is what enables continuous level design difficulty.

It would be tempting to categorize each area’s mobs as weak, medium, and strong enemies, where strong enemies only start appearing after the halfway point. But this doesn’t work either. What about the mushrooms in the Old Sewers, which are strong but appear right from the start? Or the grapplers in the Graveyard, who appear around a third of the way in, while fog channelers only appear around two thirds of the way in? Discrete buckets like tier lists won’t cut it anymore: we need to think in terms of continuous difficulty.

Building the Algorithm


I don’t know how Motion Twin built their algorithm, but here’s how I would build it. My implementation could be wildly different from theirs, but it should achieve the same core goal of continuous difficulty. I am also switching to the term “area” to describe sections like the Fog Fjord or the Prisoner’s Cells: even though they would normally be called levels, the term “level” here will specifically refer to the value of a variable in the code.

First, before adding any enemies, the area geometry is built and every point in the area is given a “progression percentage” for how far along it is, with the beginning being 0% and the end being 100%. This is drastically easier when areas are linear, which I suspect is why Dead Cells mostly has left-to-right, straight-path areas, as opposed to the winding, mazelike areas of Binding of Isaac, where it’s difficult to know how far along you are at any given point.
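As a rough sketch, if an area’s spawn points can be ordered along that straight path, the percentage is just a normalized distance. Everything here is hypothetical; a real generator would more likely measure distance along the critical path than raw x-coordinates:

```python
# Assign a progression percentage (0.0 to 1.0) to each spawn point in a
# linear area, based on its horizontal position along the path.
def progression_percentages(spawn_xs: list[float]) -> list[float]:
    start, end = min(spawn_xs), max(spawn_xs)
    span = (end - start) or 1.0  # guard against a degenerate one-point area
    return [(x - start) / span for x in spawn_xs]

print(progression_percentages([0.0, 120.0, 300.0, 480.0]))
# [0.0, 0.25, 0.625, 1.0]
```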

Next, give every area numerical values for a “minimum difficulty level” and a “maximum difficulty level.” These values only become meaningful relative to each other in later steps, so their exact scale doesn’t matter yet; for the sake of argument, let’s take the Fog Fjord and say the minimum is 10 and the maximum is 50. This means the end of the Fog Fjord should be five times more difficult than the beginning, which sounds like a lot. But when you consider that you go from fighting groups of zombies to groups of pirate grenadiers, I think a factor of five is a good estimate for this differential.

Outside of the areas themselves, each enemy type would have an assigned difficulty value. These values are relative to the difficulty levels from the previous step. Let’s throw out some numbers and say that zombies are 3, axe throwers are 10, and pirate grenadiers are 25.
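To make the later steps concrete, these two pieces of data could live in simple lookup tables. The numbers are my guesses from above, not Motion Twin’s actual values:

```python
# Per-area difficulty range: how hard the start and end of the area should be.
AREA_DIFFICULTY = {
    "fog_fjord": {"min": 10, "max": 50},
}

# Per-enemy difficulty values, on the same scale as the area ranges.
ENEMY_DIFFICULTY = {
    "zombie": 3,
    "axe_thrower": 10,
    "pirate_grenadier": 25,
}
```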

Bringing it all together, anywhere enemies should be spawned, the game manager would look at the progression percentage, the minimum difficulty level, and the maximum difficulty level, and use those values to determine the current difficulty level at that specific position. The halfway point (50%) would sit halfway between 10 and 50, which is 30.
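In other words, the current difficulty level is just a linear interpolation between the two endpoints; a minimal sketch:

```python
def current_difficulty(progress: float, min_level: float, max_level: float) -> float:
    """Linearly interpolate the difficulty level at a point in the area."""
    return min_level + progress * (max_level - min_level)

print(current_difficulty(0.5, 10, 50))  # 30.0, matching the halfway example
```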

The game would then spawn a group of enemies whose individual difficulty values roughly add up to the current difficulty level. If you are at the beginning of the area, at a difficulty level of 10, the game can spawn either a single axe thrower or a group of three zombies. Travel a bit further in, and the current difficulty level might increase to 12, which could mean four zombies, or one axe thrower and one zombie.
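One simple way to fill that budget is a randomized greedy pick: keep choosing enemies that still fit in the remaining budget until nothing fits. This is a hypothetical composition strategy, not necessarily what Dead Cells does; a real system might also allow a small overshoot so that 13 points of enemies can fill a 12-point budget:

```python
import random

ENEMY_DIFFICULTY = {"zombie": 3, "axe_thrower": 10, "pirate_grenadier": 25}

def spawn_group(budget: float) -> list[str]:
    """Randomly pick enemies whose difficulty values roughly add up to the budget."""
    group: list[str] = []
    remaining = budget
    while True:
        # Only enemies that still fit in the remaining budget are candidates.
        choices = [name for name, cost in ENEMY_DIFFICULTY.items() if cost <= remaining]
        if not choices:
            break
        pick = random.choice(choices)
        group.append(pick)
        remaining -= ENEMY_DIFFICULTY[pick]
    return group

print(spawn_group(12))  # e.g. ['axe_thrower'] or four zombies
```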

Fast forward to the end of the area with a difficulty level of 50. This could be a pirate grenadier, two axe throwers, and a single zombie. If you can make it to the end of the Fog Fjord, you’ll notice that this becomes a common enemy formation. A designer could fine-tune the area by adjusting the min and max values as well as the individual difficulty values for specific enemies.

Finally, there would need to be certain special-case rules. For example, you would add a cap on the number of enemies that can be spawned in one group, so that you don’t get sixteen zombies in a giant cluster at the end. Other enemies, like fog channelers, might need to be programmed as multipliers rather than additive values (a fog channeler in an enemy formation multiplies the final value by, say, 1.5).
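Both rules are easy to bolt onto the sketch above. The cap and multiplier numbers here are hypothetical placeholders:

```python
ENEMY_DIFFICULTY = {"zombie": 3, "axe_thrower": 10, "pirate_grenadier": 25}
MAX_GROUP_SIZE = 5            # hypothetical cap: no sixteen-zombie clusters
CHANNELER_MULTIPLIER = 1.5    # a fog channeler scales the group instead of adding

def group_difficulty(group: list[str], channelers: int = 0) -> float:
    """Additive enemies sum up; each multiplier enemy scales the final value."""
    base = sum(ENEMY_DIFFICULTY[name] for name in group)
    return base * (CHANNELER_MULTIPLIER ** channelers)

# Two axe throwers backed by one fog channeler count as 30 difficulty:
print(group_difficulty(["axe_thrower", "axe_thrower"], channelers=1))  # 30.0
```

The size cap itself would simply be an extra `len(group) < MAX_GROUP_SIZE` condition in the spawning loop from the previous sketch.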

With all that complete, we would have a system that works fairly similarly to the Dead Cells enemy generation algorithm. Whether or not this is their actual process, we may never know.

This is also why I’m concerned about the recently added elemental effects. Are they considered part of the algorithm? Will the game say, “spawn a group of enemies stronger than they should be at this point, but put them in a puddle of water so they can be electrocuted”? It seems strange to me that something potentially game-breaking would be added as a post-launch feature. Granted, Dead Cells is still in early access, but I think the relationship between elemental effects and enemy placement poses some important technical considerations.

But so far, Dead Cells has been a great ride with a smooth flow. Enemy placement is a very subtle aspect of game design, and yet it has helped propel the Dark Souls series to stardom. People may not notice it consciously, but they will notice it subconsciously. In a game with such a polished and beautiful surface presentation, it’s great to see that the core is just as well-designed.