
Hyper Light Breaker

Procedural Generation + HyperDec

HyperDec - Intro





Originally, before it was called HyperDec, the procedural “decking” system was built to evaluate the height of terrain at a given XY position and procedurally populate those spaces with props. It used seed-informed deterministic random value selection for things like position, rotation, and scale, along with parametric variation for things like spacing between props, maximum count, height and slope ranges, spawn area, etc.
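To give a feel for the seeded approach, here's a minimal Python sketch. The function name and parameters are invented for illustration (the real system also queried actual terrain height per point, which `height_range` merely stands in for):

```python
import random

def scatter_props(seed, count, area, min_spacing, height_range):
    """Deterministically scatter prop transforms inside a square area.

    All randomness flows from the seed, so the same seed always
    reproduces the same layout.
    """
    rng = random.Random(seed)
    placed = []
    attempts = 0
    while len(placed) < count and attempts < count * 20:
        attempts += 1
        x, y = rng.uniform(0, area), rng.uniform(0, area)
        # parametric variation: reject candidates that crowd earlier props
        if any((x - px) ** 2 + (y - py) ** 2 < min_spacing ** 2
               for px, py, _, _, _ in placed):
            continue
        z = rng.uniform(*height_range)     # stand-in for a terrain height query
        yaw = rng.uniform(0.0, 360.0)      # seeded random rotation
        scale = rng.uniform(0.8, 1.2)      # seeded random scale
        placed.append((x, y, z, yaw, scale))
    return placed
```

Feeding the same seed in twice yields the exact same transforms, which is what makes a seed-driven deck reproducible.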

From there, we wanted to explore applying artistic intentionality to props and clusters of props, defining “child spawns” that would anchor themselves around spawned instances. Pieces had filters for which kinds of surfaces they could and couldn't spawn on, as well as custom avoidance filters and world-aligned texture masks, so users could parameterize relational behaviors between types of props, all of which were piped into a global HISM (Hierarchical Instanced Static Mesh) array.
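A toy version of that child-spawn idea might look like the sketch below. The names and the single avoidance filter are invented for illustration; the real system had a much richer filter set (surface types, texture masks, etc.):

```python
import math
import random

def spawn_children(parents, seed, radius, per_parent, avoid_dist):
    """Anchor child props around parent instances.

    Each parent gets a ring of seeded child candidates; a simple
    avoidance filter drops candidates that land too close to any
    *other* parent, mimicking the relational behaviors described above.
    """
    rng = random.Random(seed)
    children = []
    for px, py in parents:
        for _ in range(per_parent):
            ang = rng.uniform(0.0, 2.0 * math.pi)
            dist = rng.uniform(avoid_dist, radius)
            cx = px + dist * math.cos(ang)
            cy = py + dist * math.sin(ang)
            # avoidance filter: keep clear of every other parent
            if any((cx - ox) ** 2 + (cy - oy) ** 2 < avoid_dist ** 2
                   for ox, oy in parents if (ox, oy) != (px, py)):
                continue
            children.append((cx, cy))
    return children
```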





After proving out simply laying out these pieces & giving them relational considerations, we moved on to zone targeting. In addition to randomized terrain on each run (more on terrain from Len below), we wanted to have distinctive zones with unique props in each. Thanks to some very clever programming from Peter Hastings, Senior Gameplay Engineer, we were able to very efficiently read zone data encoded into world-aligned textures and filter placement accordingly.
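In spirit, the zone filter works like sampling a world-aligned texture at each candidate's XY position. A hypothetical CPU-side sketch (the real version reads an encoded texture on the GPU, and these names are made up):

```python
def zone_at(zone_map, world_size, x, y):
    """Map a world-space XY position to a texel in a world-aligned
    'texture' (here just a 2D list of zone ids) and return its zone."""
    res = len(zone_map)
    u = min(int(x / world_size * res), res - 1)
    v = min(int(y / world_size * res), res - 1)
    return zone_map[v][u]

def filter_by_zone(points, allowed_zones, zone_map, world_size):
    """Keep only the candidate points whose zone is allowed for this prop."""
    return [(x, y) for x, y in points
            if zone_at(zone_map, world_size, x, y) in allowed_zones]
```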





Artists and designers could create Data-Only-Blueprint assets that would contain primary and secondary assets to spawn, and their parameters for placement on the terrain. This workflow of randomized terrain with zone identifications became the foundation of our procedural decking paradigm.

Initially, this paradigm worked out well. But over time, we ran into issues when trying to implement at scale.

A Setback



The implementation started to run into issues as it continued to grow. Rather than only placing static props, we began using the system to place gameplay objects and applied more robust filtering for things like flatness detection. On top of that, our terrain evaluation was happening at runtime, per prop, with prop counts climbing into the 70K–100K range, which meant the startup time for each run took longer and longer.

We also ran into issues balancing density & variation with replication for multiplayer; all of these tens of thousands of objects needed to show up consistently on every player's instance. Doing all procedural placement on the server and then passing that enormous amount of data to players on begin play was infeasible. Instead, the server would only spawn gameplay-relevant pieces, and each connected client would receive a seed number from the server to feed into its client-side placement of props. Using the same seed across all clients meant that even though they were spawning objects locally, they would all spawn with the same transforms informed by that seed.
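The seed-sharing trick can be shown in miniature: the server replicates a single integer, and every client derives identical transforms from it locally. (A sketch with invented names, not the actual networking code.)

```python
import random

def client_decorate(server_seed, prop_count):
    """Simulate one client's local prop placement. Every client runs
    this same function with the seed received from the server, so all
    clients agree on every transform without replicating any of them."""
    rng = random.Random(server_seed)
    return [(rng.uniform(0.0, 1000.0),   # x
             rng.uniform(0.0, 1000.0),   # y
             rng.uniform(0.0, 360.0))    # yaw
            for _ in range(prop_count)]

# the server picks one seed per run and sends only that integer down
server_seed = 1337
client_a = client_decorate(server_seed, 5000)
client_b = client_decorate(server_seed, 5000)
```

Only one integer crosses the wire, yet both simulated clients end up with identical layouts for all 5,000 props.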

While we were able to achieve a satisfying amount of variation and distinction, it became clear that the increasing generation time wouldn’t be sustainable long-term.

Rethinking Our Design Paradigm



Tech Art & Engineering sat down and re-thought our design paradigm for procedurally generated content in the game, and wound up completely re-working our implementation from the ground up.

We were able to move away from a solely-Blueprint-driven pipeline for procedural decking, leveraging faster C++ execution thanks to some awesome effort put in by Justin Beales, Senior Gameplay Engineer. We also moved the per-prop terrain evaluation from runtime to design time. This allowed us to pre-determine the placement of objects and then feed very simple data into a runtime system that grabbed the objects and their intended locations and placed them accordingly. Each stage's variants would have coinciding data to reference, and using a DataTable to lay out objects & parameters, we could “pre-bake” candidate points for each object type in the editor, then save that data for quick reference on begin play. So while there are a limited number of variants as a whole, the selection of candidate points from the list could be randomized with a seed, meaning that the same variant could have unique prop/gameplay layouts every time.
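The split between design-time baking and runtime selection can be sketched like this (a simplified stand-in for the DataTable-driven system; the predicate and names are hypothetical):

```python
import random

def bake_candidates(heightfield, predicate):
    """Design-time step: scan the terrain once and record every grid
    point that passes the placement predicate (slope, height, zone, ...)."""
    return [(x, y, h)
            for y, row in enumerate(heightfield)
            for x, h in enumerate(row)
            if predicate(h)]

def runtime_select(candidates, seed, count):
    """Runtime step: a cheap seeded sample of the pre-baked list, so the
    same terrain variant can yield a different layout every run."""
    rng = random.Random(seed)
    return rng.sample(candidates, min(count, len(candidates)))
```

The expensive scan happens once in the editor; on begin play, all that's left is a seeded pick from a saved list.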



Now that we had generation in a better spot, we set out to expand on the artistic intentionality of the pieces being spawned. It became clear over time that the use of anchor-clustering & avoidance distances would not be enough to make these levels look less like math in action and more like art. This idea and conversation led to the creation of HyperFabs, which are spawned just like regular props via HyperDec, but have some more advanced logic & artistic implications.

HyperFabs



HyperFabs take the concept of Prefabs (prop or object arrangements saved as a new object for re-use) and add some additional utility & proceduralism to them.

The overall idea is that artists can lay out arrangements/mesh assemblies that are intended to represent a small part of what would normally be a hand-decorated level. They can then use a custom script we've built to store those meshes in a generated Blueprint asset that can be placed on the terrain. The center point of the actor aligns to the terrain, but based on exposed rules that artists can tweak and assign to components or groups of components using Tags, the individual pieces in the HyperFab will also conform to the terrain surrounding the actor's center point in the world. It takes our original idea of relational spawning, but allows artists to lay out these relations through traditional level design tools instead of strictly through DataTable parameters.
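Conceptually, the terrain-conforming step boils down to something like this sketch. The tag name, rule set, and height callback are all stand-ins for the real component Tags and rules:

```python
def conform_hyperfab(components, terrain_height, anchor_xy):
    """Place a HyperFab's components in the world.

    The actor's anchor snaps to the ground; components tagged 'conform'
    then re-sample terrain height at their own XY, while untagged
    components stay rigid relative to the anchor.
    """
    ax, ay = anchor_xy
    base_z = terrain_height(ax, ay)
    placed = []
    for local_x, local_y, local_z, tags in components:
        wx, wy = ax + local_x, ay + local_y
        if "conform" in tags:
            wz = terrain_height(wx, wy) + local_z   # follows the ground
        else:
            wz = base_z + local_z                   # rigid offset from anchor
        placed.append((wx, wy, wz))
    return placed
```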


A boulder assembly turned into a HyperFab, made by Will in Enviro

It doesn't have to be just for small arrangements, though; entire city blocks have been baked into a HyperFab, which conforms to varying terrain as expected.


A city block assembly turned into a HyperFab, made by Wolf in Enviro

The script for baking HyperFabs from mesh assemblies is smart enough to know when to use static mesh components versus mesh instancing, and it also has a utility to merge stacked/co-dependent objects into new static mesh assets, which helps with performance & automation.

Other cool bits



Shoreline Generation



A neato bit of tech I worked on before we used terrain variants was shoreline generation. Since terrain was being generated using a voxel system, each playthrough generated terrain that was completely random (but also much harder to control and make look nice than our new approach!). This meant that we couldn't pre-determine shoreline placement, whether through splines, decals, or shader stuff.

After a bit of research, I learned about Jump Flooding, an algorithm that can generate a distance texture between bits of data in a texture in UV space. In the case of shorelines, I captured an intersection range of the terrain and used that as a base mask. That mask was then jump-flooded to give us a gradient, which could be fed into the UV channel of a basic waves-mask texture running perpendicular to the direction of the wave lines. Using some additional time math and noise modulation, waves could pan along that distance gradient, with shape and body breakup, plus controls for distance-from-shore, wave count, and initial capture depth.
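For the curious, here's a small CPU sketch of the Jump Flooding idea: every cell repeatedly looks at neighbors at halving step sizes until it knows its nearest seed, yielding a distance field in a handful of passes. (The shoreline version runs as shader passes over a texture; this grid version is just for illustration.)

```python
import math

def jump_flood(seeds, size):
    """Return a size x size grid of distances to the nearest seed cell,
    computed with the Jump Flooding algorithm (steps size/2, size/4, ..., 1)."""
    nearest = [[None] * size for _ in range(size)]
    for sx, sy in seeds:
        nearest[sy][sx] = (sx, sy)
    step = size // 2
    while step >= 1:
        nxt = [row[:] for row in nearest]
        for y in range(size):
            for x in range(size):
                best = nearest[y][x]
                # examine the 8 neighbors (and self) at the current step size
                for dy in (-step, 0, step):
                    for dx in (-step, 0, step):
                        px, py = x + dx, y + dy
                        if 0 <= px < size and 0 <= py < size:
                            cand = nearest[py][px]
                            if cand is not None and (
                                best is None
                                or math.dist((x, y), cand) < math.dist((x, y), best)
                            ):
                                best = cand
                nxt[y][x] = best
        nearest = nxt
        step //= 2
    return [[math.dist((x, y), nearest[y][x]) for x in range(size)]
            for y in range(size)]
```

With the shoreline mask as the seed set, the resulting field is exactly the distance-from-shore gradient the waves pan along.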



Flatness Detection



Another challenge we ran into for procedural placement was flatness-range detection; some objects had platform-like bases that needed an acceptable range of flatness so that they weren't perched awkwardly on the side of a cliff or floating on little bumps in the terrain. The first iteration of flatness detection used traces from randomly selected points in a grid formation, comparing each sample against the average height offset, with a configurable failure tolerance and grid resolution, before determining if a point was flat enough.
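That first trace-based pass might have looked roughly like this (a simplified sketch; the real version traced against collision rather than calling a height function, and the parameter names are invented):

```python
import random

def is_flat(terrain_height, cx, cy, footprint, grid_res, tolerance,
            max_failures, seed=0):
    """Trace-style flatness check over an object's footprint.

    Samples height at jittered points of a grid_res x grid_res grid,
    compares each sample to the average, and tolerates a configurable
    number of out-of-range samples before rejecting the point.
    """
    rng = random.Random(seed)
    step = footprint / grid_res
    samples = []
    for i in range(grid_res):
        for j in range(grid_res):
            x = cx - footprint / 2 + (i + rng.random()) * step
            y = cy - footprint / 2 + (j + rng.random()) * step
            samples.append(terrain_height(x, y))
    avg = sum(samples) / len(samples)
    failures = sum(1 for h in samples if abs(h - avg) > tolerance)
    return failures <= max_failures
```

Every query costs grid_res² traces, which is exactly why this got expensive at scale.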



While this approach did find flat areas, it was costly & prone to prolonged searching, resulting in a block in the generation process while platforms found their place. After we moved candidate point determination to design time, we reworked the search function to use the terrain point data in a similar grid-check fashion, using grid space partitioning to speed up the referencing of bulk points. That led to this fun little clip of the proof-of-concept, showing an object finding approximate nearest neighbors with no collision/overlap checks, just location data.
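Grid space partitioning itself is simple to sketch: bucket points by cell so a query only scans its own cell and the eight around it, instead of every point. (Names here are illustrative.)

```python
from collections import defaultdict

class PointGrid:
    """Grid space partition: bucket terrain points by cell so that
    approximate-nearest-neighbor queries only touch nearby buckets."""

    def __init__(self, points, cell):
        self.cell = cell
        self.buckets = defaultdict(list)
        for p in points:
            self.buckets[self._key(p)].append(p)

    def _key(self, p):
        return (int(p[0] // self.cell), int(p[1] // self.cell))

    def near(self, x, y):
        """Return all points in the query's cell and its 8 neighbors."""
        kx, ky = int(x // self.cell), int(y // self.cell)
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                out.extend(self.buckets.get((kx + dx, ky + dy), []))
        return out
```

Bulk lookups go from scanning tens of thousands of points to scanning a few dozen per query.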



While this did shift the computational cost of determining flatness over distance from runtime to design time, it was still very slow and presented a blocker for design & environment when pre-baking asset candidate points. After a bit of research, jump flooding came to the rescue again.

The workflow for flatness-range detection works in a series of steps. First, you get a normal map of the terrain you're evaluating and mask it by slope: anything below a configurable slope value counts as flat, and anything above it is too steep.


White areas are flat, black areas are too steep or below shoreline height

We then invert this output to provide a sort of “cavity mask” of areas that are flat enough for placement. But we also needed to define how far a point was from the nearest non-flat area, so that we didn't pick a point that was flat locally but not flat across the full size/footprint of the object we were placing. To solve this, we jump-flood that slope/cavity mask, then transpose the 0–1 values represented in the output texture's UV space into their world-space equivalents, based on the size of the terrain. This gave us a distance mask that we could then threshold, returning us to the yes-or-no mask configuration that could be read at each point evaluation.
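Putting the steps together on the CPU for illustration: build a slope mask from the heightfield, measure each flat cell's distance to the nearest steep cell (the game does this with a jump flood in shaders; a brute-force distance transform keeps this sketch short), and threshold by the object's footprint radius. All names here are hypothetical:

```python
import math

def flatness_mask(heights, max_slope, min_radius):
    """Heightfield -> mask of cells that are flat over a given radius."""
    n = len(heights)
    # step 1: slope mask via central differences (clamped at the borders)
    steep = set()
    for y in range(n):
        for x in range(n):
            dx = heights[y][min(x + 1, n - 1)] - heights[y][max(x - 1, 0)]
            dy = heights[min(y + 1, n - 1)][x] - heights[max(y - 1, 0)][x]
            if max(abs(dx), abs(dy)) / 2.0 > max_slope:
                steep.add((x, y))
    # steps 2 and 3: distance to the nearest steep cell, thresholded
    mask = [[False] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            if (x, y) in steep:
                continue
            d = min((math.dist((x, y), s) for s in steep), default=math.inf)
            mask[y][x] = d >= min_radius
    return mask
```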




Because all of these steps are running with shader calculations instead of collision queries or trace loops, the time to find flat-range points for assets decreased so much that the generation time is nearly indistinguishable when baking points with and without flatness checks. Yay shaders! Here are some fun gifs of the values for distance & slope being changed when creating a flatness mask.



Breaker Terrain Generation Basics



The HyperDec terrain process generates the foundational landscapes upon which all other art and gameplay assets are placed. The ideal for a roguelike would be that every run of every stage is completely unique in decking AND landscape. However, pretty early on we ruled out completely procedural landscape generation, simply because of the R&D time it would have entailed. We also knew that our gameplay would require some very tight control over the kinds of hills, valleys, waterways, and other landscape features that emerged. In a fully procedural landscape system, running in seconds as a player waited, we might sometimes get landscapes that just plain broke gameplay; this was unacceptable. So we went with semi-procedural.

Our approach is to generate a large, though not infinite, collection of terrain meshes offline that, when combined with our highly randomized HyperDec system, can give the impression of limitless, fresh gameplay spaces every time you play. Initially we explored voxel-based terrain, since it was an artist-friendly way to quickly build interesting terrain shapes. This was eventually abandoned, as the runtime implications of voxels were an unknown and we didn't have the R&D time available to ensure their success.

Work continued with algorithmic island generation spearheaded by Peter Hastings. Many of the features present in this early work exist in our current terrain as well.


Procedural Island Generation, Peter Hastings

At some point it became clear that iteration and experimentation would put serious strain on the purely algorithmic approach. This led to adopting the procedural tool Houdini as the master terrain generator. This was especially useful since we could directly translate all the algorithmic work into Houdini and then further refine the topology in later parts of the Houdini network. The algorithms were first rewritten directly in Python, and later in Houdini's native VEX language for speed. Further, Houdini is effective at generating lots of procedural variations once a network is producing solid results on a smaller scale. Our goal is to have at least 100 variations of each stage to draw from during play, and using Houdini allows a single network to drive all variations.


A bird’s eye view of a Houdini sub-network generating a component of the terrain


One of the current terrain variants for one stage, without any hyperdecking

For many of our stages, each terrain is effectively an island composed of sub-islands, each assigned a “Zone”. A Zone is basically like a biome, in that it is intended to have a look and feel clearly distinct from other zones. Zones are intended to look good, but also to help the player navigate and get their bearings as they move around the play space. To provide these features in every terrain variant, a combination of noise fields and targeted scattering of necessary topological features occurs in the Houdini network. Each stage has a different take on this basic formula, and R&D is ongoing on how to get more compelling, interesting caves, hills, nooks, and crannies without creating game-breaking problems (like inescapable pits, for example).


Visualizing a walk through the Houdini processing chain that converts a circle into terrain.

The animated image above shows one processing chain that starts with a basic circle geometry delineating the overall footprint of the island and, via a chain of surface operators, eventually ends up as playable terrain. Many of the operations involve random noise that contributes to the differences between variations. Both Houdini height fields (2D volumes) and mesh operators are employed at different points to achieve different results. The initial circle is distorted, then fractured, to yield the basis of a controllable number of separate sub-islands. Signed distance fields are calculated from the water's edge (z=0) to produce the underwater-to-beach transition slopes. More specific mesa-type shapes are scatter-projected into the height field to yield controllable topology that plays well compared to purely noise-generated displacements. In the final section, geometry is projected into the height field at the boundary area as a mask, distorted via noise fields, and displaced to create the stage's outer perimeter. The full chain of operations can generate a large number of unique terrains that all exist within constraints set out by game design.

Another feature that exploits the fact that our terrains are not pure height fields is cave tunnels and caverns. These are generated as distorted tube-like meshes that are then subtracted from a volume representation of the above mesh. We are excited to push cave-tech (tm) in the future to generate some interesting areas for discovery for the player.

Unfortunately, to produce production-quality terrains, the resolution of the resulting mesh also needs to increase, which is starting to slow Houdini down compared to the early days when everything processed so briskly. These are relatively large meshes, which are getting converted back and forth between mesh, height field, and voxel representations to get the job done. As production moves forward and we start generating all the variants needed for gameplay, the plan is to offload processing to a nightly job on a build machine, so no one has to sit at their screen for hours watching the wheel spin.

Articles & Sources:



Jump Flood Process in UE4:
https://www.froyok.fr/blog/2018-11-realtime-distance-field-textures-in-unreal-engine-4/

Flatness Detection Abstract:
https://gamedev.stackexchange.com/questions/125902/is-there-an-existing-algorithm-to-find-suitable-locations-to-place-a-town-on-a-h

Grid Space Partition Process:
https://gameprogrammingpatterns.com/spatial-partition.html

Wrap Up



As you can see, our team has spent a considerable effort executing on thoughtful procedural generation in order to make the flow of game levels feel coherent and intentional.

Want more stuff about procedural generation? Len also did this talk on tech art in Solar Ash!

Let Us Hear From You!



What do you think of what you’ve seen (and heard) so far?

Are you a tech artist or aspiring to be one? How would you have tackled these issues?

Meet Melee Wretch & Leaper

Meet Melee Wretch





Character Art by John DeRiggi

Wretches are monstrous mutated soldiers.



Original Concept Art by Alx Preston

Meet Leaper





Character Art by Jack Covell

Leapers are rare prototype soldiers who have undergone body modification experiments.



(top) Original Concept Art by Alx Preston; (bottom) Final Concept Art by Isaak Ramos

Our Character Art Process + Inspirations



John DeRiggi, Lead Character Artist, shares a bit about the character art process:

Heart Machine has a history of creating vibrant, colorful worlds that often deviate from current games. True to this goal, Hyper Light Breaker's characters are inspired by traditional cel animation, like the work of Miyazaki and Studio Ghibli, combined with a watercolor painting approach. Hopefully you can see this in the concepts and 3D models of the Melee Wretch and Leaper enemies.

A key ingredient here is the character’s material and its reaction to light. Games can sometimes use materials included with a game engine but often a custom material is needed to achieve the game’s artistic vision. Since graphics programmers and technical artists create the code behind materials, a custom material from scratch requires their time.

Because we are still a smaller studio, our technical resources are often constrained, and we could not devote this larger chunk of custom material time on Breaker. We are therefore using a new material on the Unreal Engine Marketplace, called Stylized Rendering System. This gives us the base for our cel-shaded look in various light and shadow conditions.

Our character art team can then customize this material and create our cel-shaded, watercolor look with a combination of hand-painted and procedural textures in Substance 3D Painter. This tool allows us to paint like traditional artists in 3 dimensions on a sculpture but do so digitally in our intended style goals for Breaker. When these textures combine with the cel-shaded material properties, we are able to achieve a really fun result!

What’s up Next?



On the rigging/animation side, we'll soon be sharing what Chris Bullock, Lead Animator, and his team worked on based on these characters, along with all the decisions and trade-offs that had to be considered. Stay tuned!

Heart to Heart w/ Will Tate: Environment Art for Hyper Light Breaker

Alx sat down with Will Tate, Lead Environment Artist on Hyper Light Breaker, to talk about game art, careers in environment art, and more!




  • Environments will have day/night lighting cycles
  • Hyper Light Breaker became an idea before HL was done
  • Winter area was confirmed (winter areas are also Alx’s favorite kind of area)


Plus some previews of the hub, and stage 1 and 2 of the world:






Hyper Light Breaker: Gun Iteration

The craft of making games is an iterative one. You rarely get something right on the first (or even tenth!) go, and often there’s a fair amount of discussion that goes into a finalized asset that winds up in a released game. Here’s an example of how such a conversation goes.



John DeRiggi, our Lead Character Artist who modeled the gun in this video, shares a bit of his process:

What reference material were you working off of?


  • I was primarily working from Danny Moll’s awesome concepts. Danny based his concepts on Alx Preston’s initial concept sketches. I used a few references of actual guns for thinking through the forms from other angles.




What was your thought process?


  • First, I closely matched the concept from the side view of the weapon.
  • Then, I created appropriate depth and bevels to each part as needed for a believable representation of this concept in 3D, matching the style choices created by the concept artist.

Tell us more about the iterations and considerations here.


  • These guns are currently in a blockout state (rough) to primarily create the intended silhouette and large forms. Doing this blockout first for any assets allows us to iterate faster and earlier in case art direction needs to change after seeing the initial forms in 3D. This avoids loads of lost time if an asset needs to be scrapped. It also allows us to make the big decisions first, and therefore save time reworking a completed production asset with refined forms, details, textures, materials, etc.

Let us know what you think of this weapon and our dev process!

Heart to Heart w/Kim Belair: Narrative for Hyper Light Breaker

We had the pleasure of being able to interview Kim Belair, Co-Founder + CEO of Sweet Baby about her agency’s work on Hyper Light Breaker for the past year, narrative writing for games, hiring and career paths in game writing, and more!



Watch Alx Preston sit down with Kim for the latest in our Heart 2 Heart Series.

Here are some highlights from the stream, compiled by our wonderful mod Polare.

  • Sweet Baby has worked on HLB for a year.
  • Story details from Hyper Light Drifter may be “pulled into” Breaker’s story, but it is not a continuation of Drifter’s story nor a sequel.
  • Easter Eggs are more than likely to appear.
  • One goal is to make the co-op and single player story experiences “seamless”.
  • “Empathy” is a word to help convey HLB’s tone.
  • Not traveling to HLD locations, “different story, different land”.
  • HLB will likely not have voice acting.


PLUS check out some exciting new game art drops!





And we’re back...



This time around we didn’t wait such a long time to drop more info on a new game.

With Solar Ash we were trying something new: working with a publisher for the first time, transitioning from 2D to 3D, and mostly working in a vacuum. In the end, we released a project we're incredibly proud of, but the process deviated heavily from how we did things with Hyper Light Drifter and, ultimately, from how we want to do things in the future.

We’ve decided, following the (we hope) exciting announcement of Hyper Light Breaker, to return to a more community-focused development process. We will also continue to be a multi-project studio, so keep a lookout for more new stuff from us in the not-too-distant future (*eyes dart around, looks over shoulder, gazes at the ground blushing and giggling*).

Our aim going forward will be to continually engage with our wonderful players through regular blog posts, content drops, streams, events, contests, and perhaps even some in-person affairs! As a team, we’re continually in awe of the creators who connect with our projects: the artists, students, musicians, developers, designers, writers, streamers, speedrunners, and more who make up our incredible and diverse community. We hope to continue to inspire and amplify your work!

As we approach Early Access for Breaker, we’d love to get you all more involved, so please send your thoughts, feedback and anything else you’d like to share with us!

Speaking of Breaker…

Is it a sequel?
Hyper Light Breaker is neither a sequel nor a prequel to Hyper Light Drifter. It is set in the universe of Hyper Light and is its own story. It will share threads, aesthetics, lore and other recognizable elements, but will be a new game driven by new designs.

Why did we decide to return to that universe?
The faded whisper of allure of this world we built transformed into a clarion call through the years. A new story became a solid vision. It felt right to come back in, root around, and share what we unearthed about this beautiful-yet-bleak world with everyone.

Is it single player?
This time, you won’t be traveling alone through our range of lush though deadly landscapes. You’ll be able to play alone or with friends online cooperatively! There will be more hidden truths about the world to uncover, an ever-shifting landscape, and tremendous rewards for exploration. Plus, you’ll see the world in 3 full dimensions. Wild.

Thank You
From everyone here at Heart Machine, past and present, thank you so much for your support of our projects over the years. We hope to continue to create beautiful, transportive experiences for everyone.

Finally, we wanted to give a big shout out to Studio Grackle who made our incredible animated trailer. We’re so privileged to have been able to partner with them!

Wishlist Hyper Light Breaker on Steam