Creating Realistic Cloudy HDR Skyboxes

Each level of “Lillie is the Keeper” features a unique high-definition skybox texture. These Unity cubemaps are rendered in Bforartists, the UI-focused fork of Blender. Here’s how you can create your own.

Create a new scene in Bforartists, and switch the Render Engine to Cycles (camera icon: Render Engine). Add a single Sun light (Blender’s directional light type) to act as the sun. Model a gently curving lenticular piece of geometry, about 5 km in radius, to act as the ground/ocean, and texture it appropriately.

If you’re interested in accuracy, you can enable an add-on called Sun Position to set your sun angle for the date, time, latitude and longitude of your game scene (Preferences menu: Add-ons: Lighting: Sun Position).

Switch to the Shading tab, and select the World (background) shader. Create a Sky Texture node. Set it to Preetham, and link its output to the World Output node’s Surface input.

Clouds

Next, model your clouds (as ordinary geometry) in rough form. Hide them from rendering. Create a Volume object (Add menu: Volume: Empty). Go to the Volume’s Modifiers tab (wrench icon), and add a Mesh to Volume modifier. Set the Modifier’s Object to your hidden cloud geometry object. What you’ll get is a Minecraft-like blocky cloud. Increase the Voxel Amount to reduce this blockiness somewhat–try 1024.

To fully remove the blockiness and get cloudier edges, go back to the Modifiers tab, and add a Volume Displace modifier. Go to the Texture tab (checkerboard icon). Create a new texture, and set its type to Clouds. Go back to your Modifiers, and assign the cloud texture to your Volume Displace modifier. Adjust the Texture and Modifier settings until you like the results.

Go to the Material tab (crash test dummy head icon), create a new material, and assign it the Principled Volume shader. Adjust the material properties to your desired cloud appearance. (This can take a while. Start with a Density of .001-.005. You’re welcome.)

Rendering

Finally, set up your camera to render the six faces of the cubemap. In the Data tab (camera icon) set the Field of View to 90 degrees. In the Output tab (printer icon) set the Resolution to a power-of-two square, like 1024 or 2048. Set the Frame Range to 0-5. Select a folder and filename to render to, setting the File Format to OpenEXR (.exr), in order to save HDR color values. Go to the Animation tab, and animate your camera to be rotated as follows on frames 0-5 (x,y,z):

  • 0: 90, 0, 270
  • 1: 90, 0, 90
  • 2: 180, 0, 0
  • 3: 0, 0, 0
  • 4: 90, 0, 0
  • 5: 90, 0, 180

Render all six frames (Render menu: Render Animation). This would be a good time to take a walk around the block. Notice the blackbirds.

Building the Skybox

When you get the renders looking how you want, open your frames in an image editor and arrange them in one big vertical strip. Frame 0 should be at the top, and 5 at the bottom. Save this, again, as an OpenEXR file.

In Unity, import the strip image. Click on the texture to see its Import Settings in the Inspector. Set the Texture Type to Default, the Texture Shape to Cube, and check Fixup Edge Seams.

Finally, create a material. Set the Shader to Skybox/Cubemap, and under Cubemap (HDR), select your texture. In your Scene, open the Lighting panel (Window menu: Rendering: Lighting). Click the Environment tab, and assign your material to Skybox Material.
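If you rebuild skyboxes often, the same Import Settings and material setup can be scripted. Here’s a minimal editor sketch, assuming a cubemap strip at a placeholder path:

```csharp
// Editor-only sketch: apply the cubemap Import Settings and build the skybox
// material from script. The asset paths are placeholders.
using UnityEditor;
using UnityEngine;

public static class SkyboxSetupSketch
{
    [MenuItem("Tools/Set Up Skybox Cubemap")]
    static void SetUp()
    {
        const string path = "Assets/Skies/Level5Sky.exr"; // placeholder

        // Mirror the manual steps: Default type, Cube shape, Fixup Edge Seams.
        var importer = (TextureImporter)AssetImporter.GetAtPath(path);
        var settings = new TextureImporterSettings();
        importer.ReadTextureSettings(settings);
        settings.textureType = TextureImporterType.Default;
        settings.textureShape = TextureImporterShape.TextureCube;
        settings.seamlessCubemap = true; // "Fixup Edge Seams"
        importer.SetTextureSettings(settings);
        importer.SaveAndReimport();

        // Create the Skybox/Cubemap material and assign it to the scene.
        var material = new Material(Shader.Find("Skybox/Cubemap"));
        material.SetTexture("_Tex", AssetDatabase.LoadAssetAtPath<Cubemap>(path));
        AssetDatabase.CreateAsset(material, "Assets/Skies/Level5Sky.mat");
        RenderSettings.skybox = material; // Lighting panel: Skybox Material
    }
}
```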

Bonus

Download my .blend file for LitK level 5, “Light,” right here.

Fix Missing Materials in Blender FBX Exports to Unity

There’s a bug in the FBX exporter for Blender 3.x (and now 4.x) that doesn’t export materials on instanced meshes. While the original mesh imports into Unity just fine (and remaps materials correctly under the importer’s Materials tab), duplicate instances of the same mesh import with only the default Lit material. If you’re (perhaps oldschool) like me, and use a lot of instancing to save disk space, this is a problem.

Here’s a Unity Editor script to fix things: FBXFixerScripts.zip

It supports Undo, regular and skinned meshes, and will walk down the hierarchy as far as it needs to. Can take a few seconds to grind through big hierarchies. Works on my ultra-complex Lighthouse prefab, and all the others I’ve thus far thrown at it. If you find a bug, let me know!
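If you’d rather roll a minimal version yourself, the core of the fix is just a hierarchy walk and a material swap. A sketch (class and field names are mine, and the downloadable script does considerably more):

```csharp
using UnityEngine;

// Bare-bones sketch of the idea behind FixFBXImportMats: walk every Renderer
// below this object and swap a known default material for the right one.
// (The real script adds Undo support, UI, and self-removal.)
public class ReplaceDefaultMaterialSketch : MonoBehaviour
{
    public Material defaultMaterial; // e.g. URP's Lit
    public Material replacement;     // the material the instances should have

    [ContextMenu("Replace Materials")]
    void Replace()
    {
        // Renderer covers both MeshRenderer and SkinnedMeshRenderer.
        foreach (var renderer in GetComponentsInChildren<Renderer>(true))
        {
            var materials = renderer.sharedMaterials;
            for (int i = 0; i < materials.Length; i++)
                if (materials[i] == defaultMaterial)
                    materials[i] = replacement;
            renderer.sharedMaterials = materials; // reassign the whole array
        }
    }
}
```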

Here’s how to use it:

  1. Download & unzip the files.
  2. Add FixFBXImportMatsUI.cs to your project’s Editor folder. (If your project doesn’t currently have one, add a folder named “Editor” to your Assets folder.)
  3. Add FixFBXImportMats.cs to your project, wherever makes the most sense.
  4. After importing an FBX with missing Materials, open the Prefab or drag it into a Scene, and add a FixFBXImportMats Component to the root of the import’s hierarchy.
  5. Assign the Default Material you need replaced to the Component’s slot.
    • The default Material will vary depending on your render pipeline and import settings.
    • If using URP, it’s usually the Lit material, located in Packages: Universal RP: Runtime: Materials.
    • A quick way to find it is to select an instanced mesh in the hierarchy, and click on the Material shown in the Inspector.
  6. Leave Clean Up After checked to have the Component remove itself after running.
  7. Click Fix Materials.

Unity VFX Graph Experiments

VFX Graph is Unity’s high-performance pure-GPU particle effects system. I recently helped Chop Chop Games get a sizzle reel out for an upcoming card battler, which involved a fair amount of prerendered particle work in Apple Motion and Blackmagic Fusion. Looking at their competitors in the space, like Slay the Spire and personal favorite Black Book, got me tinkering again with VFX Graph.

It’s an impressive set of tools, but shader experience is definitely a prerequisite. You need to “think in GPU.” One great decision was making Animation Curves a first-class citizen in VFX Graph, allowing for a lot of nuanced per-channel animation. Getting particle motion properly punchy is always a challenge.
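Exposed Animation Curve properties can even be driven from script, which helps when tuning that punchiness. A quick sketch (the “SizeOverLife” property name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch: drive an exposed Animation Curve property on a VFX Graph from code.
// "SizeOverLife" is a hypothetical exposed property name.
public class PunchySparks : MonoBehaviour
{
    public VisualEffect vfx;

    void Start()
    {
        var curve = new AnimationCurve(
            new Keyframe(0f, 0f),
            new Keyframe(0.1f, 1f), // fast attack reads as "punchy"
            new Keyframe(1f, 0f));  // longer decay
        vfx.SetAnimationCurve("SizeOverLife", curve);
    }
}
```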

Shield Up uses two Animation Curves to Etch-a-Sketch the shield outline into being, then a mesh to fill the inside. The sparks are sub-emitted from invisible spawner particles.
Shield Loss uses a single, big flipbook-textured particle for the shield, with the burn line and ash lines animated with Animation Curves.
Vine Attack uses URP quad particles and particle strips with normal maps to fake some depth. It also uses a pair of useful subgraphs: Point Force and Orbit.

You can download these all in a .unitypackage here. The download also includes a couple of potentially useful subgraphs. Point Force can be positive, to push particles away from a given point, or negative, to suck them in. Orbit can also be positive or negative, to control the direction particles orbit around a center point.
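For the curious, the math inside those subgraphs is nothing exotic. In plain C# (rather than nodes), the same per-particle update looks roughly like this:

```csharp
using UnityEngine;

// Plain-C# equivalents of the Point Force and Orbit subgraph math.
public static class SubgraphMathSketch
{
    // Positive strength pushes a particle away from 'center'; negative pulls it in.
    public static Vector3 PointForce(Vector3 velocity, Vector3 position,
        Vector3 center, float strength, float deltaTime)
    {
        Vector3 away = (position - center).normalized;
        return velocity + away * strength * deltaTime;
    }

    // Accelerate perpendicular to the offset from the center; the sign of
    // 'strength' flips the orbit direction.
    public static Vector3 Orbit(Vector3 velocity, Vector3 position,
        Vector3 center, Vector3 axis, float strength, float deltaTime)
    {
        Vector3 tangent = Vector3.Cross(axis, position - center).normalized;
        return velocity + tangent * strength * deltaTime;
    }
}
```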

Demonstration of the Point Force and Orbit subgraphs.

Lillie is the Keeper 1.2 Released

LitK 1.2 is now available on the App Store. It features:

  • Revamped kitchen with period furniture
  • Enhanced UI
  • Improved AR setup experience
  • New event system to support current & future game enhancements
  • Refactored, more reliable UI system
  • New intrusive thoughts
  • More Acre Courier content
  • Bug fixes & small visual improvements throughout

Baking Textures From Bforartists/Blender to Unity

Unlike many tasks, Blender’s powerful but bafflingly designed texture map baking tools are not improved by its UI-focused fork, Bforartists. Here’s how to use them, with a downloadable example (the Hoosier Cabinet from Lillie is the Keeper’s upcoming 1.2 release). Click here to download the cabinet model with textures.

Honestly, if I haven’t done it for a while, I can’t remember how the process works, so this will be as much a cheat sheet for me as you.

Bake options, in the Render Properties sidebar

Since it’s not our focus, we’ll skip quickly through the high-to-low-poly sculpt method. You create a low-polygon model, unwrap its UVs, and make a duplicate (using Duplicate Objects, not Duplicate Linked) in the same world position. Hiding the original, you sculpt a high-detail version, to be baked onto the original as (at minimum) a Normal Map. You can also use Bforartists’ Shading tab to create texture, displacement and color, even applying multiple materials to your high-resolution mesh. Since the UVs don’t have to match between the high and low-poly versions, effects like differently-oriented wood grain are just a matter of mucking with the high-poly sculpt’s UVs.

I’m also largely assuming you’re familiar with importing textures into Unity.

Baking a Normal Map

An Image Texture node
  1. In Object Mode, select the high-poly model.
  2. Multiple-select the low-poly model as well (on Mac, Command-click in the scene hierarchy window).
  3. Under Render Properties (the camera icon in the sidebar) set Render Engine to Cycles. (Baking is not supported in Eevee or Workbench.)
  4. Farther down in the Render Properties sidebar, click to expand the Bake options.
  5. Set Bake Type to Normal.
  6. Check Selected to Active (if it’s not already checked).
  7. For Extrusion, enter the minimum distance the baking system will need to “puff out” your low-poly model’s surfaces to completely enclose your high-poly sculpt. (This may take a few goes, before you hit on the right distance to catch everything without artifacts.)
  8. Set Max Ray Distance to twice the Extrusion (my usual rule of thumb).
  9. Under Target, select Image Textures.
  10. (Wait, shouldn’t there be a control here to select which texture to bake to? Yes, but, Blender.)
  11. So instead, switch from the Main tab to the Shading tab.
  12. Make sure your low-poly model has a Material of its own.
  13. In the View window, create a new texture (Image menu: New) or open an existing texture to overwrite (Image menu: Open…).
  14. Down in the texture nodes window, click Add: Texture: Image Texture, and drop the node into your workspace. Don’t bother connecting it to anything.
  15. In the Image Texture node’s Image Browser menu (picture icon) select the Normal Map texture you’ve just created/opened.
  16. Select the Image Texture node itself. For some reason, this is how you choose which texture to bake to. If an Image Texture node is highlighted in the node workspace, baking will write to it. BE CAREFUL: It’s easy to accidentally bake to the wrong texture, if you’re using texture maps in your Material.
  17. Back in the sidebar, click that big Bake button.
  18. If the results are good, save the texture map from the View window. (Image menu: Save Image.)
  19. If not, play with your Extrusion parameter, and/or your sculpt geometry. Make sure your normals are facing the right way. (The Normal Map baker doesn’t seem to ignore backfaces.)

Baking an Occlusion Map

Ambient Occlusion darkens cracks and crannies in your surface. Most importantly, it keeps the inner edges of your Normal Maps from reflecting a bunch of weird phantom light.

It’s possible to bake Ambient Occlusion (as well as Diffuse and others) from a single model to itself–no high-poly needed. To do this, simply select the model, go to the Bake options and uncheck Selected to Active.

  1. Follow the same procedure down to Step 5, but now set the Bake Type menu to Ambient Occlusion. If baking high-to-low, keep the same Extrusion and Max Ray Distance settings that worked for your Normal Map.
  2. Follow the rest of the procedure above. You can reuse the Image Texture node, just make sure to switch it to the Occlusion Map texture you create/open. Remember to save the texture after the bake.

Baking a Diffuse Map

A baked Diffuse Map, in the Shader tab’s View window

If you’re comfortable with the node-based shader system, and applying multiple materials to a mesh, you can do a lot of interesting texturing in Bforartists.

Typically, you’ll be baking Diffuse maps with a single model, rather than high-poly-to-low. Again, simply select the model, go to the Bake options and uncheck Selected to Active.

  1. Follow the same procedure down to Step 5, but now set the Bake Type menu to Diffuse.
  2. Chances are you don’t want to bake your shadows and global illumination into the Diffuse Map. Under Contributions, uncheck Direct and Indirect.
  3. Follow the rest of the procedure above, with the same reminders.

The Prestige: Baking a MOxS Map

A channel-packed “MOxS” texture for URP

This one’s on Unity. In the Universal Render Pipeline (URP), there’s an awkwardly not-quite-documented way to pack three monochromatic textures into one, saving two texture taps per draw (and textures in memory). The four-channel (RGBA) texture packs a Metallic Map, an Occlusion Map, an unused channel, and a Smoothness Map into one, I guess, “MOxS Map.”

  1. First we’ll need to bake a Metallic Map. Unfortunately, the Bake Type menu has no such option. We’ll get around this in the Shading node editor by temporarily piping our Metallic value into the Emission input (since it’s the least likely to already be in use) and baking an Emit Map:
    • For each Material on the model…
    • If there’s a node piped into the Principled BSDF node’s Metallic input, pipe it into the Emission input instead.
    • If it’s a constant value, just copy it into the Emission color’s V value (in HSV mode).
    • Set the Bake Type to Emit.
    • As above, create/open a texture to overwrite with the Metallic Map, select it in the Image Texture node, highlight the node, and click the Bake button. Save the texture.
  2. Next we need an Occlusion Map. Disconnect/revert any Emission changes you made in your Materials to create the Metallic Map, and bake an Occlusion Map as above.
  3. Finally, we need a Smoothness Map. We can’t natively bake this either, but we can bake its (literal) inverse:
    • Set Bake Type to Roughness.
    • Create/open a texture, bake the Roughness Map to it, and save.
  4. Combine all three textures in Photoshop (as follows), another image editor, or a Unity editor script (see the sketch after this list):
    • Create a new document the same dimensions as your textures.
    • Open the Metallic Map texture, Select All, and Copy.
    • In the new document, click on the Channels tab, select the Red channel, and Paste.
    • Open the Occlusion Map texture, Select All, and Copy.
    • In the new document, select the Green channel, and Paste.
    • Open the Roughness Map texture, Select All.
    • Invert the image (Image menu: Adjustments: Invert) and Copy.
    • In your new document, create an Alpha channel (plus-in-a-box icon, at the bottom of the Channels tab).
    • Select the Alpha channel, and Paste.
    • Save as a Photoshop file. (PNG may try to knock out parts of the image based on the alpha channel, and Unity imports Photoshop files well.)
  5. In Unity, select the “MOxS” texture. Make sure sRGB (Color Texture) and Alpha is Transparency are unchecked. (This is a data texture, so we don’t want Unity applying gamma/color-space conversion to it, especially in a Linear Color Space project.)
  6. In your Unity Material (using the Lit, or Complex Lit shader) assign your MOxS texture to the Metallic Map input. Leave the Smoothness slider alone, as it won’t affect anything, and make sure Source is set to Metallic Alpha.
  7. Finally, plug the MOxS map into the Occlusion Map input as well. (The workflow feels off, but the shader seems to recognize the packed green channel as the one to use.)
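If Photoshop isn’t handy, that channel packing from step 4 can also be done with a small Unity editor script. A sketch, assuming readable (Read/Write enabled) source textures of matching size at placeholder paths; since it writes the PNG bytes directly, the alpha channel survives intact:

```csharp
using System.IO;
using UnityEditor;
using UnityEngine;

// Editor sketch: pack Metallic (R), Occlusion (G), and inverted Roughness (A)
// into a single "MOxS" texture. Paths are placeholders.
public static class MoxsPackerSketch
{
    [MenuItem("Tools/Pack MOxS Texture")]
    static void Pack()
    {
        var metallic  = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Tex/Metallic.png");
        var occlusion = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Tex/Occlusion.png");
        var roughness = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Tex/Roughness.png");

        var packed = new Texture2D(metallic.width, metallic.height,
            TextureFormat.RGBA32, false, linear: true);

        Color[] m = metallic.GetPixels(), o = occlusion.GetPixels(), r = roughness.GetPixels();
        var pixels = new Color[m.Length];
        for (int i = 0; i < pixels.Length; i++)
        {
            // R = metallic, G = occlusion, B = unused, A = smoothness (1 - roughness).
            pixels[i] = new Color(m[i].r, o[i].r, 0f, 1f - r[i].r);
        }
        packed.SetPixels(pixels);
        packed.Apply();

        File.WriteAllBytes("Assets/Tex/MOxS.png", packed.EncodeToPNG());
        AssetDatabase.Refresh();
        // Remember: uncheck sRGB (Color Texture) and Alpha Is Transparency on import.
    }
}
```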

Useful Unity Components: PlaySounds

In this ZIP file you’ll find three C# scripts for Unity: PlaySounds.cs, PlaySoundsMultitrack.cs and PlaySoundsBySpeed.cs. The latter two are subclasses of PlaySounds.cs, and require the former in your project. These small, lightweight scripts are used throughout Lillie is the Keeper (along with a couple other subclasses that are dependent on features of the game).

Basically, they do everything that I wish Unity’s own AudioSource Component did by itself. Play through a list of AudioClips? No problem. Play a random clip from a list? Done. Play OnTriggerEnter()? One click. You can play a single clip or all clips, disable the GameObject after playing, trigger audio from an external script, and monitor playing status with UnityEvents or a simple bool.

Check the scripts’ headers for a full rundown of features and how to use them. You’ll also see helpful tooltips in the Unity Editor.

The two subclassed scripts, PlaySoundsMultitrack and PlaySoundsBySpeed, let you do two additional things. With the former, you can swap between up to four wholly different sets of AudioClips. Think of a windmill randomly playing different sounds from a playlist at different speeds: a slow, creaky set of sound clips at lower speeds, and a higher, whooshier set at high speed. PlaySoundsBySpeed lets you specify a minimum velocity at which to trigger sounds, and scales the volume up from 0 to 100% at a maximum speed. (Setting both speeds equal always plays the sound at normal volume.)

There are a couple things you may want to customize. These are written for rapid prototyping, trying things out, and generally seeing what works. Just about everything that can be public is, rather than using [SerializeField] private. If there are no AudioClips in the list, PlaySounds will simply disable itself; you may prefer to throw an error. Additionally, you’ll notice that an AudioSource component is necessary, but not enforced in code via [RequireComponent(typeof(AudioSource))]. (Instead, PlaySounds logs the issue for you.) This is deliberate, to keep Component coupling loose while trying things out, but may not be what you want in production.
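For reference, a sketch of the two patterns side by side (the class names are mine, and the logging is simplified from what PlaySounds actually does):

```csharp
using UnityEngine;

// 1. The loose, prototyping-friendly pattern: check at runtime and log.
public class LooseAudioDependency : MonoBehaviour
{
    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        if (source == null)
        {
            Debug.LogWarning($"{name}: no AudioSource found; disabling.", this);
            enabled = false;
        }
    }
}

// 2. The strict production pattern: Unity won't let the Component exist
//    without an AudioSource, so the lookup can never fail.
[RequireComponent(typeof(AudioSource))]
public class StrictAudioDependency : MonoBehaviour
{
    AudioSource source;
    void Awake() => source = GetComponent<AudioSource>();
}
```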

I encourage you to use these, without limitations, in your own work. (But if you do something cool, please do let me know!)

LitK: Here Goes Nothing

This is it! “Lillie is the Keeper,” the innovative new AR adventure game for iPhone & iPad, is live…

And best of all, it’s FREE, through January 6!

Explore your own virtual lighthouse, and live Lillie’s story. I’m proud to bring this playable short story to you, and hope you enjoy your time at Switch Rock Light Station.

Download on the App Store

Tiling 3D Noise in Blender

My game “Lillie is the Keeper” needed a small-scale ripple texture for an ocean shader. The in-game shader makes use of a 3d texture, UV sampled in world space horizontally, with the sampler moving up through the texture’s z axis over time to animate it. A first version used the old 4d rotation trick for repeating noise, in which a 4-dimensional noise texture is rotated 360 degrees around the Lovecraftian W axis between UVs 0 and 1. It tiled horizontally, but when the 3d texture repeated (up the z axis) there was an ugly little crossfade between unrelated frames.

I use Bforartists, a UI-focused fork of Blender, for 3d graphics and some texture creation–like this project. It doesn’t fix every pain point, but I can’t recommend it highly enough. This method, and the attached .blend file, will work just the same in mainline Blender.

As there’s no 5d noise function in the shader nodes (Shading tab), for the improved ripple texture we must go back to fundamentals. Ken Perlin’s original version of a solid texture–Perlin Noise–has a lot of complicated math behind it, but geometrically is pretty straightforward: Create a 3d grid, place a point at a pseudorandom location within each box, and apply a function to smoothly interpolate between the points in all three dimensions. (As best I understand it, Perlin’s own improvement, Simplex Noise, replaces the grid with tetrahedrons–triangular pyramids–but that’s at least Whisperer-in-Darkness-grade math for me.)

There is a shader node that can perform similar interpolation: Point Density. Note that you’ll have to switch your renderer to Cycles in order to use it. In Eevee it’ll just output black. This is poorly documented and the interface won’t help you.

The Point Density node takes the vertices of a mesh (or particles of a particle system) and outputs a greyscale representation of their density. With it, it’s possible to create noise from your own handmade 3d grid of points–like an array of cubes. Since you’re creating the vertices yourself, making them repeat is as simple as replicating them in x, y and z–for instance, with three Array modifiers.

To start out, create a 1x1x1 cube. Pop into Edit Mode and set the cube’s origin to its leftmost, bottom-most, hindmost vertex. Back in Object Mode, move it to -1.5, -1.5, -1.5. Just one cube (8 points) won’t look like much as noise, so we’ll double it in all three dimensions. Scale your cube to 0.5, 0.5, 0.5. Now double it in X, Y and Z with three Array modifiers: Add an Array modifier, set the Count to 2, and the Relative Offset’s Factor X to 2. Add two more Array modifiers, for the Y and Z axes (Relative Offset Factor Y to 2 on the second, and Z to 2 on the third).

We can randomize our cubes’ vertex positions with another modifier: Displace deforms the mesh based on a texture. The app can generate this noise texture for you. Go to Texture Properties, create a new Texture, set the type to “Clouds,” and select “Color” rather than “Greyscale.” Go back to your cubes’ modifiers, add a Displace modifier (after the three Array modifiers), set the Coordinates to “Global,” the Direction to “RGB to XYZ” and the space to “Local.” Play with the Strength and Midlevel properties if you want more distortion in your cubes.

A cube of (distorted) cubes

Now you’ve got a box of eight distorted cubes, sort of down in the lower left-hand corner of the world axis. Let’s replicate them with three further Array modifiers: After your Displace modifier, add an Array, and set the Count to 3. Since the Displace modifier is messing with the overall dimensions of the cubes, deselect Relative Offset and select Constant Offset. Set Distance X to 2. Now, make two more Array modifiers, for Y and Z, also with Constant Offset Distance 2 (in Y and Z respectively). You should now have a repeating set of distorted cubes in all 3 dimensions.

Hide your cubes (including from renders–the camera icon in the outliner). Create a 1×1 plane at the origin. Delete any lights in your scene. Set the camera’s output to a texture-friendly square, like 512×512 pixels (printer icon, Resolution). Set the camera to Orthographic (camera icon, Lens: Type) and aim it straight on to your plane. Create a new Material on the plane (material ball icon, New).

Switch to the Shading tab with your new Material, delete the “Principled BSDF” node, and instead add a “Point Density” node. Select “Object Vertices” rather than “Particle System.” Under Object, select your mesh of repeating distorted cubes. Set the Space to “World Space,” the Radius to 0.5, and the Interpolation to “Cubic.” Drag the node’s Density output straight to the Material Output node’s Surface input.

Shader nodes

That’s it, in a nutshell. Move your plane between -0.5 and 0.5 Z, and the noise pattern will repeat. It’ll also wrap around at the X and Y edges.

You can create finer-grained noise by doubling your cubes and halving their scale. Or double-doubling and half-halving. Or double-double-doubling… You get the idea. For each iteration, halve the scale, double the Count of your first 3 modifiers, then double the distance between them with your last 3 modifiers. I also recommend halving the Size attribute of your Clouds texture each time.

Note that at larger numbers of cubes (say 16 or 32 per box) the “Point Density” node’s Resolution attribute will need to be increased. Add 100 at a time, until there’s no visible difference adding 100 more. Accept the hit in performance. (If you see seams in the final rendered texture, too low a Resolution setting here is usually the culprit.)

Download the Bforartists/Blender file here: Repeated Tiled Noise v2.blend

In the file, the Demo collection will demonstrate the simple version, while the Production collection is my final water ripples setup. In the simple version, there are also some unused shader nodes demonstrating a setup for combining noise at different scales to create more complex output–again, just like Perlin Noise.

The Production collection, my own version for water ripples, does a few additional things:

  • I’m creating the 3d texture in Unity, which requires each frame (vertical slice of the 3d noise) to be stacked side-by-side in a single image file. As such, I’ve added an Array modifier to the plane I’m rendering, so that it becomes 16 side-by-side squares stepping up between -0.5 and (almost) 0.5. (Almost 0.5, because step 17 would be 0.5–and we don’t want to repeat a frame. The app will do basic math when setting fields numerically, so entering “1/16” will give you 0.0625…) The orthographic camera is adjusted to render it all in one frame.
  • Within the shader graph, I’ve made the position Vector fed into the “Point Density” node a combination of the world Z coordinate and the planes’ UV coordinates (standing in for X and Y).
  • Since I want a lot less change as the noise loops in the Z axis than in the horizontal directions, I’ve not halved the number and scale of the cubes along the Z axis, and I’ve split the Displace modifiers’ textures into three different Greyscale textures accordingly.
  • There’s an RGB Curves node added after the Point Density node to make the interpolation more wave-like.
  • Finally, the greyscale heights have been translated into a Normal Map via a Bump node.
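On the Unity side, slicing that 16-frame strip into a Texture3D can be scripted. A sketch, assuming a readable strip of square frames at a placeholder path:

```csharp
using UnityEditor;
using UnityEngine;

// Editor sketch: slice a horizontal strip of square frames into a Texture3D
// asset. Path and frame count are placeholders; the strip must be readable.
public static class NoiseStripToTexture3D
{
    [MenuItem("Tools/Build Ripple Texture3D")]
    static void Build()
    {
        const int frames = 16;
        var strip = AssetDatabase.LoadAssetAtPath<Texture2D>("Assets/Tex/RippleStrip.png");
        int size = strip.height; // square frames: strip.width == frames * size

        var colors = new Color[size * size * frames];
        for (int z = 0; z < frames; z++)
        {
            // Frame z is the size-by-size block starting at x = z * size.
            Color[] slice = strip.GetPixels(z * size, 0, size, size);
            slice.CopyTo(colors, z * size * size);
        }

        var tex3d = new Texture3D(size, size, frames, TextureFormat.RGBA32, false)
        {
            wrapMode = TextureWrapMode.Repeat // loops in x, y, and z
        };
        tex3d.SetPixels(colors);
        tex3d.Apply();
        AssetDatabase.CreateAsset(tex3d, "Assets/Tex/RippleNoise3D.asset");
    }
}
```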

Smooth sailing!

Solus: 2.5D Character Control & Footprints

The protagonist (we never came up with a name for her) moves along a 2D plane in a 3D environment, with generally realistic platforming movement inspired by Flashback: The Quest For Identity. The system uses the Unity physics engine, manually controlling the character’s momentum to create grabbing and climbing, and adds quadratic drag for “crunchier” falling per Bennett Foddy’s 2015 GDC lecture. I started by modifying an existing character control script, but the final system ended up a complete rewrite.
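The quadratic drag itself is nearly a one-liner: a force opposing velocity, scaled by speed squared. Roughly (the coefficient is a placeholder to tune):

```csharp
using UnityEngine;

// Sketch of quadratic drag for "crunchier" falls: drag grows with the square
// of speed, so walking is barely damped but long falls hit terminal velocity.
[RequireComponent(typeof(Rigidbody))]
public class QuadraticDrag : MonoBehaviour
{
    public float coefficient = 0.05f; // placeholder; tune to taste

    Rigidbody body;
    void Awake() => body = GetComponent<Rigidbody>();

    void FixedUpdate()
    {
        Vector3 v = body.velocity;
        body.AddForce(-coefficient * v.magnitude * v); // F = -k * |v| * v
    }
}
```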

Character interaction is controlled with Layers. If an object has a Collider and is in Layer “Walkable,” the protagonist can traverse it, including ledge grabbing when appropriate. Rope climbing is the same, only with Layer “ClimbableRope.” (Wall climbing was also implemented, but cut for time.)

Want to play with it? You can download the Unity package here. Feel free to use the controller scripts & prefab setup for whatever you’d like (but not Anastasia Jacobsen’s cute character model, please!)

Footprints are based on the method used in Röki. At the animation frames of the walking and running cycles where the foot first makes contact with the ground, an animation event is called with a boolean indicating left or right foot. A Projector Prefab with a Normal Map Texture is then instantiated at the location of the foot’s Bone. The Prefab has its own script, which fades the Normal Map out over 10 seconds, and then self-deletes.
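A sketch of both halves, with names and the material property invented for illustration (note that Unity animation events pass ints rather than bools, so 0/1 stands in for left/right here):

```csharp
using UnityEngine;

// On the character: called by animation events at foot-contact frames.
public class FootprintSpawner : MonoBehaviour
{
    public GameObject footprintPrefab;    // the fading Projector prefab
    public Transform leftFoot, rightFoot; // foot bones

    public void OnFootDown(int foot)      // 0 = left, 1 = right
    {
        Transform bone = foot == 0 ? leftFoot : rightFoot;
        Instantiate(footprintPrefab, bone.position, bone.rotation);
    }
}

// On the prefab: fade the projected footprint out, then self-delete.
// "_Opacity" is a placeholder property name.
public class FootprintFade : MonoBehaviour
{
    public float lifetime = 10f;
    public Renderer projectorRenderer;

    float age;
    Material material;

    void Start() => material = projectorRenderer.material; // per-instance copy

    void Update()
    {
        age += Time.deltaTime;
        material.SetFloat("_Opacity", Mathf.Clamp01(1f - age / lifetime));
        if (age >= lifetime) Destroy(gameObject);
    }
}
```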

The Solus demo is available to download and play on Itch.io (Mac & Windows).

Solus: Lighting Up the Desert

Anastasia Jacobsen’s concept for Solus is an attempt at a semi-hard-sci-fi take on Alex McDowell’s “Planet JUNK” collaboration. The Earth has somehow stopped rotating, creating a 6-month summer/winter cycle and migrating the oceans away from the equator.

Logo art by Anastasia Jacobsen

In the demo, the player journeys down into the sand-buried remains of a skyscraper looking for water. For visual interest (and irony) I suggested the Futurist city of Brasilia, which went over well with the team: Niek Meffert, Anastasia Jacobsen, Rosa Friholm, Ida Lilja, and myself. I was Technical Artist and Lighting Designer. (Solus was the first of two Planet JUNK collaborations. Many lessons learned were later applied to Shrooms.)

Solus uses Unity’s High Definition Render Pipeline (HDRP), allowing a wide variety of realistic volumetric effects—the simulation of light’s interaction with microscopic particles suspended in air, like smoke, water droplets and dust.

Desert scenes may never escape from Journey’s long shadow…

Topside, the lighting is very simple. There’s a Directional Light (sun) and not much else. Fill lighting is created by Global Illumination from the skybox. Blowing sand is created with the Unity VFX Graph. A number of post-processing effects are added, including Bloom, Tonemapping, Color Curve adjustments (for a more cinematic “desert” look) and a custom sparkle shader in the brightest areas. A faint volumetric Fog pervades the scene, to create a dusty atmosphere. Slightly behind the main plane of action, a second “thicker” Fog Volume is added, faded from bottom to top, to make the background distances appear greater and create a Bryce-like height fog effect.

Thank you, anonymous graffito

The underground lighting is primarily driven by a Point Light attached to the character’s lantern. The Volumetric Fog is thicker, increasing with depth into the buried skyscraper. An extremely bright Spot Light shines in through the entrance, volumetric and colored bright blue to contrast with the warmer lantern light. A similar, very narrow bright blue Spot Light shines down from the top of the first elevator shaft, as if a tiny stab of sunlight were blazing in through a chink in the roof. Farther down, mushrooms glow with an eerie green Emissive Material, casting light onto their surroundings via covert green Area Lights.

The theatrical darkness demanded that a final Light be added, activated only while editing the scene—literally named “Work Light.”

The Solus demo is available to download and play on Itch.io (Mac & Windows).

Shrooms: HDRP in URP

The Shrooms demo runs on Unity’s mobile-friendly Universal Render Pipeline (URP), which doesn’t support volumetric fog and lighting like the High Definition Render Pipeline (HDRP) does. An early design decision was to lock the camera to only about 20 degrees of rotation off the default view axis. This allows many computationally inexpensive (oldschool) cheats and tricks to create rich atmosphere. My mantra was: “HDRP in URP.”

Lighting

In the Shrooms world, lightbulb is a job. Every light source is a glowing, bioluminescent mushroom person. The Copenhagen-inspired strings of street lamps that draw the viewer through the level each contain an animated Bulb Guy (created by Niek Meffert) sitting in a little wire gondola underneath a beat-up reflector. It’s a living.

He/she, and the remainder of the lamp, are set to not cast shadows, and contain a downward-facing Spot Light. There are 37 in all, in addition to a wan Directional Light sun from the left—which is a problem, because Unity’s URP has a hard limit of 8 lights per mesh. The Unity Terrain tool splits the ground into a couple dozen smaller tiles, but the initial result was most of the light sources being simply ignored by the ground mesh, and glows often visibly sliced off where they crossed tile boundaries. Baked Lightmaps and realtime lighting in URP both share the lights-per-mesh limit.

Quick & dirty normal map in Photoshop: Filter > Other > High Pass, Filter > 3D > Generate Normal Map

The solution was to place pieces of flattened human-world junk along the ground, to disguise the boundaries and ensure that every light creates a visible effect. The junk shader uses a Texture stitched together in Photoshop from derelict building photographs, with a rough Normal Map.

Like the noise functions, the Texture is applied in World Space, allowing the same low-res crumpled square of debris to be recycled, stretched and resized ad nauseam, with the Texture remaining undistorted and matching up perfectly at object boundaries. I’ve been a big fan of using world space shaders to create visual variety in instanced models since The House of Time–which, yes, will finally get some big updates this summer.

Simple exponential-squared Distance Fog ties the effects together, creating additional depth, and a Bloom post effect softens the edges of windows and other bright objects to match. A Depth of Field post effect further softens objects in the extreme foreground, adding to the murky intimacy, and the deep background is a hand-painted backdrop by Natasha Beck in an Unlit Shader.

HDRP in URP: A mix of simple, oldschool tricks and modern GPU-driven effects.

Faking Volumetrics

It’s a not-so-dirty not-so-secret that, even in high-end film compositing software, volumetric lighting is faked by slicing the camera’s Z-axis into stacked, transparent planes at render time. This is what Shrooms does manually. Using the limited camera view and careful placement, patches of fog are created with a shader on a small stack of transparent planes. The shader multiplies a half-circle gradient alpha Texture with a procedural noise function. The noise slowly migrates up the Y-axis, as if mist were rising off the swamp. The noise is generated in World Space, so that scaling, squashing or stretching the fog planes creates no distortion to the noise pattern.
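The stack itself can be as simple as a handful of offset quads. A sketch (names, counts and spacing are arbitrary):

```csharp
using UnityEngine;

// Sketch: build a patch of fake volumetric fog from a small stack of
// transparent quads, stepped back along the local Z axis.
public class FogPlaneStack : MonoBehaviour
{
    public GameObject fogPlanePrefab; // quad with the gradient-times-noise shader
    public int planeCount = 4;        // more planes = smoother, costlier fog
    public float spacing = 0.5f;

    void Start()
    {
        for (int i = 0; i < planeCount; i++)
        {
            var plane = Instantiate(fogPlanePrefab, transform);
            plane.transform.localPosition = Vector3.forward * (i * spacing);
        }
    }
}
```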

Light glows work the same way. Each light fixture model contains a set of three planes: Two larger, colored, more transparent ones in front and back, and a smaller, more opaque, white plane in the center. The alpha Texture is a narrow cone gradient, aimed downward, and the World Space noise function slowly falls, like misty drizzle. The bright spotlights in the arena and cafe are just variants on this scheme, and a circular glow is used in a couple of additional spots.

Shrooms: Color & Forms

In Niek Meffert’s concept for Shrooms, giant mushroom people battle giant plant people in their swampy homeland, while grinding the remnants of humanity under their figurative boots. The dev team was Meffert, Lucas Oliveira, Sabrina Christiansen, Kaspar Dahl, Natasha Beck, and myself as Lighting Designer and Technical Artist. You can check out the demo (Mac & Windows) on Itch.io here.

Frequently heard during environmental modeling: “It’s good, Sabby. Get rid of the straight lines.”

The objective was to create a bright, colorful, murky, fungal setting. Fungus suggests bright, “sickly-sweet” tertiary colors, and we wanted an organic, lively scene. However, with too much clashing color the scene would have become busy and unreadable. Just finding your way and knowing what to interact with would have meant a frustrating cognitive load.

For that reason, I worked with the team to enforce certain rules to control user attention. The main character is in complementary colors. The bad guy’s color palette is a high-saturation split complement. NPC characters each have a single, dominant color. Non-interactive parts of the scene favor muted, analogous colors.

Lighting rules were enforced as well. Unimportant parts of the level fall back into mist and shadow. The character path is comparatively well lit, always suggesting where the player can and can’t go. Interactive parts of the scene (usually just-for-fun destructibles) pop comparatively, while others harmonize.

Forms avoid straight lines, with blobby, asymmetrical and impractical shapes but—importantly—recognizable outlines. Classic Warcraft games, and the art of Chris Sanders (Lilo & Stitch) were strong references here.

And of course, what’s the point of a game without asshole physics?

Basilicum on Reddit – Raffle Extended!

Within the hour, I’ll be posting a Unity WebGL game to Reddit, in hopes of collecting a statistically meaningful sample of responses to a questionnaire. In addition, through Tuesday, June 8 at 22:00 CET I’m conducting a raffle to encourage participation. This could go wrong in so many ways, and only right in one.

The characters’ anxious hand-wringing is my own.

Edit: The raffle is open! Click here to play the test app.

Performs best in Firefox and Chrome. Feel free to play the game as much as you’d like, but please only submit one survey form.

Terms and Conditions:

Persons over 18 who submit the survey between Friday, June 4 at 22:00 CET and Tuesday, June 8 at 22:00 CET, and enter a valid Steam profile name, will be eligible for a raffle, to be conducted by June 20th, 2021.

-One first-place winner will receive a $75 USD digital gift card, sent through Steam.
-Two runners up will receive $25 USD digital gift cards, also sent through Steam.

The winners will receive a friend request from my Steam account, “rhinocrate,” and receive their digital gift cards as a friend-to-friend gift.

I wish it weren’t necessary to say, but I must reserve the right to disqualify participants based on evidence of ballot-stuffing or other forms of inauthentic or abusive behavior. There is a limit of one entry per person. Steam accounts must have at least one purchased game to be eligible for the raffle. If fewer than 20 valid responses are received, the raffle will be cancelled. No data collected will be used by me or anyone else for any purpose beyond tabulating results and completing the one-time raffle, nor will personally-identifying information (including IP addresses and Steam account handles) be distributed.

Tiny Convoy: Scaling Back

It was always an ambitious project, and not everything made it over the finish line.

What Got Cut

Glowing=on made it into the game, but no useful HUD feedback

UI Feedback: There’s a lot happening behind the scenes that the game doesn’t explain well. Every “CPU” (the brains of the robot, but also a physical robot part in the game) has randomized stats: Processing, Memory, Inputs and Outputs. These special stats aren’t altered by Upgrades, but they can be boosted by being close to (“meshing with”) nearby robots with higher stats. Processing governs how often an AI-controlled bot can reevaluate its choices. Memory is how much you can’t see but can “remember”–the fog of war. Inputs set how many sensors you can equip; Outputs set how many moving parts you can control. Likewise, damage isn’t well described, although your damaged parts do noticeably work less well.

Multiple “Car” Robots: Everything the bots do is designed around being able to take up more than one tile, dragging parts behind like train cars. Sadly, none of this made it into the final game, making even the word “convoy” seem slightly out of place. Bots sitting on top of other bots and being carried along is–as best I can tell–entirely possible even in the demo build, but without trailers there’s not much point to it. So, no, we don’t get to play Tiny Convoy: Fury Road.

The Conversation Grid: The idea was to coordinate with your convoy without using words. You’d click on a friend and their internal map (from the Pathfinder) would come up as a grid of little icons. You could click on things to give them “ideas,” or to dissuade them from doing something dumb. It would have fed into their AI, not as a command, but as one of the AI’s competing ideas, with a boosted weight–sort of like the forgotten but brilliant “Galapagos.”

That Said…

Tutorial subs came very late in the design process, when it was clear playtesters couldn’t understand much of what they could do in the game–breaking design pillar #1

There is the start of a fun little game here. The many interacting systems largely work as intended, and cross-talk in interesting ways. The whole visual and audio presentation is inviting and detailed. With more content, fine-tuning and iterative playtesting, this could easily become a very good game.

But, on to second semester!