Baking Textures From Bforartists/Blender to Unity

Unlike many tasks, Blender’s powerful but bafflingly designed texture map baking tools are not improved by its UI-focused fork, Bforartists. Here’s how to use them, with a downloadable example (the Hoosier Cabinet from Lillie is the Keeper’s upcoming 1.2 release). Click here to download the cabinet model with textures.

Honestly, if I haven’t done it for a while, I can’t remember how the process works, so this will be as much a cheat sheet for me as for you.

Bake options, in the Render Properties sidebar

Since it’s not our focus, we’ll skip quickly through the high-to-low-poly sculpt method. You create a low-polygon model, unwrap its UVs, and make a duplicate (using Duplicate Objects, not Duplicate Linked) in the same world position. Hiding the original, you sculpt a high-detail version, to be baked onto the original as (at minimum) a Normal Map. You can also use Bforartists’ Shading tab to create texture, displacement and color, even applying multiple materials to your high-resolution mesh. Since the UVs don’t have to match between the high and low-poly versions, effects like differently-oriented wood grain are just a matter of mucking with the high-poly sculpt’s UVs.

I’m also largely assuming you’re familiar with importing textures into Unity.

Baking a Normal Map

An Image Texture node
  1. In Object Mode, select the high-poly model.
  2. Multiple-select the low-poly model as well (on Mac, Command-click in the scene hierarchy window).
  3. Under Render Properties (the camera icon in the sidebar) set Render Engine to Cycles. (Baking is not supported in Eevee or Workbench.)
  4. Farther down in the Render Properties sidebar, click to expand the Bake options.
  5. Set Bake Type to Normal.
  6. Check Selected to Active (if it’s not already checked).
  7. For Extrusion, enter the minimum distance the baking system will need to “puff out” your low-poly model’s surfaces to completely enclose your high-poly sculpt. (This may take a few goes, before you hit on the right distance to catch everything without artifacts.)
  8. I usually set Max Ray Distance to twice the Extrusion.
  9. Under Target, select Image Textures.
  10. (Wait, shouldn’t there be a control here to select which texture to bake to? Yes, but, Blender.)
  11. So instead, switch from the Main tab to the Shading tab.
  12. Make sure your low-poly model has a Material of its own.
  13. In the View window, create a new texture (Image menu: New) or open an existing texture to overwrite (Image menu: Open…).
  14. Down in the texture nodes window, click Add: Texture: Image Texture, and drop the node into your workspace. Don’t bother connecting it to anything.
  15. In the Image Texture node’s Image Browser menu (picture icon), select the Normal Map texture you’ve just created/opened.
  16. Select the Image Texture node itself. For some reason, this is how you choose which texture to bake to. If an Image Texture node is highlighted in the node workspace, baking will write to it.
  17. Back in the sidebar, click that big Bake button.
  18. If the results are good, save the texture map from the View window. (Image menu: Save Image.)
  19. If not, play with your Extrusion parameter, and/or your sculpt geometry. Make sure your normals are facing the right way. (The Normal Map baker doesn’t seem to ignore backfaces.)
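For repeat bakes, the settings half of this checklist (steps 3 through 9, plus the Bake itself) can be scripted. Here’s a hypothetical sketch, assuming Blender 3.x property names; it only runs inside Blender/Bforartists, where the bpy module exists:

```python
# Sketch: mirror of steps 3-9 and 17 as Blender Python. Property names
# (scene.cycles.bake_type, scene.render.bake.*) are assumed from the
# Blender 3.x API; the function name is my own.
def configure_and_run_normal_bake(extrusion):
    import bpy  # available only inside Blender/Bforartists
    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'            # step 3: baking needs Cycles
    scene.cycles.bake_type = 'NORMAL'         # step 5
    bake = scene.render.bake
    bake.use_selected_to_active = True        # step 6
    bake.cage_extrusion = extrusion           # step 7: "puff out" distance
    bake.max_ray_distance = extrusion * 2.0   # step 8's rule of thumb
    bake.target = 'IMAGE_TEXTURES'            # step 9
    bpy.ops.object.bake(type='NORMAL')        # step 17: the big Bake button
```

Selecting the target image (steps 10 through 16) still happens in the Shading tab, since the bake writes to whichever Image Texture node is highlighted.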

Baking an Occlusion Map

Ambient Occlusion darkens cracks and crannies in your surface. Most importantly, it keeps the inner edges of your Normal Maps from reflecting a bunch of weird phantom light.

  1. Follow the same procedure down to Step 5, but now set the Bake Type menu to Ambient Occlusion. Keep the same Extrusion and Max Ray Distance settings that worked for your Normal Map.
  2. Follow the rest of the procedure above. You can reuse the Image Texture node, just make sure to switch it to the Occlusion Map texture you create/open. Remember to save the texture after the bake.

Baking a Diffuse Map

A baked Diffuse Map, in the Shader tab’s View window

If you’re comfortable with the node-based shader system, and applying multiple materials to a mesh, you can do a lot of interesting texturing in Bforartists.

  1. Follow the same procedure down to Step 5, but now set the Bake Type menu to Diffuse.
  2. Chances are you don’t want to bake your shadows and global illumination into the Diffuse Map. Under Contributions, uncheck Direct and Indirect.
  3. Follow the rest of the procedure above, with the same reminders about the Image Texture node and saving.
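The Diffuse variation can be sketched the same way, again assuming Blender 3.x property names (the use_pass_* flags are the Contributions checkboxes from step 2):

```python
# Hypothetical bpy sketch of the Diffuse bake setup: same machinery as
# the Normal bake, but with Direct and Indirect contributions off so
# only base color is written, not lighting.
def configure_diffuse_bake():
    import bpy  # available only inside Blender/Bforartists
    scene = bpy.context.scene
    scene.cycles.bake_type = 'DIFFUSE'
    bake = scene.render.bake
    bake.use_pass_direct = False    # don't bake in direct lighting
    bake.use_pass_indirect = False  # don't bake in global illumination
    bake.use_pass_color = True      # keep the albedo contribution
```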

The Prestige: Baking a MOxS Map

A channel-packed “MOxS” texture for URP

This one’s on Unity. In the Universal Render Pipeline (URP), there’s an awkwardly not-quite-documented way to pack three monochromatic textures into one, saving two texture taps per draw (and two textures in memory). The four-channel (RGBA) texture packs a Metallic Map, an Occlusion Map, an unused channel, and a Smoothness Map into one, I guess, “MOxS Map.”

  1. First we’ll need to bake a Metallic Map. Unfortunately, the Bake Type menu has no such option. We’ll get around this in the Shading node editor by temporarily piping our Metallic value into the Emission input (since it’s the least likely to already be in use) and baking an Emit Map:
    • For each Material on the high-poly model…
    • If there’s a node piped into the Principled BSDF node’s Metallic input, pipe it into the Emission input instead.
    • If it’s a constant value, just copy it into the Emission color’s V value (in HSV mode).
    • Set the Bake Type to Emit.
    • As above, create/open a texture to overwrite with the Metallic Map, select it in the Image Texture node, highlight the node, and click the Bake button. Save the texture.
  2. Next we need an Occlusion Map. Disconnect/revert any Emission changes you made in your Materials to create the Metallic Map, and bake an Occlusion Map as above.
  3. Finally, we need a Smoothness Map. We can’t natively bake this either, but we can bake its (literal) inverse:
    • Set Bake Type to Roughness.
    • Create/open a texture, bake the Roughness Map to it, and save.
  4. Combine all three textures in Photoshop (as follows) or another image editor:
    • Create a new document the same dimensions as your textures.
    • Open the Metallic Map texture, Select All, and Copy.
    • In the new document, click on the Channels tab, select the Red channel, and Paste.
    • Open the Occlusion Map texture, Select All, and Copy.
    • In the new document, select the Green channel, and Paste.
    • Open the Roughness Map texture, Select All.
    • Invert the image (Image menu: Adjustments: Invert) and Copy.
    • In your new document, create an Alpha channel (plus-in-a-box icon, at the bottom of the Channels tab).
    • Select the Alpha channel, and Paste.
    • Save as a Photoshop file. (PNG may try to knock out parts of the image based on the alpha channel, and Unity imports Photoshop files well.)
  5. In Unity, select the “MOxS” texture. Make sure sRGB (Color Texture) and Alpha is Transparency are unchecked. (This is a data texture, so we don’t want its values gamma-corrected for display, especially in a Linear Color Space project.)
  6. In your Unity Material (using the Lit or Complex Lit shader) assign your MOxS texture to the Metallic Map input. Leave the Smoothness slider alone, as it won’t affect anything, and make sure Source is set to Metallic Alpha.
  7. Finally, plug the MOxS map into the Occlusion Map input as well. (The workflow feels off, but the shader recognizes the packed green channel as the one to use.)
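The channel packing in step 4 is simple enough to express as code. A minimal pure-Python sketch (no image library; the function name pack_moxs is my own, and each map is assumed to be a flat list of grayscale values in 0..1):

```python
# Pack three grayscale maps into RGBA "MOxS" pixels:
#   R = metallic, G = occlusion, B = unused, A = smoothness,
# where smoothness is the literal inverse of roughness (1 - r).
def pack_moxs(metallic, occlusion, roughness):
    assert len(metallic) == len(occlusion) == len(roughness)
    return [
        (m, o, 0.0, 1.0 - r)  # alpha holds inverted roughness
        for m, o, r in zip(metallic, occlusion, roughness)
    ]

# Two example pixels: a fully rough metal, and a smoother dielectric.
pixels = pack_moxs([1.0, 0.0], [1.0, 0.5], [1.0, 0.25])
# pixels == [(1.0, 1.0, 0.0, 0.0), (0.0, 0.5, 0.0, 0.75)]
```

The Photoshop steps above do exactly this, one channel at a time, with the Invert adjustment playing the role of `1.0 - r`.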

Bunk – Downloadable .blend File

With the first of the planned updates out, I’ve improved some of the visual elements that were just sort of roughed in for the initial release. (Because if you wait until everything’s perfect, you’ll never ship.) I wanted to share a little of my process.

Really, Lillie. Make your own bed.

Here is the ZIP file. Everything is modelled in Bforartists, the UI-focused fork of Blender, and the file is 100% compatible with the mainline app. I use proxy Materials in Bforartists, as they’re easily replaced with native URP Materials in the .fbx file import dialogue. I find it more reliable to place unused/support items into a separate Scene Collection, and export the Active Collection from the .fbx export dialogue, rather than selecting items manually. Normally I do the bevelling by hand, but on the bunks I tried out the Bevel Modifier, followed by a light Decimate Modifier, to clean up unnecessary flat geometry. It works okay, and it’s quick, but the polygon count ends up higher than necessary. Because of the way Unity handles animations, the curtains are separate files.

The bunks are based loosely on photo references from the New Brighton Lighthouse. They’re meant to feel friendly and cozy, as they’re Lillie’s first sanctuary after the disastrous boat trip. In my head canon (which I guess makes it the official canon) we’re playing through Lillie’s memories, fears and anxieties. In “real life” she succeeded at all of these tasks the first time, but–like many of us–obsesses over what could have gone wrong. (Which are our in-game failures and resets.)

Rounded Realism

Having cut my 3D teeth on Hash Animation:Master, I like working with subdivision surfaces, until it’s time for a high-to-low poly sculpt for normal map baking. It’s always interesting how clothing only looks “right” when it’s modeled after a plausible sewing pattern.

Minus a few things out of my control, it looks like Lillie is the Keeper is only a week or two from release.

Without a AAA-scale production pipeline, I needed to develop an art style that would maximize impact while easing development–especially since I’d be modelling everything myself. Low-polygon games like Black Book and Röki have had great artistic success, while voxel-style titles like Minecraft and Roblox have enjoyed great commercial success, both styles riding the “high-tech low-tech” aesthetic of my old professor. Having observed that modern iPhones (even in web browsers, via WebGL) can easily draw a very large number of polygons, I decided to lean instead into something I call “rounded realism.” This style builds on the work of artists like Aron Weisenfeld, Zinaida Serebriakova and Chris Van Allsburg, in which figures and objects are realistically textured and atmospherically lit, but conform subtly toward primitive solids.

In practice, rounded realism means that textures are realistic (photographic, when possible, utilizing Adobe Capture) and lighting is clean and realistic, but unimportant details are missing. Corners are bevelled, with sizable flat surfaces between. Visual outlines are clean and geometric, with a minimum of visual clutter. Faked volumetric lighting and other transparent elements are used extensively to create depth, running against the orthodoxy that their layered overdraw will kill iOS performance.

Clothing was some of the most demanding work. Mayme’s Edwardian sailor bodice outfit closely follows a custom build by Katja Kuitunen, based on a vintage piece from the era. It’s constructed from Kuitunen’s sewing references–with help from my girlfriend, who is brilliant with these things. The skirt (a separate piece, despite the shirt’s matching fabric) is a simple fabric tube, gathered about the waist, with realtime cloth simulation. Everything is designed to be plausible, but clean, geometric, and simplified.

It’s been a big job, and there’s more I’d like to do for a 1.1 release. The only models from the game that aren’t my own are human bodies built with MakeHuman, though even they’ve been resculpted and touched-up. Clothing and hair are all hand modeled.

Tiling 3D Noise in Blender

My game “Lillie is the Keeper” needed a small-scale ripple texture for an ocean shader. The in-game shader makes use of a 3d texture, UV sampled in world space horizontally, with the sampler moving up through the texture’s z axis over time to animate it. A first version used the old 4d rotation trick for repeating noise, in which a 4-dimensional noise texture is rotated 360 degrees around the Lovecraftian W axis between UVs 0 and 1. It tiled horizontally, but when the 3d texture repeated (up the z axis) there was an ugly little crossfade between unrelated frames.
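The "4d rotation trick" works one dimension down, too, which makes it easier to see: to loop 1d noise seamlessly, you sample 2d noise around a circle, so t = 0 and t = 1 land on the same point. A small sketch, with a caller-supplied noise2d function standing in for a real noise generator:

```python
# Looping noise via circular sampling: walking t from 0 to 1 traces a
# full circle in the higher-dimensional noise, so the ends meet exactly.
# The same idea in 3d+1 is the "rotate around W" trick described above.
import math

def looping_noise(t, noise2d, radius=1.0):
    angle = 2.0 * math.pi * t
    return noise2d(radius * math.cos(angle), radius * math.sin(angle))
```

The catch the post runs into is exactly what this can’t fix: the trick tiles one axis at a time, and there’s no 5d noise available to loop a 3d texture in its third dimension.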

I use Bforartists, a UI-focused fork of Blender, for 3d graphics and some texture creation–like this project. It doesn’t fix every pain point, but I can’t recommend it highly enough. This method, and the attached .blend file, will work just the same in mainline Blender.

As there’s no 5d noise function in the shader nodes (Shading tab), for the improved ripple texture we must go back to fundamentals. Ken Perlin’s original version of a solid texture–Perlin Noise–has a lot of complicated math behind it, but geometrically is pretty straightforward: Create a 3d grid, place a point at a pseudorandom location within each box, and apply a function to smoothly interpolate between the points in all three dimensions. (As best I understand it, Perlin’s own improvement, Simplex Noise, replaces the grid with tetrahedrons–triangular pyramids–but that’s at least Whisperer-in-Darkness-grade math for me.)

There is a shader node that can perform similar interpolation: Point Density. Note that you’ll have to switch your renderer to Cycles in order to use it. In Eevee it’ll just output black. This is poorly documented and the interface won’t help you.

The Point Density node takes the vertices of a mesh (or particles of a particle system) and outputs a greyscale representation of their density. With it, it’s possible to create noise from your own handmade 3d grid of points–like an array of cubes. Since you’re creating the vertices yourself, making them repeat is as simple as replicating them in x, y and z–for instance, with three Array modifiers.

To start out, create a 1x1x1 cube. Pop into Edit Mode and set the cube’s origin to its leftmost, bottom-most, hindmost vertex. Back in Object Mode, move it to -1.5, -1.5, -1.5. Just one cube (8 points) won’t look like much as noise, so we’ll double it in all three dimensions. Scale your cube to 0.5, 0.5, 0.5. Now double it in X, Y and Z with three Array modifiers: Add an Array modifier, set the Count to 2, and the Relative Offset’s Factor X to 2. Add two more Array modifiers, for the Y and Z axes (Relative Offset Factor Y to 2 on the second, and Z to 2 on the third).

We can randomize our cubes’ vertex positions with another modifier: Displace deforms the mesh based on a texture. The app can generate this noise texture for you. Go to Texture Properties, create a new Texture, set the type to “Clouds,” and select “Color” rather than “Greyscale.” Go back to your cubes’ modifiers, add a Displace modifier (after the three Array modifiers), set the Coordinates to “Global,” the Direction to “RGB to XYZ” and the Space to “Local.” Play with the Strength and Midlevel properties if you want more distortion in your cubes.

A cube of (distorted) cubes

Now you’ve got a box of eight distorted cubes, sort of down in the lower left-hand corner of the world axis. Let’s replicate them with three further Array modifiers: After your Displace modifier, add an Array, and set the Count to 3. Since the Displace modifier is messing with the overall dimensions of the cubes, deselect Relative Offset and select Constant Offset. Set Distance X to 2. Now, make two more Array modifiers, for Y and Z, also with Constant Offset Distance 2 (in Y and Z respectively). You should now have a repeating set of distorted cubes in all 3 dimensions.
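The reason this tiles for free is that the Array modifiers copy the already-jittered points verbatim, so the jitter pattern repeats exactly at every tile boundary. A quick pure-Python sketch of the same stacking (function names are mine, not Blender’s):

```python
# Build one jittered 2x2x2 block of cube corners, then tile copies of
# it along x, y and z -- the code equivalent of the Array modifiers.
import random

def jittered_block(seed=0, jitter=0.2):
    """8 cube corners (a 2x2x2 lattice, spacing 1), each nudged randomly."""
    rng = random.Random(seed)
    return [
        (x + rng.uniform(-jitter, jitter),
         y + rng.uniform(-jitter, jitter),
         z + rng.uniform(-jitter, jitter))
        for x in (0, 1) for y in (0, 1) for z in (0, 1)
    ]

def tile_block(points, count=3, spacing=2.0):
    """Replicate the block count times per axis (the Array modifier step)."""
    return [
        (x + i * spacing, y + j * spacing, z + k * spacing)
        for i in range(count) for j in range(count) for k in range(count)
        for (x, y, z) in points
    ]

tiled = tile_block(jittered_block())
# Every copy carries the same jitter, so shifting any point by one tile
# (spacing 2) lands on another point in the set: seamless repetition.
```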

Hide your cubes (including from renders–the camera icon in the outliner). Create a 1×1 plane at the origin. Delete any lights in your scene. Set the camera’s output to a texture-friendly square, like 512×512 pixels (printer icon, Resolution). Set the camera to Orthographic (camera icon, Lens: Type) and aim it straight on to your plane. Create a new Material on the plane (material ball icon, New).

Switch to the Shading tab with your new Material, delete the “Principled BSDF” node, and instead add a “Point Density” node. Select “Object Vertices” rather than “Particle System.” Under Object, select your mesh of repeating distorted cubes. Set the Space to “World Space,” the Radius to 0.5, and the Interpolation to “Cubic.” Drag the node’s Density output straight to the Material Output node’s Surface input.

Shader nodes

That’s it, in a nutshell. Move your plane between -0.5 and 0.5 Z, and the noise pattern will repeat. It’ll also wrap around at the X and Y edges.

You can create finer-grained noise by doubling your cubes and halving their scale. Or double-doubling and half-halving. Or double-double-doubling… You get the idea. For each iteration, halve the scale, double the Count of your first 3 modifiers, then double the distance between them with your last 3 modifiers. I also recommend halving the Size attribute of your Clouds texture each time.

Note that at larger numbers of cubes (say 16 or 32 per box) the “Point Density” node’s Resolution attribute will need to be increased. Add 100 at a time, until there’s no visible difference adding 100 more. Accept the hit in performance. (If you see seams in the final rendered texture, too low a Resolution setting here is usually the culprit.)

Download the Bforartists/Blender file here: Repeated Tiled Noise v2.blend

In the file, the Demo collection will demonstrate the simple version, while the Production collection is my final water ripples setup. In the simple version, there are also some unused shader nodes demonstrating a setup for combining noise at different scales to create more complex output–again, just like Perlin Noise.

The Production collection, my own version for water ripples, does a few additional things. I’m creating the 3d texture in Unity, which requires each frame (vertical slice of the 3d noise) to be stacked side-by-side in a single image file. As such, I’ve added an Array modifier to the plane I’m rendering, so that it becomes 16 side-by-side squares stepping up between -0.5 and (almost) 0.5. (Almost 0.5, because step 17 would be 0.5–and we don’t want to repeat a frame. The app will do basic math when setting fields numerically, so entering “1/16” will give you 0.0625…) The orthographic camera is adjusted to render it all in one frame. Within the shader graph, I’ve made the position Vector fed into the “Point Density” node a combination of the world Z coordinate and the planes’ UV coordinates (standing in for X and Y). Since I want a lot less change as the noise loops in the Z axis than in the horizontal directions, I’ve not halved the number and scale of the cubes along the Z axis, and I’ve split the Displacement modifiers’ textures into three different Greyscale textures accordingly. There’s an RGB Curves node added after the Point Density node to make the interpolation more wave-like. Finally, the greyscale heights have been translated into a Normal Map via a Bump node.
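The frame-stacking arithmetic above is worth spelling out. A tiny sketch of the 16 slice heights (the helper name is mine):

```python
# 16 slices stepping from -0.5 up to (almost) 0.5 in increments of
# 1/16 = 0.0625. A 17th step would land on 0.5 exactly, duplicating
# the -0.5 frame once the 3d texture loops.
def slice_heights(frames=16):
    step = 1.0 / frames
    return [-0.5 + i * step for i in range(frames)]

heights = slice_heights()
# heights[0] == -0.5, heights[-1] == 0.4375
```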

Smooth sailing!

Closer to a HoT Beta

In a short while, the link to the venerable Bestiary of Geekdom up top will move to the sidebar, and be replaced with the House of Time.

This is a project I’ve been tapping away at for six months or so, on and off, and in an effort to play a little less of my usual gin rummy, I’m working toward soft-launching a public beta. It will be missing a lot of features and content, but should be a good start.

The 3d engine is built in JavaScript on the Babylon.js WebGL framework. My goal is for it to run in all modern browsers–including mobile–with low-to-modest hardware requirements. There will be no loading screens, HUD or narration, no accounts or other tracking, no objectives or “gamification,” and certainly no ads. The House of Time will be free and available to all. If you’re old enough to remember the experience of Myst when it first came out, you’ll understand the quiet, contemplative, even lonely atmosphere I wish to create. Art as science as art. This is in furtherance of my personal philosophy that education should be free.

I’ve been designing a system that uses as little bandwidth as possible. Most interactive 3d is built around the expectations of PCs and consoles: That transfer is fast, storage is large, and the GPU is the bottleneck. Here that’s reversed. There will be zero texture maps. Shaders will supply most of the visual detail procedurally, generating it on the fly in your graphics card. SVGs will be rendered to bitmap in a hidden canvas element to supply more specific 2d imagery. Most of the shaders will rely on world space coordinates, so that two instanced models sitting side by side may look radically different. Instanced geometry will be used as much as possible. Complex extruded shapes will be generated in the browser from a path and cross-section. Chunks of geometry will load only when needed, and free their memory when no longer in use.

The overall scene (more than a mile long) is being built in Blender, as it plays well with Babylon.js and glTF export. Even with the UI improvements in the Bforartists fork, this has been a major pain point, and creation of complex 3d assets (dinos!) lags badly. (My preferred 3d package, Hash Animation:Master, has sadly become a paid zombie, with no meaningful updates this decade. The quest for a replacement continues…) I split the large scene into chunks manually and export them for browser loading with Babylon.js’s Blender export plugin. Tags in the names of models and lights are digested by the engine on load, to do things like assigning noise shaders, creating extruded shapes, or replacing a mesh with sprites.

This week, I’ve built a new stageManager object to move scenery on- and offstage and in and out of memory, as well as written a new pine foliage shader I’m reasonably happy with. Before going public, I still need to create and fix a few more things:

Assets

Membership/Info Card – Footstep sounds – Titanosaur – Low-res Ionic Pediment – Evergreens – Brick Walkways – Tree Ferns – Cambrian Marker – Beach – Stars

Engine

Stage manager – New evergreen shader – Sound manager

Bugs

Crash on deleting assetContainer – Falling sprites not finding ground – Sprite systems not reusing correctly – Too much fog at start – Left-hand side of gate not animating – Miocene grass too short – Cretaceous Hall light wonkiness – No Carboniferous shadows