RedFrame's lighting tends to look a bit different from that of most games. We achieve this look by generating most of our lighting externally, using techniques inspired by pre-rendered architectural visualization: we set up and bake our lighting in Maya and Mental Ray rather than leveraging Unity's built-in lightmap rendering tools.
Our current workflow is a three-step process: generate lightmap UVs, bake direct and/or indirect lighting, and import the resulting lightmap images into Unity's existing lightmap system. In Part 1 of our series on lightmapping, we'll explore the process of generating lightmap UVs.
Unity includes an automatic lightmap UV generation tool, but it is a one-way process: it would be impractical to transfer the generated UVs back into Maya. Beyond that limitation, we take a philosophically different approach in our workflow. Where Unity embraces automated simplicity, we've chosen manual control. This gives us two important advantages: our model files contain intrinsic lightmap UVs that can be used by other applications and engines, and we gain fine control over how objects are divided at the face level, which can produce higher-quality results with fewer visual artifacts.
We begin our process of building lightmap UVs by merging environmental geometry into localized groups. The objects in each group will all share the same lightmap texture. Each of these groups is about a quarter the size of a room.
To optimize the use of texture memory in Unity, we don't want to generate lightmaps for every individual piece of geometry. Separate objects can share a single lightmap provided that none of their UVs overlap. To guarantee that no UVs overlap, we temporarily merge every object within a group into a single mesh.
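The merge step can be scripted in MEL. A minimal sketch, where `roomQuarter_grp` and `roomQuarter_merged` are placeholder names rather than our actual scene names:

```mel
// Merge every mesh under the group into one object so all pieces
// share a single UV space. Names below are placeholders.
string $meshes[] = `listRelatives -children -fullPath "roomQuarter_grp"`;
string $result[] = `polyUnite -ch 0 -name "roomQuarter_merged" $meshes`;
// $result[0] now holds the merged transform; any leftover empty group
// nodes can be deleted once the merge is verified.
```

Disabling construction history (`-ch 0`) keeps the merged mesh clean before the UV work that follows.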
Manual UV Layout
Once we have a single mesh for a group of objects, we must first create a new UV set for the mesh. We want individual control over the UV layout for both the color and light maps, so the second UV set will be used for lightmapping.
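Creating the second UV set can also be done via script. A sketch, again using the placeholder mesh name from above:

```mel
// Copy the current (color) UV set into a new set for the lightmap,
// then make it current so later UV edits affect only that channel.
polyUVSet -copy -newUVSet "lightmapUVs" "roomQuarter_merged";
polyUVSet -currentUVSet -uvSet "lightmapUVs" "roomQuarter_merged";
```

Starting from a copy of the color UVs is optional; the automatic projection in the next step will replace them anyway.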
In the newly created UV channel, run Maya's automatic UV generation by selecting Create UVs -> Automatic Mapping from the polygon menus. The generated UV map is usually fairly efficient, but it can be compacted further by cutting UV edges that form right angles and then running a Layout operation.
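The scripted equivalent of the Automatic Mapping menu item, applied to the current (lightmap) UV set; the mesh name is again a placeholder, and the spacing value is illustrative:

```mel
// Project the mesh from six axis-aligned planes into the current UV set,
// then pack the resulting shells into the 0-1 square.
polyAutoProjection -planes 6 -insertBeforeDeformers 1 -createNewMap 0
    "roomQuarter_merged";
polyLayoutUV -layout 2 -scale 1 -percentageSpace 0.3 "roomQuarter_merged";
```

Passing `-createNewMap 0` keeps the projection in the current UV set instead of spawning yet another one.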
In this image, the areas circled are spots where it would be a good idea to cut UV edges:
Manually separating UV shells for smooth objects with hard corners such as crown molding, or softer organic shapes such as upholstered furniture, can also minimize artifacts in lightmaps. Artifacts can be further reduced by tweaking the position of individual vertices as needed.
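Those manual cuts and re-layouts can likewise be scripted once the problem edges are known. A sketch with an illustrative edge range (in practice we pick the edges in the UV Editor):

```mel
// Cut the lightmap UVs along a run of hard edges (range is illustrative),
// then repack the shells so the freed space is reclaimed.
polyMapCut "roomQuarter_merged.e[120:135]";
polyLayoutUV -layout 2 -scale 1 -percentageSpace 0.3 "roomQuarter_merged";
```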
Once the group mesh has had its UVs efficiently laid out, we break it into smaller pieces so that they can be culled by Unity via frustum culling and occlusion culling. The most sensible way we've found to re-divide each group mesh is by material. We wrote a MEL script to do this automatically, and we've made it available here.
This script separates a mesh into pieces based on its materials, and places each piece into a group node containing all meshes that share the same lightmap UV space. The script is a little ad hoc and will break if one of the contained materials is Maya's default material. Any suggestions on how we might improve the script are welcome.
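A simplified, hypothetical sketch of the idea behind the script (the linked version handles grouping, naming, and edge cases that this one does not): find each shading group on the mesh, detach its faces, then separate the shells.

```mel
// Hypothetical simplified version; the linked script is more robust.
string $mesh = "roomQuarter_merged";
string $sgs[] = stringArrayRemoveDuplicates(
    `listConnections -type "shadingEngine" ($mesh + ".instObjGroups")`);
for ($sg in $sgs)
{
    hyperShade -objects $sg;                       // select this material's faces
    string $faces[] = `filterExpand -sm 34 -ex 1`; // keep only face components
    if (size($faces) > 0)
        polyChipOff -ch 0 -dup 0 $faces;           // detach them from the mesh
}
polySeparate -ch 0 $mesh;                          // one object per detached piece
```

Note that `hyperShade -objects` selects scene-wide, so a production version needs to filter the selection down to the target mesh first.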
Unity uses two UV channels per mesh: the first for displaying color maps and the second for lightmaps. When importing a model, Unity automatically reads the mesh's second UV channel. Be certain to disable "Generate Lightmap UVs" in the model's asset import settings, otherwise Unity will overwrite the UVs that were manually laid out in Maya.
Check out the next part in our series: Lightmap Workflow, Part 2: Architectural Lighting