Lightmap Workflow, Part 2: Architectural Lighting

A simple OS X app that will launch an unlimited number of instances of Maya 2012.
We do all of our light baking for RedFrame on a beefy Mac Pro, but due to limitations in Maya and Mental Ray we have to run multiple instances in order to saturate the available processor cores.

On Windows it’s very simple to run multiple instances of a single application – this is the default behavior – but we work on OS X which only allows one instance of an app to be running at any given time. We’ve commonly used a messy workaround: duplicating the application on disk and keeping references to its many copies in the Dock.

Today I discovered a much better solution. It’s possible to open an unlimited number of instances of an app through the terminal. To instantiate Maya 2012*, I just execute the following command:

open -n /Applications/Autodesk/maya2012/Maya.app

Using Platypus, I bundled this command into an application that sits in the Dock, ready to spawn additional Maya instances on demand.

You can download my Maya 2012 Instantiator here.

Maya Instances

* Why not Maya 2013? I’m glad you asked: 2013 crashes every single time we’ve tried to bake lightmaps, preventing us from upgrading.
-Mike

Global state and behavior can be a bit tricky to handle in Unity. RedFrame includes a few low-level systems that must always be accessible, so a robust solution is required. While there is no single solution to the problem, there is one particular approach that I’ve found most elegant.

There are many reasons one might need global state: controlling menu logic, building additional engine code on top of Unity, executing coroutines that control simulations across level loads, and so on. By design, all code executed in Unity at runtime must be attached to GameObjects as script components, and GameObjects must exist in the hierarchy of a scene. There is no concept of low-level application code outside of the core Unity engine – there are only objects and their individual behaviors.

The most common approach to implementing global managers in Unity is to create a prefab that has all manager scripts attached to it. You may have a music manager, an input manager, and dozens of other manager-like scripts stapled onto a single monolithic “GameManager” object. This prefab object would be included in the scene hierarchy in one of two ways:

  • Include the prefab in all scene files.
  • Include the prefab in the first scene, and call its DontDestroyOnLoad method during Awake, forcing it to survive future level loads.

Other scripts would then find references to these manager scripts during Start through one of a variety of built-in Unity methods, most notably FindWithTag and FindObjectOfType. You’d either find the game manager object in the scene and then drill down into its components to find individual manager scripts, or you’d scrape the entire scene to find manager scripts directly.

A slightly more automated and potentially more performant option is to use singletons.

Singleton Pattern

The singleton design pattern facilitates global access to an object while ensuring that only one instance of the object ever exists at any one time. If an instance of the singleton doesn’t exist when it is referenced, it will be instantiated on demand. For most C# applications, this is fairly straightforward to implement. In the following code, the static Instance property may be used to access the global instance of the Singleton class:

C# Singleton

public class Singleton
{
    static Singleton instance;

    public static Singleton Instance {
        get {
            if (instance == null) {
                instance = new Singleton ();
            }
            return instance;
        }
    }
}

Unity unfortunately adds some complication to this approach. All executable code must be attached to GameObjects, so not only must an instance of a singleton object always exist, but it must also exist someplace in the scene. The following Unity singleton implementation will ensure that the script is instantiated somewhere in the scene:

Unity Singleton

public class UnitySingleton : MonoBehaviour
{
    static UnitySingleton instance;

    public static UnitySingleton Instance {
        get {
            if (instance == null) {
                instance = FindObjectOfType (typeof(UnitySingleton)) as UnitySingleton;
                if (instance == null) {
                    GameObject obj = new GameObject ();
                    obj.hideFlags = HideFlags.HideAndDontSave;
                    instance = obj.AddComponent<UnitySingleton> ();
                }
            }
            return instance;
        }
    }
}

The above implementation first searches for an instance of the UnitySingleton component in the scene if a reference doesn’t already exist. If it doesn’t find a UnitySingleton component, a hidden GameObject is created and a UnitySingleton component is attached to it. In the event that the UnitySingleton component or its parent GameObject is destroyed, the next call to UnitySingleton.Instance will instantiate a new GameObject and UnitySingleton component.

For games that include many manager scripts, it can be a pain to copy and paste this boilerplate code into each new class. By leveraging C#’s support for generic classes, we can create a generic base class for all GameObject-based singletons to inherit from:

Generic Unity Singleton

public class UnitySingleton<T> : MonoBehaviour
    where T : Component
{
    private static T _instance;

    public static T Instance {
        get {
            if (_instance == null) {
                _instance = FindObjectOfType (typeof(T)) as T;
                if (_instance == null) {
                    GameObject obj = new GameObject ();
                    obj.hideFlags = HideFlags.HideAndDontSave;
                    _instance = obj.AddComponent<T> ();
                }
            }
            return _instance;
        }
    }
}

A base class is generally unable to know about any of its sub-classes. This is very problematic when inheriting from a singleton base class – for the sake of example, let’s call one such sub-class Manager. The value of Manager.Instance would be a UnitySingleton object instead of its own sub-type, effectively hiding all of Manager’s public members. By converting UnitySingleton to a generic class as seen above, we are able to change an inheriting class’s Instance from the base type to the inheriting type.

When we declare our Manager class, we must pass its own type to UnitySingleton<T> as a generic parameter: public class Manager : UnitySingleton<Manager>. That’s it! Simply by inheriting from this special singleton class, we’ve turned Manager into a singleton.

There is one remaining issue: persistence. As soon as a new scene is loaded, all singleton objects are destroyed. If these objects are responsible for maintaining state, that state will be lost. While a non-persistent Unity singleton works just fine in many cases, we need to have one additional singleton class in our toolbox:

Persistent Generic Unity Singleton

public class UnitySingletonPersistent<T> : MonoBehaviour
    where T : Component
{
    private static T _instance;

    public static T Instance {
        get {
            if (_instance == null) {
                _instance = FindObjectOfType (typeof(T)) as T;
                if (_instance == null) {
                    GameObject obj = new GameObject ();
                    obj.hideFlags = HideFlags.HideAndDontSave;
                    _instance = obj.AddComponent<T> ();
                }
            }
            return _instance;
        }
    }

    public virtual void Awake ()
    {
        if (_instance == null) {
            _instance = this as T;
            DontDestroyOnLoad (this.gameObject);
        } else if (_instance != this) {
            Destroy (gameObject);
        }
    }
}

The preceding code will create an object that persists between levels. Duplicate copies may be instantiated if the singleton has been embedded in multiple scenes, so this code also destroys any additional copies it finds.

Caveats

There are a few important issues to be aware of with this approach to creating singletons in Unity:

Leaking Singleton Objects

If a MonoBehaviour references a singleton during its OnDestroy or OnDisable while running in the editor, the singleton object that was instantiated at runtime will leak into the scene when playback is stopped. OnDestroy and OnDisable are called by Unity when cleaning up the scene in an attempt to return it to its pre-playmode state. If a singleton object is destroyed before another script references it through its Instance property, the singleton object will be re-instantiated after Unity expected it to have been permanently destroyed. Unity will warn you of this in very clear language, so keep an eye out for it. One possible solution is to set a boolean flag during OnApplicationQuit that is used to conditionally bypass all singleton references in OnDestroy and OnDisable.

Execution Order

The order in which objects have their Awake and Start methods called is not predictable by default. Persistent singletons are especially susceptible to execution ordering issues. If multiple copies of a singleton exist in the scene, one may destroy the other copies after those copies have had their Awake methods called. If game state is changed during Awake, this may cause unexpected behavior. As a general rule, Awake should only ever be used to set up the internal state of an object. Any external object communication should occur during Start. Persistent singletons require one to be especially strict with this convention.

Conclusion

While singletons are inherently awkward to implement in Unity, they’re often a necessary component of a complex game. Some games may require many dozens of manager scripts, so it makes sense to reduce the amount of duplicated code and standardize on a method for setting up and tearing down singleton managers. A generic singleton base class is one such solution. It has served us well, and is a design pattern that we will continue to iterate on, hopefully finding methods for more cleanly integrating with Unity.
Screenshot

The above is an in-game screenshot demonstrating externally baked linear-space lightmaps rendered in Unity.

Previous article: Lightmap Workflow, Part 1: UV Generation

Background

RedFrame is meant to be a highly atmospheric and immersive experience. To create this atmosphere, it was important for us to focus on lighting. Many games have compelling lighting; however, they tend to feature outdoor environments lit by a single directional source representing the sun. In comparison, we are creating a nighttime environment illuminated by lamps, sconces, and recessed lighting.

I struggled for many months to achieve a look that I really liked. Hopefully what I have learned can be helpful to anyone trying to create something similar. It is important to note that we are using a lot of precomputed lighting with Mental Ray, which is not a viable option for games that have highly dynamic environments (which, unfortunately, is most kinds of games).

This is a workflow for Maya and Mental Ray, but the concepts are universal. There are five concepts that I will cover:

  1. Correct falloff / Gamma Correction
  2. Physically accurate soft shadows
  3. Distribution / Photometry
  4. Color temperature
  5. Indirect light

Once you understand these concepts, you will have a software-agnostic checklist to use when creating your own interior lighting setup.

Correct Falloff / Gamma Correction

For many years working in 3d, I was told that real-world light has a quadratic falloff. This is known as the inverse-square law: any energy that radiates out from a point source in 3d space has an intensity proportional to 1/(distance traveled)². However, many lighting tutorials I had read glossed over this fact and instead suggested using a linear falloff, which looked better. This always seemed strange to me, since quadratic falloff is physically correct. Naturally I tried quadratic falloff which, to my dismay, resulted in far too much contrast compared to the linear method.
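The difference between the two models is easy to check numerically. A quick Python sketch (the function names are ours, purely for illustration):

```python
# Intensity reaching a surface at a given distance, for a source of
# strength `intensity` measured at distance 1. Quadratic follows the
# inverse-square law; linear is the "looks better" cheat discussed above.
def quadratic_falloff(intensity, distance):
    return intensity / distance ** 2

def linear_falloff(intensity, distance):
    return intensity / distance

# Doubling the distance quarters the quadratic result but only halves
# the linear one, which is why quadratic reads as far higher contrast.
for d in (1.0, 2.0, 4.0):
    print(d, quadratic_falloff(1.0, d), linear_falloff(1.0, d))
```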

Figure1

As you can see, quadratic light falls off much too quickly, and the light has extreme contrast. The linear option looked better, but it always seemed unnatural to me.

It also bugged me because I knew that light obeys the inverse-square law in nature, yet here it was somehow wrong. I have learned that it is generally not a good idea to ignore things that bother you: if you are doing everything correctly, you generally won’t have the uneasy feeling that I experienced when setting my lights to linear falloff.

It turns out that the reason I was having this problem is that 3d software renders images in a mathematically correct way, but the result is displayed on a computer monitor and therefore viewed in gamma space. If you are unfamiliar with the concept of linear space vs. gamma space (as I was), I suggest watching this video. Because of the way a monitor displays images, an image needs to be adjusted before being output to the screen. This adjustment is often done automatically, as when a gamma correction value is embedded into a digital photograph. However, out-of-the-box 3d rendering does not apply this correction by default.

To set the rendered image’s gamma correction in Mental Ray, you must provide the inverse of the standard output gamma of 2.2. This is calculated as 1/2.2, which is about 0.455. In your Mental Ray render settings, enter 0.455 into the Gamma parameter under the Frame Buffer section.
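In code terms, that frame-buffer gamma is just a power curve applied per channel. A minimal Python sketch of the idea (the helper name is ours, not anything from Maya or Mental Ray):

```python
# Convert a linear-space value (0..1) to gamma space for display.
# Mental Ray's Frame Buffer gamma of 1/2.2 ~= 0.455 applies this same curve.
def linear_to_gamma(value, gamma=2.2):
    return value ** (1.0 / gamma)

print(round(1.0 / 2.2, 3))             # the 0.455 entered in the render settings
print(round(linear_to_gamma(0.5), 2))  # linear mid-grey displays as ~0.73
```

Note how mid-tones brighten under the curve: this is exactly why an uncorrected quadratic render looks too dark and too contrasty.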

Figure2

As you can see, I am now able to use quadratic light falloff with less intensity and the light dissipates in a manner that is much more natural looking. Keep in mind that we are gamma correcting a final rendered image in Maya, but Unity is optionally able to render in linear space. In the future when we bake light maps, we will be able to render them in linear space and have Unity perform the gamma correction at runtime.

Physically Accurate Soft Shadows

Now that we have correct light falloff, let’s look at shadows. In the previous example the shadows were hard, meaning that any point on a surface is either completely inside or completely outside of a shadow. In the real world, light sources usually have a size, or diameter, which makes their shadows soft. If a light source has a diameter, the rays emanating from it will originate from different locations across that diameter. This causes some of the rays to be occluded by the object casting the shadow, while others are not.

figure3

As you can see in the above image, the rays originating from the outermost area of the light source are able to wrap around the object casting the shadow. As you move towards the center of the source, the rays become more and more occluded by the object.  This effect will not be as extreme with something the size of a light bulb, but it is still present. Giving any light source a diameter greater than zero will make the shadows look more realistic.
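The width of that soft edge follows from similar triangles. A rough Python sketch (the thin-occluder setup and all of the names here are our own simplification):

```python
# Penumbra width on a receiving surface, for a light of diameter
# light_diameter, an occluder at occluder_dist from the light, and a
# receiver at receiver_dist (also measured from the light, > occluder_dist).
def penumbra_width(light_diameter, occluder_dist, receiver_dist):
    return light_diameter * (receiver_dist - occluder_dist) / occluder_dist

print(penumbra_width(0.05, 1.0, 2.0))  # 5 cm bulb, wall twice as far: 5 cm soft edge
print(penumbra_width(0.0, 1.0, 2.0))   # zero-diameter light: hard shadow
```

A zero-diameter (point) light gives a zero-width penumbra, which is exactly the hard-shadow case from the earlier renders.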

To set a light diameter in Maya, change the type of light to Area Light. Also, ensure that the decay rate is still set to quadratic as discussed in the section above. You will also need to enable a few Mental Ray settings: open the Mental Ray tab for your light object, then under the Area Light section, enable “Use Light Shape.” I selected a sphere as my shape since that is a good approximation of a light bulb. You can also increase the number of samples for better-quality shadows.

Figure4

As you can see in the above image, the shadows now have a more realistic and familiar look to them.

Distribution & Photometry

So far, these examples have all used point lights which means that the light radiates outward equally in all directions. If we want to create effects that mimic the appearance of recessed lighting, we will need a more sophisticated distribution. Below you can see an example of recessed lighting that looks much more interesting than a standard point light.

figure5

This effect is known as photometric lighting. You can recreate this type of uneven light distribution using something called an IES file. These files are available all over the web and are commonly made available by different light manufacturers. IES files are frequently used by architects in pre-visualization renderings, so they are designed to accurately reproduce the appearance of commercially available bulbs and fixtures.

With a little bit of digging, you can locate some of these files to use as a profile for your light. I’m not sure about the legality of distributing them on this site, but a quick Google search should be productive. To hook up the IES file, open the “Custom Shaders” tab under the Mental Ray section of your light and click the checkerboard icon next to the “Light Shader.” From there, click the Mental Ray Lights section and select the “mia_photometric_light” node. Under the node’s settings, set the intensity mode and distribution mode to 2.  This will tell the light node to use the IES profile. Click the checkerboard icon next to the profile section, which will create a new IES profile node.

figure6

Once the profile is selected, render out an image. As you can see, the distribution looks like a recessed light that you would see in a house. There are many different IES profiles available online that can create a plethora of effects.

figure7

Light Color

Note: the Light Color section has been edited to use a blackbody node, which doesn’t need an inverse gamma correction.

The behavior of the light is starting to look pretty good; however, the color is wrong. Lights in a house tend to have more of an orange color. A quick way to emulate this is to plug in a Mental Ray color temperature node called mib_blackbody. Plug this node into the color attribute of your light.

Figure8Redo

This node will let you select a temperature, which will generate a light color. I like values around 3,800 K for indoor lighting. You can see a nice chart of color temperatures here. Once set, render the image and see if the lighting looks more like an artificial light source. If it is too orange, try increasing the temperature.
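If you want to sanity-check a temperature before rendering, you can convert Kelvin to an approximate RGB. This Python sketch uses a widely circulated curve fit (popularized by Tanner Helland); it is not part of Maya or Mental Ray, and the output is only ballpark:

```python
import math

def kelvin_to_rgb(kelvin):
    # Curve-fit approximation of blackbody color, valid roughly 1000-40000 K.
    t = kelvin / 100.0
    r = 255.0 if t <= 66 else 329.698727446 * (t - 60) ** -0.1332047592
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(3800))  # a warm white: full red, less green, still less blue
```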

Indirect Light

If you are unfamiliar with indirect light, it is worth doing a little research before proceeding with this workflow. However, the basic idea is that when light hits a surface, it bounces off and reflects onto other surfaces, some of which are in shadow. This means that the areas in our rendering that are completely black and in shadow should potentially still be receiving some light from other nearby surfaces that have received direct illumination.
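The contribution of a single bounce can be illustrated with a toy calculation (all names and numbers here are made up for illustration; a real renderer integrates this over many surfaces and directions):

```python
# Light reflected from a directly lit wall patch onto a point in shadow.
# direct_on_wall: light arriving at the wall; wall_albedo: fraction the
# wall reflects; form_factor: how much of that reflection reaches our point.
def bounced_light(direct_on_wall, wall_albedo, form_factor):
    return direct_on_wall * wall_albedo * form_factor

# Even a small bounce lifts a "black" shadowed area above zero.
print(bounced_light(2.0, 0.5, 0.1))
```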

I use an incredibly simple setup for this. In your Mental Ray render settings, under “Indirect Lighting,” you can turn on “Final Gather.” This is a method of simulating light bounce that works pretty well. Set the Accuracy to something like 500 and set the Point Interpolation up to 50.

figure11

These are quick and dirty settings but should be sufficient for you to get the idea. Once it is set up, we can render to see the indirect lighting.

 figure12

Conclusion

Getting our lighting workflow ironed out was a difficult process, but in retrospect the steps are pretty straightforward. Keep in mind that recreating this exact workflow is not nearly as important as internalizing the concepts I have covered. Many developers will be using different 3d software or, even more commonly, baking light directly inside their game engine, and those differences in tools will likely require significant changes to the workflow. Also, using an article like this as a direct template will result in a homogeneous look. It is much better to understand why we’ve used these techniques in RedFrame so that you have more knowledge to draw upon when you approach the visual design of your own project.

-Andrew

Posted in Pipeline
15 comments on “Lightmap Workflow, Part 2: Architectural Lighting”
  1. Yanki.jp says:

    On the Unity side, are you replacing the EXRs that Beast would normally output, or are you using a custom shader (or the legacy shaders) to multiply in your lightmaps?

    Also beware of trying to go too overboard on the physically accurate side of things. A lot of architecture rendering tends to look a bit sterile. Use it as a base and then use lighting to help further your story. Check out the Siggraph 96 paper for course #30. Sharon Calahan’s part on lighting is rather good.

  2. andrew says:

    We are currently using Unity’s system, however we’re dynamically loading the light maps at runtime so that we have more control of which light maps are in VRAM and also so that we are not limited to 254 indices.

    In terms of being too realistic, I think we will end up with a somewhat Hyper-Real style. There is probably something analogous to the Uncanny Valley when it comes to environments and a lot of arch viz I’ve seen has that problem.

  3. pkamat says:

    Hey Andrew,

    Are you guys using directional lightmaps in Unity? If so, how do you generate them in Maya?

    • Andrew Coggeshall says:

      We’ve done some experiments with directional light maps, however it is a bit of an ordeal. We talked to Unity about how they are encoded and were able to replicate the encoding with a Python script that uses the OpenEXR Python bindings. In Maya, you need to bake 6 maps to represent the 3 basis directions, both positive and negative, and run the script on those .exr files. This results in a ridiculous amount of rendering time.

      The way it works in Unity, the color light map is just a single light map and the scale map encodes the directional information. One thing I was thinking of trying, which may be easier, would be to generate the diffuse light maps in Mental Ray but then bake the scale maps in Unity by placing lights in the same location as the Maya lights. That kind of approximation may work.

      • pkamat says:

        Thanks for the response Andrew. I was thinking of doing the same, where you bake the diffuse in Maya and the directional in Unity.

        I did my own set of experiments with directional lightmaps in Maya by applying 3 different textures in the normal channel, but I could not combine them in any meaningful way in Unity.

        If it’s not much work, can you shed some light on how you combine the 6 lightmaps?

        thanks

  4. Andrew Coggeshall says:

    It’s a bit complicated: you need to bake the light maps as light and color using white materials that have normal maps corresponding to the 6 basis directions. Then you subtract the negative from the positive and use the result as a proportion.

    If you e-mail me (andrew@basenjigames.com) to remind me, I can send you the basis later along with the script, but you’ll need to figure out how to install the Python bindings. I ratholed on this for a long time, and I really think it makes more sense to bake scale in Unity. I’m also not convinced that the way Unity encodes the maps is ideal, since it seems to require ambient light; otherwise subtracting the negative basis creates weird artifacts. You can see this if you bake RNMs in Unity with one light source and no ambient light (unless they improved it in 4.0).

    Baking scale in Unity may seem like a hack, but I think it is the only reasonable way to do it. I would suggest setting up point lights in the same locations as the Maya lights, which will retain their transforms when imported. I’ll do some tests with this myself and hopefully do a post on it.

  5. Daniel Barkalow says:

    Another issue with quadratic falloff is that, if you don’t have a ton of indirect lighting and most of your rooms aren’t very dark, the area right next to your lights is very bright. While this is physically accurate, you then run into the issue that your output device saturates suddenly (worse if you haven’t gamma-corrected, but even if you have). Since the eye is good at picking out sharp changes in the slopes of gradients, this gives you a distracting white blob in your image. I found that you can get perceptually much better results if, instead of using, effectively, min(I, 1) to convert intensity to pixel value (before gamma correction), you use (1 – e^-2I). (The 2 being an arbitrary constant, chosen to give comparable overall brightness.) Since this gets rid of monitor saturation artifacts, you can include overly-bright areas without making it look obviously wrong.

  6. Andrew Coggeshall says:

    I also believe that tone mapping can help with this, if I understand what you mean. We’ve tried it in Unity with some nice results, but sometimes the image looks almost too subtle, less hyper-real. I’ll mess with this more in the future. Are there any Mental Ray tutorials that demonstrate what you’re talking about?

  7. Andres camilo castaño says:

    Hi guys, I have a big question…

    First: when you set up the area light, what units do you work in in Maya, centimeters or meters? I ask because for architecture in Maya you should work in meters, but with the light configured as you describe it doesn’t show up in meters, only in centimeters.

    Second: when you bake, how do you configure the bake in Maya?

    Thanks for now.

  8. Andres camilo castaño says:

    Sorry, about the meters, you’re right. Maya 2013 SP2 gives me errors… 2011 is fine with area lights. Sorry.

  9. aluarosi says:

    Excellent article. I found the section about the gamma issue extremely clarifying. I had to deal with that while implementing quadratic falloff for lights in three.js and reached the same conclusion: gamma correction has to be activated. http://dofideas.com/quadratic-lights-webgl/

  10. Ryan Gilmore says:

    Really great results, guys! I love how you’re challenging the conception of what real time can achieve. I’d love to know more details about how you get Unity to use the lightmaps you’ve generated in Maya.

  11. Hey Andrew!

    Thanks a lot for the post, it is EXTREMELY helpful!

    We are currently trying to find our lighting workflow here at BitCake. At the moment we are comparing lightmapping in Unity vs. Max using the Flatiron plugin and V-Ray, vs. opening the UVs in Max and then rendering in Unity.

    Thing is, I’m a Maya guy. We’re using Max just because of the Flatiron plugin and its usefulness for exporting a baked lightmap, but I’d much, much rather work entirely in Maya. How can I bake lightmaps in Maya and import them into Unity? In your posts you cover the beginning and the middle, but not the end of the pipeline.

    Could you write a blog post on how to bake the maps using Mental Ray (or any other renderer) and then import them into Unity (including which shader to use and all that)?

    Thanks a lot man, keep those posts coming!

  12. John Bakis says:

    Hello I am also a maya guy and would be interested in how to get the final result in a game engine

    cheers

    John

  13. Diego says:

    Thank you for posting this information. Do you know how to render a photometric false-color debug image, such as the one at the bottom of this page, to illustrate the lux?
    http://www.invertec.co.uk/index.php/news/illuminating-facts

    I couldn’t find anything on Maya and Mental Ray online.
