As I predicted in my previous post, this week was spent texturing the planetary terrain. Little did I expect to delve as deeply into Unity's graphics pipeline as I did, though. It was a classic example of "one thing leads to another"...
The week started with me simply writing code to generate cylindrical mapping UVs alongside the vertices in order to stretch a basic texture over the terrain and see how it looks. It wasn't bad, per se, but it wasn't quite good either.
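For reference, the cylindrical mapping itself is simple. Here's a rough sketch of the kind of per-vertex UV calculation I mean -- the function and its height bounds are illustrative, not my actual code:

```
// Illustrative cylindrical UV calculation; 'localPos' is the vertex
// position relative to the cylinder's axis, with Y being "up".
float2 CylindricalUV (float3 localPos, float minHeight, float maxHeight)
{
	// Angle around the Y axis, remapped from [-PI, PI] to [0, 1]
	float u = atan2(localPos.z, localPos.x) / (2.0 * UNITY_PI) + 0.5;
	// Height along the axis, remapped to [0, 1]
	float v = saturate((localPos.y - minHeight) / (maxHeight - minHeight));
	return float2(u, v);
}
```

Note that the atan2 wrap-around creates a visible seam unless the vertices along it are duplicated.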
The shadows were far too dark, which makes perfect sense really: there was no ambient lighting aside from the faint contribution of the Milky Way skybox. It was most noticeable during sunset/sunrise. Take the following image for example:
As you can see, the entire terrain is just a black blob. The sky is still fairly well lit, meaning there should be plenty of indirect illumination coming from it to light the terrain. So the question was: how to address it? A simple solution would be to change the ambient lighting from the space skybox to a color instead, but this approach wouldn't work well because it would give everything a flat, uniform tint.
Another approach within Unity is to specify a gradient. This would work well, wouldn't it? Sure would, if the terrain were flat. Perhaps I'm wrong, but I saw no matrix in there that could be set to transform the direction of the gradient. In other words, since there is no way to choose an "up" vector, this approach wouldn't work either.
The same problem ruled out Unity's "default skybox" as well: there is simply no way to change the "up" vector.
This leaves just the static cubemap skybox-based approach I was already using. If only there was a way to update the contents of this skybox in real-time based on what's around the player, right? Well, it turns out there is. The legacy approach, which I tried first, was to render into a cubemap and then use that cubemap in the shader somewhere. It was promising, but unfortunately cubemaps rendered this way don't have mip-maps, meaning LOD downsampling wasn't working. In case you're wondering why that's necessary... think of indirect lighting as a very blurry 360-degree photo. If you don't blur it, you will see reflections of the world instead of soft ambient lighting. In fact, it's that last part -- reflections -- that triggered a thought in my head: what about reflection probes?
Unity now has a feature called reflection probes that you can sprinkle liberally throughout your scene in order to get much more realistic-looking reflections on objects. It works by rendering cubemaps at multiple points in the scene, then simply blending between them to apply more realistic reflections to objects moving through it. There is something called a "light probe group" as well, but it didn't seem to actually do anything when I tried to use it, and in any case, based on what I've read, light probes seem geared more toward static scenes.
Reflection probes, though -- they are most certainly not limited to being static. In fact, simply adding a probe to a blank scene immediately changes how objects using the Standard shader look (assuming the material is smooth). Better still, it was possible to sample the reflection probe's cubemap at reduced LOD levels, thus getting that blurry ambient lighting cubemap I was looking for! So that's when I thought to myself: why not have a real-time reflection probe follow the main camera around, rendering only the really big objects into it, such as the planet (using a simple sphere), its atmosphere, and the skybox itself? The result was immediately promising:
Unfortunately, as I quickly learned, there were downsides that ultimately forced me to write a custom BRDF. In technical terms, BRDF stands for "Bidirectional Reflectance Distribution Function". In more common terms, it's just a function that accepts various parameters related to the material and light and spits out the final color the material should have. Unity 5 started using them, and there are 3 different versions available. I tried all 3, but none had the look I was after. The default BRDF was the most promising for well-lit scenes, but its handling of specular lighting was just flat-out weird. Take this picture for example:
The tops of those hills are all a uniform greyish color. In fact, as the sun's angle gets more and more shallow and the specular grazing angle increases, the material becomes more and more washed out, and there is no way to turn this off. Take the following 100% black sphere as another example. Albedo and specular are both pure black, and yet when you look at it from a shallow angle to the light, it starts becoming brighter and brighter:
Where is that color coming from? The light -- its specular contribution. But why isn't there any way to get rid of it? Not every material has such strong specular reflections. What about materials like Vantablack, for when it's time to create a stealth coating? Or, in the more common case, eliminating specular highlights from terrain while leaving them intact on the water, like this:
Worse still, the standard BRDF's ambient lighting comes from what seems to be a static source. That is, if the light moves, creating different lighting conditions at run-time, materials lit by the Standard shader don't seem to pick up the change. Take the following sphere placed in a blank scene with the default skybox and a reflection probe set to render only that skybox, for example. I simply hit Play, then changed the angle of the light to that of a sunset:
The sphere is still lit like it was day time! That's obviously completely unacceptable. Playing around with the settings reveals that the reflections come from the reflection probe and look proper, but the global illumination term seems to come from a static texture of some kind, and as such doesn't change at run-time.
But hey, the best part about the new rendering pipeline is that it's fully exposed for modification, so I got started on my own custom lighting path. The first thing I did was replace the "ShadeSHPerPixel" call in my copy of the UnityGI_Base function, found in UnityGlobalIllumination.cginc, with a call that samples the reflection probe's cubemap. Since I only have one reflection probe and don't plan on having more, I didn't need to do any blending and only sampled the first one:
// BUG: This flat out doesn't work when a skybox-based ambient lighting is used with the skybox updated at run-time
//gi.indirect.diffuse = ShadeSHPerPixel (normalWorld, data.ambient);
// This gives similar results and also happens to work properly
float4 diffuse = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, 0.8 * UNITY_SPECCUBE_LOD_STEPS);
gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;
This gave nice enough results, but just to make it extra blurry and eliminate the visible corners in the cubemap, I wrote code to do instant blurring:
// Fast, imprecise version:
//float4 diffuse = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, 0.8 * UNITY_SPECCUBE_LOD_STEPS);
//gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;
// Smoother but slower version:
float3 right = normalize(cross(float3(0.0, 1.0, 0.0), normalWorld));
float3 up = normalize(cross(normalWorld, right));
const float sampleFactor = 0.9 * UNITY_SPECCUBE_LOD_STEPS;
const float jitterFactor = 0.3;
float4 diffuse = (UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, normalWorld, sampleFactor) +
UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, up, jitterFactor), sampleFactor) +
UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, -up, jitterFactor), sampleFactor) +
UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, right, jitterFactor), sampleFactor) +
UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, lerp(normalWorld, -right, jitterFactor), sampleFactor)) * 0.2;
gi.indirect.diffuse = DecodeHDR(diffuse, unity_SpecCube0_HDR).rgb;
With the global illumination taken care of, it was time to tackle the specular issues. I took care of that by creating a custom BRDF and extending the output struct's specular color to be a half4 instead of a half3 -- that is, I simply wanted to take advantage of its unused alpha channel. In the last part of the BRDF function I put it to good use:
specularTerm = max(0, specularTerm * specColor.a * specColor.a * nl);
half diffuseTerm = disneyDiffuse * nl;
half grazingTerm = saturate(oneMinusRoughness * specColor.a + (1.0 - oneMinusReflectivity));
return half4(diffColor * (gi.diffuse + light.color * diffuseTerm) +
light.color * FresnelTerm (specColor.rgb, lh) * specularTerm +
surfaceReduction * gi.specular * FresnelLerp (specColor.rgb, grazingTerm, nv) * specColor.a, 1.0);
In simple terms, all this does is attenuate the specular's strength by the specular color's alpha, making it possible to get rid of it completely. It's based purely on observation and tweaking to get the results I want. Now, with a simple custom specular shader that uses the texture's alpha channel to control what should be affected by specular highlights and what shouldn't, I was able to get the "custom" results in the screenshots above and below.
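For the curious, the shader side of that masking is trivial. Here's a minimal sketch, assuming a surface output struct whose Specular field was extended to half4 as described above -- the names are illustrative, not my exact code:

```
// Hypothetical surf function: the albedo texture's alpha channel feeds
// the custom BRDF's specColor.a, attenuating the specular contribution.
void surf (Input IN, inout SurfaceOutput o)
{
	fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
	o.Albedo = c.rgb;
	// Alpha of 0 (terrain) kills the highlight, 1 (water) leaves it intact
	o.Specular = half4(_SpecColor.rgb, c.a);
}
```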
The last thing I did was write a custom version of the fog code that samples the same exact diffuse term, so that the fog is colored by the ambient lighting as well. This created a colored fog that blends much better with the environment around it.
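In case it helps, the idea can be sketched like so -- viewDir and unityFogFactor here stand in for whatever the surrounding fog code actually provides:

```
// Hypothetical ambient-tinted fog: sample the reflection probe's cubemap
// at a high LOD along the view direction to approximate the ambient color
float4 ambient = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, viewDir, 0.9 * UNITY_SPECCUBE_LOD_STEPS);
float3 fogColor = DecodeHDR(ambient, unity_SpecCube0_HDR).rgb;
// Blend toward the ambient-tinted fog color using the usual fog factor
color.rgb = lerp(fogColor, color.rgb, saturate(unityFogFactor));
```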
I'm attaching the BRDF + specular shader for those that need it.