I mentioned the environment lighting. I thought about implementing a cubemap system (like seen here: http://www.codinglabs.net/article_physically_based_rendering_cook_torrance.aspx) but I don’t think it is worth the trouble for my small project.
Since I like writing, I’ll write some stuff down for people who are not familiar with the systems, but are curious.
What is the issue: The world looks fairly bland for all the things not facing the sun.
Usually it’s black, but the common approach is to apply an ambient term to everything:
outputColor = DiffuseLight * inputColor + Ambient * inputColor + Specular;
That means we still see the texture of the object when it’s in the shadows, but that’s it: no variation at all. If our object has no texture, it’s simply one color for the whole thing. Not great.
With the introduction of PBR to all major game engines, engineers started to ditch the ambient term altogether, because it’s not “physically plausible” to have an ambient term.
So.. what happens to the things in the shadow? Well, with some advanced global illumination system we can simulate the light bouncing around until it eventually hits the object. Or not. Then it’s simply black, like it should be.
However, most engines either precompute that GI or skip it entirely, so instead they simulate the lighting conditions by creating cubemaps around the world, which are then applied to the objects to simulate them being affected by environment light. This is called “Image Based Lighting” (IBL) and is used in all major game engines.
A cubemap is basically the world “panorama” displayed on a texture. This “panorama” can be created by rotating the camera in 90° steps and taking a screenshot of the world in each of the four horizontal directions. But then we also need the +Z and –Z directions, and that’s how the “cube” of six faces is created.
Well and then we need to create a system where we check the normals of our objects and read in the cubemap where they are pointing, and voila, that’s the color we apply to them.
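In HLSL that lookup really is just a sample by the normal. A minimal sketch — the `EnvMap`/`EnvSampler` names and register slots are mine, not from any particular engine:

```hlsl
TextureCube EnvMap : register(t0);
SamplerState EnvSampler : register(s0);

// Read the environment in the direction the surface is facing.
float3 SampleEnvironment(float3 normal)
{
    return EnvMap.Sample(EnvSampler, normal).rgb;
}
```

The hardware handles picking the right cube face from the direction vector, which is why cubemaps are so convenient for this.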
However, in my case this cubemap would only really need +Z and –Z since the game takes place in a desert. So the objects are illuminated from the bottom (where the sunlight reflects off of the orange ground) and the top (where skylight comes in).
So the easiest solution was just to set up two more NdotLs (basic diffuse lighting computation) in my Pixel Shader.
float NdotSky = saturate(dot(input.Normal, float3(0.1, 0, 1)));
float4 indirectDiffuse = NdotSky * SkyColorIntensity * SkyColor;
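The ground term is the same thing with the direction flipped — a sketch, assuming `GroundColor` and `GroundColorIntensity` constants analogous to the sky ones:

```hlsl
// Light bouncing up from the orange sand: same NdotL, direction flipped.
float NdotGround = saturate(dot(input.Normal, float3(0.1, 0, -1)));
float4 indirectDiffuse2 = NdotGround * GroundColorIntensity * GroundColor;
```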
and apply that to the final color
float4 finalValue = shadowAmount * gamma_color * diffuse
+ gamma_color * indirectDiffuse
+ gamma_color * indirectDiffuse2
+ shadowAmount * specular;
easy. And very cheap.
However, there is still a small problem:
If the normal is more or less perpendicular to both light directions, it’s simply black. (Which makes sense.)
So let’s make one of the lights “overreach”.
float indirectGround = saturate(dot(float3(0.1, 0, -1), input.Normal) + 0.5);
I expand the ground light to cover more normals.
For people not familiar with the math: a dot product of two normalized vectors basically returns how much they face in the same direction. If the dot product gives us 1 they face the same direction, if it gives us 0 they are perpendicular, if it gives us –1 they are opposite. So I just increase the output of the dot product by 0.5. Since the dot product is the cosine of the angle between the vectors, the light now cuts off where the cosine equals –0.5, which is at 120° instead of 90° — about 30° more light coverage.
The saturate intrinsic in HLSL just clamps my values between 0 and 1.
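Putting the pieces together, the whole indirect-light addition to the pixel shader is just a handful of lines — a sketch, with the color and intensity constants assumed to come in via a constant buffer:

```hlsl
// Sky light from above, with a slight tilt.
float NdotSky = saturate(dot(input.Normal, float3(0.1, 0, 1)));
float4 indirectDiffuse = NdotSky * SkyColorIntensity * SkyColor;

// Ground bounce from below, "overreached" by +0.5 so it wraps
// around the sides of the object instead of cutting off at 90°.
float indirectGround = saturate(dot(input.Normal, float3(0.1, 0, -1)) + 0.5);
float4 indirectDiffuse2 = indirectGround * GroundColorIntensity * GroundColor;

float4 finalValue = shadowAmount * gamma_color * diffuse
                  + gamma_color * indirectDiffuse
                  + gamma_color * indirectDiffuse2
                  + shadowAmount * specular;
```

Two extra dot products and a few multiplies per pixel — that’s the whole cost.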
Great :) Good enough for my project (and maybe yours too)