If you’d like to render an object with fine
surface detail but you don’t want to use an incredibly dense mesh of
triangles, there’s a technique called *bump mapping*
that fits the bill. It’s also called *normal
mapping*, since it works by varying the surface normals to
affect the lighting. You can use this technique to create more than mere
bumps; grooves or other patterns can be etched into (or raised from) a
surface. Remember, a good graphics programmer thinks like a politician and
uses lies to her advantage! Normal mapping doesn’t actually affect the
geometry at all. This is apparent when you look along the silhouette of a
normal-mapped object; it appears flat. See Figure 8-2.

You can achieve this effect with either OpenGL ES 2.0 or OpenGL ES 1.1, although bump mapping under 1.1 is much more limited.

Either approach requires the use of a
*normal map*, which is a texture that contains normal
vectors (XYZ components) rather than colors (RGB components). Since color
components are, by definition, non-negative, a conversion needs to occur
to represent a vector as a color:

vec3 TransformedVector = (OriginalVector + vec3(1, 1, 1)) / 2

The previous transformation simply changes the range of each component from [–1, +1] to [0, +1].

Representing vectors as colors can sometimes cause problems because of relatively poor precision in the texture format. On some platforms, you can work around this with a high-precision texture format. At the time of this writing, the iPhone does not support high-precision formats, but I find that standard 8-bit precision is good enough in most scenarios.

Another way to achieve bump mapping with shaders is to cast aside the normal map and opt for a procedural approach. This means doing some fancy math in your shader. While procedural bump mapping is fine for simple patterns, it precludes artist-generated content.

There are a number of ways to generate a normal
map. Often an artist will create a *height map*,
which is a grayscale image where intensity represents surface
displacement. The height map is then fed into a tool that builds a terrain
from which the surface normals can be extracted (conceptually
speaking).

`PVRTexTool` (see The PowerVR SDK and Low-Precision Textures) is such a tool. If you invoke it from a terminal window, simply add `-b` to the command line, and it generates a normal map. Other popular tools include Ryan Clark's CrazyBump application and NVIDIA's Melody, but neither of these is supported on Mac OS X at the time of this writing. For professional artists, Pixologic's ZBrush is probably the most sought-after tool for normal map creation (and yes, it's Mac-friendly). For an example of a height map and its resulting normal map, see the left two panels in Figure 8-3.

Figure 8-3. Left two panels: height map and tangent-space normals; right panel: object-space normals for the Klein bottle

An important factor to consider with normal maps is the “space” that they live in. Here’s a brief recap from Chapter 2 concerning the early life of a vertex:

For bog-standard lighting (not bump mapped), normals are sent to OpenGL in object space. However, the normal maps that get generated by tools like CrazyBump are defined in *tangent space* (also known as *surface local space*). Tangent space is the 2D universe that textures live in; if you were to somehow "unfold" your object and lay it flat on a table, you'd see what tangent space looks like.

Another tidbit to remember from an earlier chapter is that OpenGL takes object-space normals and transforms them into eye space using the inverse-transpose of the model-view matrix (Normal Transforms Aren’t Normal). Here’s the kicker: transformation of the normal vector can actually be skipped in certain circumstances. If your light source is infinitely distant, you can simply perform the lighting in object space! Sure, the lighting is a bit less realistic, but when has that stopped us?

So, normal maps are (normally) defined in tangent space, but lighting is (normally) performed in eye space or object space. How do we handle this discrepancy? With OpenGL ES 2.0, we can revise the lighting shader so that it transforms the normals from tangent space to object space. With OpenGL ES 1.1, we’ll need to transform the normal map itself, as depicted in the rightmost panel in Figure 8-3. More on this later; first we’ll go over the shader-based approach since it can give you a better understanding of what’s going on.

Before writing any code, we need to figure out how the shader should go about transforming the normals from tangent space to object space. In general, we’ve been using matrices to make transformations like this. How can we come up with the right magical matrix?

Any coordinate system can be defined with a
set of *basis vectors*. The set is often simply
called a *basis*. The formal definition of
*basis* involves phrases like “linearly independent
spanning set,” but I don’t want you to run away in abject terror, so
I’ll just give an example.

For 3D space, we need three basis vectors,
one for each axis. The *standard basis* is the
space that defines the Cartesian coordinate system that we all know and
love:
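In symbols:

```latex
\mathbf{e}_1 = (1, 0, 0) \qquad
\mathbf{e}_2 = (0, 1, 0) \qquad
\mathbf{e}_3 = (0, 0, 1)
```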

Any set of unit-length vectors that are all
perpendicular to each other is said to be
*orthonormal*. Turns out that there’s an elegant
way to transform a vector from any orthonormal basis to the standard
basis. All you need to do is create a matrix by filling in each row with
a basis vector:
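Writing the basis vectors as **b**1, **b**2, and **b**3 (the naming here is mine), the row-based matrix is:

```latex
M =
\begin{bmatrix}
b_{1x} & b_{1y} & b_{1z} \\
b_{2x} & b_{2y} & b_{2z} \\
b_{3x} & b_{3y} & b_{3z}
\end{bmatrix}
```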

If you prefer column-vector notation, then the basis vectors form columns rather than rows:
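With basis vectors **b**1, **b**2, and **b**3 as columns, the matrix is simply the transpose:

```latex
M =
\begin{bmatrix}
b_{1x} & b_{2x} & b_{3x} \\
b_{1y} & b_{2y} & b_{3y} \\
b_{1z} & b_{2z} & b_{3z}
\end{bmatrix}
```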

In any case, we now have the magic matrix for transforming normals! Incidentally, basis vectors can also be used to derive a matrix for general rotation around an arbitrary axis. Basis vectors are so foundational to linear algebra that mathematicians are undoubtedly scoffing at me for not covering them much earlier. I wanted to wait until a practical application cropped up—which brings us back to bump mapping.

So, our bump mapping shader will need three basis vectors to transform the normal map's values from tangent space to object space. Where can we get these three basis vectors? Recall for a moment the `ParametricSurface` class that was introduced early in this book. In The Math Behind Normals, the following pseudocode was presented:

```
p = Evaluate(s, t)
u = Evaluate(s + ds, t) - p
v = Evaluate(s, t + dt) - p
n = Normalize(u × v)
```

The vector **n** is perpendicular to both **u** and **v**, which is just what we need for building an orthonormal basis. The `ParametricSurface` class already computes **n** for us, so all we need to do is amend it to write out one of the tangent vectors. Either **u** or **v** will work fine; there's no need to send both, because the shader can derive the third basis vector with a cross product (which also keeps the basis orthogonal even when **u** and **v** aren't exactly perpendicular to each other). Take a look at Example 8-1; for a baseline, this uses the parametric surface code that was first introduced in Chapter 3 and enhanced in subsequent chapters.

Example 8-1. Tangent support in ParametricSurface.hpp

```cpp
void ParametricSurface::GenerateVertices(vector<float>& vertices,
                                         unsigned char flags) const
{
    int floatsPerVertex = 3;
    if (flags & VertexFlagsNormals)
        floatsPerVertex += 3;
    if (flags & VertexFlagsTexCoords)
        floatsPerVertex += 2;
    if (flags & VertexFlagsTangents)
        floatsPerVertex += 3;

    vertices.resize(GetVertexCount() * floatsPerVertex);
    float* attribute = &vertices[0];

    for (int j = 0; j < m_divisions.y; j++) {
        for (int i = 0; i < m_divisions.x; i++) {

            // Compute Position
            vec2 domain = ComputeDomain(i, j);
            vec3 range = Evaluate(domain);
            attribute = range.Write(attribute);

            // Compute Normal
            if (flags & VertexFlagsNormals) {
                ...
            }

            // Compute Texture Coordinates
            if (flags & VertexFlagsTexCoords) {
                ...
            }

            // Compute Tangent
            if (flags & VertexFlagsTangents) {
                float s = i, t = j;
                vec3 p = Evaluate(ComputeDomain(s, t));
                vec3 u = Evaluate(ComputeDomain(s + 0.01f, t)) - p;
                if (InvertNormal(domain))
                    u = -u;
                attribute = u.Write(attribute);
            }
        }
    }
}
```

Let’s crack some knuckles and write some shaders. A good starting point is the pair of shaders we used for pixel-based lighting in Chapter 4. I’ve repeated them here (Example 8-2), with uniform declarations omitted for brevity.

Example 8-2. Per-pixel lighting vertex and fragment shaders

```glsl
attribute vec4 Position;
attribute vec3 Normal;

varying mediump vec3 EyespaceNormal;

// Vertex Shader
void main(void)
{
    EyespaceNormal = NormalMatrix * Normal;
    gl_Position = Projection * Modelview * Position;
}

// Fragment Shader
void main(void)
{
    highp vec3 N = normalize(EyespaceNormal);
    highp vec3 L = LightVector;
    highp vec3 E = EyeVector;
    highp vec3 H = normalize(L + E);

    highp float df = max(0.0, dot(N, L));
    highp float sf = max(0.0, dot(N, H));
    sf = pow(sf, Shininess);

    lowp vec3 color = AmbientMaterial +
                      df * DiffuseMaterial +
                      sf * SpecularMaterial;

    gl_FragColor = vec4(color, 1);
}
```

To extend this to support bump mapping, we’ll need to add new attributes for the tangent vector and texture coordinates. The vertex shader doesn’t need to transform them; we can leave that up to the pixel shader. See Example 8-3.

Example 8-3. Vertex shader for the Bumpy sample

```glsl
attribute vec4 Position;
attribute vec3 Normal;
attribute vec3 Tangent;
attribute vec2 TextureCoordIn;

uniform mat4 Projection;
uniform mat4 Modelview;

varying vec2 TextureCoord;
varying vec3 ObjectSpaceNormal;
varying vec3 ObjectSpaceTangent;

void main(void)
{
    ObjectSpaceNormal = Normal;
    ObjectSpaceTangent = Tangent;
    gl_Position = Projection * Modelview * Position;
    TextureCoord = TextureCoordIn;
}
```

Before diving into the fragment shader, let's review what we'll be doing:

1. Extract a perturbed normal from the normal map, transforming it from [0, +1] to [–1, +1].
2. Create three basis vectors using the normal and tangent vectors that were passed in from the vertex shader.
3. Perform a change of basis on the perturbed normal to bring it into object space.
4. Execute the same lighting algorithm that we've used in the past, but using the perturbed normal.

Now we’re ready! See Example 8-4.

When computing `tangentSpaceNormal`, you might need to swap the normal map's `x` and `y` components, just like we did in Example 8-4. This may or may not be necessary, depending on the coordinate system used by your normal map generation tool.

Example 8-4. Fragment shader for the Bumpy sample

```glsl
varying mediump vec2 TextureCoord;
varying mediump vec3 ObjectSpaceNormal;
varying mediump vec3 ObjectSpaceTangent;

uniform highp vec3 AmbientMaterial;
uniform highp vec3 DiffuseMaterial;
uniform highp vec3 SpecularMaterial;
uniform highp float Shininess;
uniform highp vec3 LightVector;
uniform highp vec3 EyeVector;
uniform sampler2D Sampler;

void main(void)
{
    // Extract the perturbed normal from the texture:
    highp vec3 tangentSpaceNormal =
        texture2D(Sampler, TextureCoord).yxz * 2.0 - 1.0;

    // Create a set of basis vectors:
    highp vec3 n = normalize(ObjectSpaceNormal);
    highp vec3 t = normalize(ObjectSpaceTangent);
    highp vec3 b = normalize(cross(n, t));

    // Change the perturbed normal from tangent space to object space:
    highp mat3 basis = mat3(n, t, b);
    highp vec3 N = basis * tangentSpaceNormal;

    // Perform standard lighting math:
    highp vec3 L = LightVector;
    highp vec3 E = EyeVector;
    highp vec3 H = normalize(L + E);
    highp float df = max(0.0, dot(N, L));
    highp float sf = max(0.0, dot(N, H));
    sf = pow(sf, Shininess);
    lowp vec3 color = AmbientMaterial +
                      df * DiffuseMaterial +
                      sf * SpecularMaterial;

    gl_FragColor = vec4(color, 1);
}
```

We're not done just yet, though; since the lighting math operates on a normal vector that lives in object space, the `LightVector` and `EyeVector` uniforms that we pass in from the application need to be in object space too. To transform them from world space to object space, we can simply multiply them by the model matrix using our C++ vector library. Take care not to confuse the model matrix with the model-view matrix; see Example 8-5.

Example 8-5. Render() method for the Bumpy sample (OpenGL ES 2.0)

```cpp
void RenderingEngine::Render(float theta) const
{
    // Render the background image:
    ...

    const float distance = 10;
    const vec3 target(0, 0, 0);
    const vec3 up(0, 1, 0);
    const vec3 eye = vec3(0, 0, distance);
    const mat4 view = mat4::LookAt(eye, target, up);
    const mat4 model = mat4::RotateY(theta);
    const mat4 modelview = model * view;

    const vec4 lightWorldSpace = vec4(0, 0, 1, 1);
    const vec4 lightObjectSpace = model * lightWorldSpace;
    const vec4 eyeWorldSpace(0, 0, 1, 1);
    const vec4 eyeObjectSpace = model * eyeWorldSpace;

    glUseProgram(m_bump.Program);
    glUniform3fv(m_bump.Uniforms.LightVector, 1, lightObjectSpace.Pointer());
    glUniform3fv(m_bump.Uniforms.EyeVector, 1, eyeObjectSpace.Pointer());
    glUniformMatrix4fv(m_bump.Uniforms.Modelview, 1, 0, modelview.Pointer());
    glBindTexture(GL_TEXTURE_2D, m_textures.TangentSpaceNormals);

    // Render the Klein bottle:
    ...
}
```

You might be wondering why we used
object-space lighting for shader-based bump mapping, rather than
eye-space lighting. After all, eye-space lighting is what was presented
way back in Chapter 4 as the “standard” approach.
It’s actually fine to perform bump map lighting in eye space, but I
wanted to segue to the fixed-function approach, which
*does* require object space!

Another potential benefit to lighting in object space is performance. I’ll discuss this more in the next chapter.

Earlier in the chapter, I briefly mentioned that OpenGL ES 1.1 requires the normal map itself to be transformed to object space (depicted in the far-right panel in Figure 8-3). If it were transformed to eye space instead, we'd have to create a brand new normal map every time the camera moves. Not exactly practical!

The secret to bump mapping with fixed-function hardware lies in a special texture combiner operation called `GL_DOT3_RGB`. This technique is often simply known as *DOT3 lighting*. The basic idea is to have the texture combiner generate a gray color whose intensity is determined by the dot product of its two operands. This is sufficient for simple diffuse lighting, although it can't produce specular highlights. See Figure 8-4 for a screenshot of the Bumpy app with OpenGL ES 1.1.

Here's the sequence of `glTexEnv` calls that sets up the texturing state used to generate Figure 8-4:

```cpp
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC1_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB, GL_SRC_COLOR);
```

The previous code snippet tells OpenGL to set up an equation like this:
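Per the OpenGL ES specification, `GL_DOT3_RGB` combines its two operands **N** and **L** like this:

```latex
\text{Color} = 4 \times \bigl( (\mathbf{N} - \mathbf{H}) \cdot (\mathbf{L} - \mathbf{H}) \bigr),
\qquad \mathbf{H} = \left( \tfrac{1}{2}, \tfrac{1}{2}, \tfrac{1}{2} \right)
```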

Curious about the **H** offset and the final multiply-by-four?
Remember, we had to transform our normal vectors from unit space to
color space:

vec3 TransformedVector = (OriginalVector + vec3(1, 1, 1)) / 2

The **H** offset
and multiply-by-four simply puts the final result back into unit space.
Since this assumes that *both* vectors have been
transformed in the previous manner, take care to transform the light
position. Here’s the relevant snippet of application code, once again
leveraging our C++ vector library:

```cpp
vec4 lightWorldSpace = vec4(0, 0, 1, 0);
vec4 lightObjectSpace = modelMatrix * lightWorldSpace;
lightObjectSpace = (lightObjectSpace + vec4(1, 1, 1, 0)) * 0.5f;
glColor4f(lightObjectSpace.x, lightObjectSpace.y, lightObjectSpace.z, 1);
```

The result from DOT3 lighting is often modulated with a second texture stage to produce a final color that’s nongray. Note that DOT3 lighting is basically performing per-pixel lighting but without the use of shaders!

Perhaps the most awkward aspect of DOT3 lighting is that it requires you to somehow create a normal map in object space. Some generator tools don’t know what your actual geometry looks like; these tools take only a simple heightfield for input, so they can generate the normals only in tangent space.

The trick I used for the Klein bottle was to use OpenGL ES 2.0 as part of my “art pipeline,” even though the final application used only OpenGL ES 1.1. By running a modified version of the OpenGL ES 2.0 demo and taking a screenshot, I obtained an object-space normal map for the Klein bottle. See Figure 8-5.

Examples 8-6 and 8-7 show the shaders for this. Note that the vertex shader ignores the model-view matrix and the incoming vertex position. It instead uses the incoming texture coordinate to determine the final vertex position, which effectively "unfolds" the object. The `Distance`, `Scale`, and `Offset` constants are used to center the image on the screen. (I also had to do some cropping and scaling on the final image to make it have power-of-two dimensions.)

Example 8-6. Vertex shader for the object-space generator

```glsl
attribute vec3 Normal;
attribute vec3 Tangent;
attribute vec2 TextureCoordIn;

uniform mat4 Projection;

varying vec2 TextureCoord;
varying vec3 ObjectSpaceNormal;
varying vec3 ObjectSpaceTangent;

const float Distance = 10.0;
const vec2 Offset = vec2(0.5, 0.5);
const vec2 Scale = vec2(2.0, 4.0);

void main(void)
{
    ObjectSpaceNormal = Normal;
    ObjectSpaceTangent = Tangent;

    vec4 v = vec4(TextureCoordIn - Offset, -Distance, 1);
    gl_Position = Projection * v;
    gl_Position.xy *= Scale;

    TextureCoord = TextureCoordIn;
}
```

The fragment shader is essentially the same as what was presented in Normal Mapping with OpenGL ES 2.0, but without the lighting math.

Example 8-7. Fragment shader for the object-space generator

```glsl
varying mediump vec2 TextureCoord;
varying mediump vec3 ObjectSpaceNormal;
varying mediump vec3 ObjectSpaceTangent;

uniform sampler2D Sampler;

void main(void)
{
    // Extract the perturbed normal from the texture:
    highp vec3 tangentSpaceNormal =
        texture2D(Sampler, TextureCoord).yxz * 2.0 - 1.0;

    // Create a set of basis vectors:
    highp vec3 n = normalize(ObjectSpaceNormal);
    highp vec3 t = normalize(ObjectSpaceTangent);
    highp vec3 b = normalize(cross(n, t));

    // Change the perturbed normal from tangent space to object space:
    highp mat3 basis = mat3(n, t, b);
    highp vec3 N = basis * tangentSpaceNormal;

    // Transform the normal from unit space to color space:
    gl_FragColor = vec4((N + 1.0) * 0.5, 1);
}
```
