Chapter XIV Normal Mapping


1 Chapter XIV Normal Mapping

2 Bumpy Surfaces Rendering with a high-frequency polygon mesh
Unfortunately, it is not cheap to process the high-resolution mesh.

3 Bumpy Surfaces (cont’d)
Rendering with a low-resolution mesh: It is cheap to process such a low-resolution mesh. However, the quad does not properly expose the bumpy features.

4 Normal Mapping A way out of this dilemma is:
to pre-compute the normals of the high-frequency surface and store them into a special texture named the normal map, and to use a lower-resolution mesh at run time, which we call the base surface, and fetch the normals from the normal map for lighting.
(figure: normal map; normal-mapped base surface)

5 Height Field A popular method to represent a high-frequency surface is to use a height field. It is a function h(x,y) that returns a height or z value for given (x,y) coordinates. The height field is sampled with a 2D array of regularly spaced (x,y) coordinates, and the height values are stored in a texture named the height map. The height map can be drawn in gray scale: if the height is in the range [0,255], the lowest height 0 is colored black, and the highest 255 is colored white.

6 Normal Map Simple image-editing operations can create a gray-scale image (height map) from an image texture (from (a) to (b)). The next step from (b) to (c) is done automatically.

7 Normal Map (cont’d) Creation of a normal map from a height map
Visualization of a normal map
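The conversion procedure itself is not included in the transcript. Below is a minimal GLSL sketch of the standard approach: each texel's normal is derived from finite differences of the neighboring heights and then range-compressed from [-1,1] to [0,1] so it can be stored as an RGB texel. The names heightMap, texelSize, and bumpScale are hypothetical.

#version 300 es
precision mediump float;

uniform sampler2D heightMap;   // gray-scale height map (hypothetical name)
uniform vec2 texelSize;        // 1.0 / height-map resolution
uniform float bumpScale;       // exaggerates or flattens the bumps

in vec2 v_texCoord;
out vec4 fragColor;

void main() {
    // Central differences of the height field approximate its slopes in x and y.
    float hL = texture(heightMap, v_texCoord - vec2(texelSize.x, 0.0)).r;
    float hR = texture(heightMap, v_texCoord + vec2(texelSize.x, 0.0)).r;
    float hD = texture(heightMap, v_texCoord - vec2(0.0, texelSize.y)).r;
    float hU = texture(heightMap, v_texCoord + vec2(0.0, texelSize.y)).r;

    // The height-field normal is proportional to (-dh/dx, -dh/dy, 1).
    vec3 n = normalize(vec3((hL - hR) * bumpScale, (hD - hU) * bumpScale, 1.0));

    // Range-compress [-1,1] into [0,1] so the normal fits into an RGB texel;
    // this compression is why visualized normal maps look predominantly bluish.
    fragColor = vec4(n * 0.5 + 0.5, 1.0);
}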

8 Normal Mapping (cont’d)
The polygon mesh is rasterized, and the texture coordinates (s,t) are used to access the normal map. The normal at (s,t) is obtained by filtering the normal map. Consider the diffuse reflection term, max(n•l,0)sd⊗md: the normal n is fetched from the normal map, and md is fetched from the image texture.

9 Normal Mapping (cont’d)
Vertex shader
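The shader code shown on this slide is not part of the transcript. Below is a minimal sketch of a vertex shader for the simple quad example, which only passes the world-space position and the texture coordinates on to the fragment shader; the matrix and attribute names are assumptions, not the book's exact code.

#version 300 es

uniform mat4 worldMat, viewMat, projMat;

layout(location = 0) in vec3 position;
layout(location = 1) in vec2 texCoord;

out vec3 v_worldPos;   // world-space position, used later to build the light vector
out vec2 v_texCoord;   // used to sample the image texture and the normal map

void main() {
    v_worldPos = (worldMat * vec4(position, 1.0)).xyz;
    v_texCoord = texCoord;
    gl_Position = projMat * viewMat * vec4(v_worldPos, 1.0);
}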

10 Normal Mapping (cont’d)
Fragment shader

11 Normal Mapping (cont’d)
Fragment shader
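The fragment shader code is likewise missing from the transcript. Below is a minimal sketch for the quad example, where the normal fetched from the normal map is used directly together with a world-space light vector in dot(normal, light); the texture and uniform names normalMap, colorMap, lightPos, and srcDiff are assumptions.

#version 300 es
precision mediump float;

uniform sampler2D normalMap;   // normals range-compressed into [0,1]
uniform sampler2D colorMap;    // image texture supplying the diffuse reflectance md
uniform vec3 lightPos;         // world-space light position
uniform vec3 srcDiff;          // light source's diffuse term sd

in vec3 v_worldPos;
in vec2 v_texCoord;
out vec4 fragColor;

void main() {
    // Fetch the normal and undo the range compression: [0,1] -> [-1,1].
    vec3 normal = normalize(texture(normalMap, v_texCoord).xyz * 2.0 - 1.0);

    // World-space light vector; for the quad, tangent space and world space coincide.
    vec3 light = normalize(lightPos - v_worldPos);

    // Diffuse reflection term: max(n.l, 0) sd (x) md.
    vec3 matDiff = texture(colorMap, v_texCoord).rgb;
    fragColor = vec4(max(dot(normal, light), 0.0) * srcDiff * matDiff, 1.0);
}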

12 Normal Mapping (cont’d)

13 Tangent-space Normal Mapping
Recall that texturing is described as wrapping a texture onto an object surface. We should be able to paste a normal map onto various surfaces in the same way. For this, we need an additional process, which is not performed during image texturing.

14 Tangent-space Normal Mapping (cont’d)
For a surface point p, consider a tangent space defined by three orthonormal vectors:
T (for tangent)
B (for bitangent)
N (for normal)
The tangent spaces vary across the object surface.
The normal fetched from the normal map using p's texture coordinates, (sp,tp), is denoted as n(sp,tp). Without normal mapping, Np would be used for lighting. In normal mapping, however, n(sp,tp) replaces Np, which is (0,0,1) in the tangent space of p. This implies that, as a 'perturbed' instance of (0,0,1), n(sp,tp) can be considered to be defined in the tangent space of p. Whatever surface point is normal-mapped, the normal fetched from the normal map is considered to be defined in the tangent space of that point.

15 Tangent Space Consider dot(normal, light) in the fragment shader of slide 11.
It does not work in general because normal is a tangent-space vector whereas light is a world-space vector; two vectors defined in different spaces may not be combined through a dot product. In the example, dot(normal, light) worked only because the basis of the world space happened to be identical to that of the tangent space at every point of the base surface, the quad.

16 Tangent Space (cont’d)
The basis of the tangent space {T, B, N}:
Vertex normal N – defined per vertex at the modeling stage.
Tangent T – needs to be computed.
Bitangent B – needs to be computed.
There are many utility functions that compute T and B at each vertex of a polygon mesh; a sketch of the per-triangle math they rely on is given below.
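As a sketch of that math (written in GLSL syntax for consistency with the shaders, though in practice this runs on the CPU during preprocessing): for one triangle with positions p0..p2 and texture coordinates uv0..uv2, T and B satisfy e1 = d1.x*T + d1.y*B and e2 = d2.x*T + d2.y*B, which can be solved per triangle as follows. Per-vertex T and B are then obtained by averaging over the adjacent triangles and re-orthonormalizing against N. All names are hypothetical.

// Per-triangle tangent and bitangent from vertex positions and texture coordinates.
void computeTB(vec3 p0, vec3 p1, vec3 p2,
               vec2 uv0, vec2 uv1, vec2 uv2,
               out vec3 T, out vec3 B) {
    vec3 e1 = p1 - p0;     // triangle edge vectors
    vec3 e2 = p2 - p0;
    vec2 d1 = uv1 - uv0;   // corresponding texture-coordinate deltas
    vec2 d2 = uv2 - uv0;

    // Invert the 2x2 system [e1; e2] = [d1; d2][T; B].
    float f = 1.0 / (d1.x * d2.y - d2.x * d1.y);
    T = normalize(f * ( d2.y * e1 - d1.y * e2));
    B = normalize(f * (-d2.x * e1 + d1.x * e2));
}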

17 Tangent-space Normal Mapping (cont’d)
Consider the diffuse term of the Phong lighting model, max(n•l,0)sd⊗md. A light source is defined in the world space, and so is l. In contrast, n fetched from the normal map is defined in the tangent space. To resolve this inconsistency, either n has to be transformed into the world space, or l has to be transformed into the tangent space.
Typically, the per-vertex TBN basis is pre-computed, stored in the vertex array, and passed to the vertex shader. The vertex shader first transforms T, B, and N into the world space and then constructs a matrix that rotates the world-space light vector into the per-vertex tangent space.

18 Tangent-space Normal Mapping (cont’d)
Observe that normal is used only for defining the transform to the tangent space and is not output to the fragment shader.
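The slide's shader code is not included in the transcript. Below is a minimal sketch of such a vertex shader: it builds the TBN basis in the world space and uses it to rotate the world-space light vector into the per-vertex tangent space, so the vertex normal only helps define that rotation and is never passed on. Attribute and uniform names are assumptions.

#version 300 es

uniform mat4 worldMat, viewMat, projMat;
uniform vec3 lightPos;         // world-space light position

layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;    // N: used only to build the TBN basis
layout(location = 2) in vec3 tangent;   // T: precomputed per vertex
layout(location = 3) in vec2 texCoord;

out vec3 v_lightTS;   // light vector expressed in the tangent space
out vec2 v_texCoord;

void main() {
    vec3 worldPos = (worldMat * vec4(position, 1.0)).xyz;

    // Transform N and T into the world space (assuming no non-uniform scaling)
    // and complete the orthonormal basis with the cross product.
    vec3 Nw = normalize((worldMat * vec4(normal, 0.0)).xyz);
    vec3 Tw = normalize((worldMat * vec4(tangent, 0.0)).xyz);
    vec3 Bw = cross(Nw, Tw);

    // mat3(Tw, Bw, Nw) has T, B, N as columns; its transpose takes a
    // world-space vector into tangent-space coordinates (T.v, B.v, N.v).
    mat3 toTangent = transpose(mat3(Tw, Bw, Nw));

    v_lightTS = toTangent * normalize(lightPos - worldPos);
    v_texCoord = texCoord;
    gl_Position = projMat * viewMat * vec4(worldPos, 1.0);
}

The fragment shader then normalizes v_lightTS and uses it in place of the world-space light vector, so dot(normal, light) becomes a dot product between two tangent-space vectors.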

19 Tangent-space Normal Mapping (cont’d)

20 Tangent-space Normal Mapping (cont’d)

21 Authoring Normal Maps Sculpting using ZBrush
The original low-resolution model in (a) is taken as the base surface, and the high-frequency model in (e) is taken as the reference surface.

22 Authoring Normal Maps (cont’d)
Normal map creation (in ZBrush)

23 Authoring Normal Maps (cont’d)
Normal map creation (in ZBrush)

