Bump Mapping -1
Three scales of detail on an object:
- Macro-features cover many pixels. Represented by vertices and triangles.
- Micro-features are substantially smaller than a pixel. Encapsulated in the shading (lighting) model, which simulates the light interaction of a surface's microscopic geometry; e.g., shiny objects are microscopically smooth, and diffuse surfaces are microscopically rough.
- Meso-features are a few pixels across. They describe everything between the macro and micro scales: detail too complex to render using polygons, yet large enough for viewers to see changes in curvature over a few pixels, e.g., wrinkles on a face.
Bump Mapping -2
- Bump mapping is a large family of small-scale detail representation techniques that work by modifying the per-pixel shading routine.
- Gives a more 3D appearance than texture mapping, but less than actual geometry.
- These methods model mesoscale detail: they adjust shading parameters at the pixel level so that the viewer perceives small perturbations away from the base geometry, which actually remains flat.
- Examples: Blinn's methods, normal mapping, parallax mapping, relief mapping.
Bump Mapping -4
[figure: height field values]
Bump Mapping -5
Blinn's methods: bump mapping through variations in the surface normal.
- The surface normal is perturbed according to a 2D bump map, tricking a local reflection model into producing the appearance of bumpy detail on a smooth surface. The geometry itself does not change.
- Bump map representations:
  - Two signed values (bu, bv) per texel, used to scale two vectors that are perpendicular to the normal.
  - A height field, used to derive (bu, bv).
Bump Mapping -6
[figures: a bump map of two signed values (bu, bv); a bump map of height field values]
Bump Mapping -7
[figure: height field values]
Bump Mapping -8
Let P(u, v) be a parametric surface with normal N = Pu x Pv. Perturb N to N', the normal of the offset surface P' built from the bump (height) map B(u, v):

  P'(u, v) = P(u, v) + B(u, v) N/|N|

Differentiating, and neglecting the last terms (those multiplied by B itself), which is valid if B(u, v) is small:

  P'u = Pu + Bu N/|N|,  P'v = Pv + Bv N/|N|
  N' = P'u x P'v = N + D,  where D = Bu (N x Pv)/|N| - Bv (N x Pu)/|N|

The displacement D is perpendicular to N, so it lies on the tangent plane of P. Two arrays containing the partials Bu and Bv of B(u, v) can be precomputed (see the sketch below).
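The perturbed normal follows directly from these equations. Below is a minimal C++ sketch, assuming Pu and Pv are the surface partials at the shaded point and Bu, Bv are the height-map partials read from the precomputed arrays (all names are illustrative, not from any particular API):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 add(Vec3 a, Vec3 b) { return { a.x+b.x, a.y+b.y, a.z+b.z }; }
static Vec3 mul(Vec3 a, float s) { return { a.x*s, a.y*s, a.z*s }; }
static Vec3 normalize(Vec3 a) {
    float len = std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z);
    return mul(a, 1.0f / len);
}

// Blinn's perturbation: N' = N + D with D = Bu (Nn x Pv) - Bv (Nn x Pu),
// where Nn is the normalized N. D is perpendicular to Nn, so it lies in
// the tangent plane, as the slide requires.
Vec3 perturbNormal(Vec3 Pu, Vec3 Pv, float Bu, float Bv) {
    Vec3 N  = cross(Pu, Pv);
    Vec3 Nn = normalize(N);
    Vec3 D  = add(mul(cross(Nn, Pv), Bu), mul(cross(Nn, Pu), -Bv));
    return normalize(add(N, D));
}
```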
Bump Mapping -11: on a polygonal mesh
- Pu x Pv is not normalized, and it changes at a different rate than D - not a desirable feature.
- A full implementation of the equation in a rasterizer is impractical, so the computation is divided among preprocessing, per-vertex, and per-pixel stages:
  - Compute Pu x Pv, Pv x N, and N x Pu at the vertices and interpolate them across polygon interiors (with possible hardware support).
  - For each pixel inside the triangle, compute and normalize the perturbed normal vector, with Bu and Bv read from a texture map.
- The per-pixel lighting must be computed in tangent space to get stable bump mapping for animated or deformed objects!
Bump Mapping -12: on a polygonal mesh
Emboss bump mapping
- Borrowed from image embossing; one of the first methods for real-time bump mapping.
- A cheap implementation, but not really bump mapping: it uses diffuse lighting only, has no specular effect, and shows under-sampling artifacts.
- Still possible on today's hardware; if it looks OK, do it!
- Real bump mapping uses per-pixel lighting based on the perturbed normal vectors.
Bump Mapping -13: on a polygonal mesh
Dot product (Dot3) bump mapping
- The primary method on modern graphics hardware.
- The bump map texture stores the actual perturbed normal N' to be used for bump mapping, defined in the pixel's tangent space.
- Per-fragment lighting is computed in the pixel's tangent space: dot(N', L) for diffuse and dot(N', H)^8 for specular (see the sketch below).
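In code, the per-fragment work of Dot3 bump mapping reduces to two dot products. A minimal sketch with illustrative names (the exponent 8 matches the slides):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Dot3 lighting for one fragment. nPrime is the normal fetched from the
// normal map; L and H are the interpolated tangent-space light and
// half-angle vectors. All three are assumed renormalized to unit length.
void dot3Lighting(Vec3 nPrime, Vec3 L, Vec3 H,
                  float& diffuse, float& specular) {
    diffuse  = std::max(0.0f, dot(nPrime, L));                  // N'.L
    specular = std::pow(std::max(0.0f, dot(nPrime, H)), 8.0f);  // (N'.H)^8
}
```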
Dot3 bump map -1
[figures: a bump map image and the normal map derived from it]
Dot3 bump map -2
For a light source:
- The light source's location is transformed into the surface's tangent-space basis at each triangle vertex.
- The un-normalized light vectors are interpolated across the triangle, and the resulting vector is renormalized and converted to 8-bit values representing the range [-1, 1].
  - Any vector can be quickly renormalized by indexing a specially built cubic environment map with the vector's direction; the cube map stores normalized vectors, as 8-bit signed values, for each direction.
- The normal from the normal map is then combined with the interpolated light vector at each pixel using a dot product, performed by a special texture blending function provided precisely for this purpose. This step yields the diffuse lighting. (A sketch of the per-vertex transform follows.)
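The per-vertex step amounts to projecting the light vector onto the vertex's tangent-space basis. A sketch, assuming an orthonormal basis (tangent T, bitangent B, normal N) in object space; the names are illustrative:

```cpp
struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Express the vertex-to-light vector in the vertex's tangent-space basis.
// The result is interpolated across the triangle and renormalized per pixel,
// e.g., with the normalization cube map described on the following slides.
Vec3 lightToTangentSpace(Vec3 T, Vec3 B, Vec3 N,
                         Vec3 vertexPos, Vec3 lightPos) {
    Vec3 L = { lightPos.x - vertexPos.x,
               lightPos.y - vertexPos.y,
               lightPos.z - vertexPos.z };
    return { dot(L, T), dot(L, B), dot(L, N) };
}
```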
Dot3 bump map -3
Specular lighting can also be derived easily:
- Instead of the vector to the light, interpolate the half-angle vector H = (L + V)/|L + V|; dot(N, H) approximates the reflection term dot(R, V).
- Using a pixel shader, the dot(N, H) specular term can be raised to the 8th power.
References (available from NVIDIA Developer):
- A Practical and Robust Bump-mapping Technique for Today's GPUs, by Mark J. Kilgard, NVIDIA Corporation
- Hardware Bump Mapping, by D. Sim, in Game Programming Gems
Dot3 bump map -4
Given a mesh and a normal map:
- For each polygon:
  - Transform the light source's location into the surface's tangent-space basis at each vertex.
- For each pixel:
  - Interpolate the un-normalized light vector, and normalize it using the cubic environment map.
  - Dot the light vector with the normal from the normal map, i.e., dot(N, L), using texture blending.
  - Interpolate the un-normalized half-angle vector, and normalize it using the cubic environment map.
  - Dot the normal from the normal map with the half-angle vector, i.e., dot(N, H), using texture blending, then raise it to the specular power.
Dot3 bump map -5
Vector normalization cube map: performs per-pixel normalization of a denormalized vector (s, t, r).
- Use the component of largest magnitude to determine which cube map face is pierced: +x, -x, +y, -y, +z, or -z.
- Divide the remaining two components by the component of largest magnitude; this effectively projects them onto the selected face.
- Scale and bias the two projected components to the [0, 1] texel range, depending on the selected face, to compute a 2D texture coordinate (see the sketch below).
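The face selection and scale-and-bias can be sketched as follows. The per-face (u, v) orientation below follows one common convention (real APIs each fix their own), so treat the signs as illustrative:

```cpp
#include <cmath>

// Map a denormalized direction (s, t, r) to a cube map face index
// (+x, -x, +y, -y, +z, -z -> 0..5) and a 2D texel coordinate in [0, 1].
void cubeMapLookup(float s, float t, float r,
                   int& face, float& u, float& v) {
    float as = std::fabs(s), at = std::fabs(t), ar = std::fabs(r);
    float ma, sc, tc;  // major-axis magnitude and the two projected components
    if (as >= at && as >= ar) {        // x has the largest magnitude
        ma = as; face = (s > 0) ? 0 : 1;
        sc = (s > 0) ? -r : r; tc = -t;
    } else if (at >= ar) {             // y has the largest magnitude
        ma = at; face = (t > 0) ? 2 : 3;
        sc = s; tc = (t > 0) ? r : -r;
    } else {                           // z has the largest magnitude
        ma = ar; face = (r > 0) ? 4 : 5;
        sc = (r > 0) ? s : -s; tc = -t;
    }
    // Divide by the largest component (projection onto the face), then
    // scale and bias from [-1, 1] to the [0, 1] texel range.
    u = 0.5f * (sc / ma + 1.0f);
    v = 0.5f * (tc / ma + 1.0f);
}
```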
Dot3 bump map -6
[figure: vector normalization cube map]
Dot3 bump map -7
[figures from Mark J. Kilgard: height field texture; color texture modulated by the diffuse contribution computed using the normal map; specular contribution computed using a normalized version of the normal map; combined result]
Dot3 bump map -8
[figures: height field texture and the resulting Dot3 bump mapping]
Dot3 bump map -9
[figures: original texture and bump map - GLBump2 demo]
Dot3 bump map -10
[figure: bump map + shadowing - md2shader demo]
Dot3 bump map -11
[figures: bump map + shadowing - md2shader demo; Crysis online game]
Dot3 bump map -12
[figures: Gouraud shading vs. Phong shading vs. Phong shading with bump map]
Dot3 bump map -13
[figures: a wavy height field bump map and the resulting bump mapping effect]
Displacement mapping
- Sample the height field to find the amount of displacement, then displace each vertex along its normal (see the sketch below).
- Adds surface detail while producing a correct silhouette and no parallax errors; it can even create the model itself.
- Modeling example: terrain = a flat plane + a displacement map.
- Base surfaces: meshes, subdivision surfaces.
References:
- Displacement Mapping on the GPU - State of the Art, by Szirmay-Kalos and Umenhoffer, CGF, Vol. 25, No. 3, 2006
- Displacement Mapping, by Michael Doggett, ATI Research, GDC 2003
- Displacement Mapping, by Tom Forsyth (see GDC2003-DisplacementMappingNotes.pdf from NVIDIA Developer)
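The per-vertex displacement step is a short loop. A minimal sketch; sampleHeight stands in for a bilinear fetch from the displacement map and is not any particular API:

```cpp
struct Vec3 { float x, y, z; };
struct Vertex { Vec3 position; Vec3 normal; float u, v; };

// Per-vertex displacement mapping: offset each vertex along its (unit)
// normal by the height fetched from the scalar displacement map.
void displaceVertices(Vertex* verts, int count, float heightScale,
                      float (*sampleHeight)(float u, float v)) {
    for (int i = 0; i < count; ++i) {
        float h = heightScale * sampleHeight(verts[i].u, verts[i].v);
        verts[i].position.x += h * verts[i].normal.x;
        verts[i].position.y += h * verts[i].normal.y;
        verts[i].position.z += h * verts[i].normal.z;
    }
}
```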
Displacement mapping
[figure: the displacement map stores a scalar displacement per texel]
Displacement mapping
[example figures]
Displacement mapping: base surface and maps
- Low-poly "base" mesh, derived using polygon simplification.
- 2D-to-3D mapping via mesh parameterization.
- Maps:
  - Height-field scalar displacement map
  - Normal map
- Most tools apply to both normal and displacement maps.
Displacement mapping: deriving the maps
A ray-casting-based method:
- Based on a high-polygon mesh and a parameterization, derive the displacement and normal maps for the low-polygon base mesh.
- Each texel of the maps corresponds to a single position on the low-polygon mesh.
- From that position, a ray is cast along the interpolated normal, and its intersection with the high-polygon mesh is found (see the sketch below):
  - The normal of the high-polygon mesh at the intersection is written to the normal map.
  - The distance along the ray to the intersection point is written to the displacement map.
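A sketch of the baking loop, under stated assumptions: basePoint and baseNormal come from the parameterization of the low-poly mesh, and intersectHighPoly is a placeholder for any ray-mesh intersection routine (e.g., a BVH traversal); none of these names come from a real tool:

```cpp
struct Vec3 { float x, y, z; };
struct Hit { bool found; float distance; Vec3 normal; };

// For each texel, cast a ray from the corresponding point on the low-poly
// base mesh along the interpolated normal, and record the hit distance in
// the displacement map and the high-poly normal in the normal map.
void bakeMaps(int width, int height,
              Vec3 (*basePoint)(float u, float v),
              Vec3 (*baseNormal)(float u, float v),
              Hit (*intersectHighPoly)(Vec3 origin, Vec3 dir),
              float* displacementMap, Vec3* normalMap) {
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            float u = (x + 0.5f) / width, v = (y + 0.5f) / height;
            Hit hit = intersectHighPoly(basePoint(u, v), baseNormal(u, v));
            if (hit.found) {
                displacementMap[y * width + x] = hit.distance;
                normalMap[y * width + x]       = hit.normal;
            }
        }
}
```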
Displacement mapping: rendering
Take sample points and displace them perpendicularly to the macrostructure surface by the distance obtained from the map.
- Per-vertex displacement mapping
  - Sample points are the vertices of the tessellated mesh.
  - On the GPU, per-vertex displacement uses the vertex shader or geometry shader.
  - The displaced geometry then goes through the rendering pipeline.
- Per-pixel displacement mapping (inverse mapping)
  - Sample points correspond to the texel centers.
  - Surface details are added when color texturing takes place.
  - GPU-based: ray tracing using the fragment shader [from 1998].
Per-vertex displacement mapping
[figure, from Displacement Mapping on the GPU - State of the Art]
Per-vertex displacement mapping
[figure, from Moule/McCool: Efficient Bounded Adaptive Tessellation of Displacement Maps]
Per-vertex displacement mapping: problems
- The number of vertices required can be very high, which conflicts with the aim of displacement mapping.
- GPUs usually have more pixel-processing power than vertex-processing power.
- Pixel shaders are better equipped to access textures: texture access in the vertex shader is slower than in the fragment shader.
- The vertex shader always executes once for each vertex in the model, even for invisible or barely visible parts.
Per-pixel displacement mapping
- The vertex shader processes only the base mesh; the surface height map is taken into account when fragments are processed.
- Use ray tracing to resolve the visibility problem: trace a ray into the height field to obtain the texture coordinates of the visible point, which are used to fetch the color and normal vector.
- For each processed point (u, v, 0) in tangent space, the fragment shader finds the point of the height field that is actually seen along the ray connecting (u, v, 0) and the pixel center. The ray direction is the tangent-space view vector (see the sketch below).
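A minimal linear-search sketch of the fragment-side ray march, assuming a tangent space with z pointing out of the surface, depths stored in [0, 1] below the surface, and viewDir pointing from the eye into the surface (viewDir.z < 0). Real implementations, e.g., relief mapping, refine the hit with a binary search:

```cpp
struct Vec3 { float x, y, z; };

// March the tangent-space view ray from the processed point (u, v, 0) into
// the height field and return the texture coordinate of the first hit.
// depth() is a placeholder for a height-map fetch returning depth below the
// surface in [0, 1].
void rayMarchHeightField(float u, float v, Vec3 viewDir,
                         float (*depth)(float u, float v),
                         float& hitU, float& hitV) {
    const int steps = 32;
    // Step sizes chosen so that z sweeps the full [0, 1] depth range.
    float du = -viewDir.x / viewDir.z / steps;
    float dv = -viewDir.y / viewDir.z / steps;
    float dz = 1.0f / steps;
    float z  = 0.0f;
    for (int i = 0; i < steps; ++i) {
        if (depth(u, v) <= z) break;  // the ray has passed below the surface
        u += du; v += dv; z += dz;
    }
    hitU = u; hitV = v;
}
```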
Per-pixel displacement mapping
[figures, from Displacement Mapping on the GPU - State of the Art]
Multitexturing -1
- Two or more textures are applied in a single rendering pass; hardware supports this by fetching multiple textures when processing a single fragment.
- Draw the object once with several textures, some of which can be dynamically modified. E.g., the texture map may remain unchanged from frame to frame, while the light map is updated by a dynamic light and the fog map changes with the moving camera.
- The texture-coordinate-to-vertex correspondence can be different for each texture map.
References:
- Real-Time Rendering, 2nd ed., p. 146
- JL Mitchell's page at http://www.pixelmaven.com/jason/
Multitexturing -2
Multitexturing saves rendering passes and allows more complex shading models than applying a single texture per pass.
- Example: compute a lighting model with the expression AB + CD, where each variable represents a different texture's color value (see the sketch below).
- This is impossible to evaluate without multitexturing or off-screen rendering (which stores images that can be combined later): a multipass algorithm could combine A and B in two passes and add C in the next, but it could not fold in D, because C would already have been added to AB. There is no place to keep C separate from AB, since only one color can be stored in the frame buffer.
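A per-channel sketch of what the blend cascade evaluates for one fragment when all four textures are bound in a single pass; a, b, c, d stand for the four fetched texel colors and the names are illustrative:

```cpp
struct Color { float r, g, b; };

// With all four textures available at once, the full expression AB + CD can
// be formed per fragment: stage one computes A*B, stage two adds C*D.
Color combineABplusCD(Color a, Color b, Color c, Color d) {
    return { a.r * b.r + c.r * d.r,
             a.g * b.g + c.g * d.g,
             a.b * b.b + c.b * d.b };
}
```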
Multitexturing -3
To combine the results of these texture accesses, a texture blending cascade is defined, made up of a series of texture stages.
Multitexturing -4
[figure: two maps, a color (texture) map and a light map]
3D Texture Mapping - 1
- The texture is determined by the intersection of the surface with a predefined 3D texture field, obtained by procedural generation: vertex coordinates index a procedure (e.g., a 3D noise function) that defines the 3D texture field at that point.
- Advantages (see the marble sketch below):
  - Successfully simulates turbulence (e.g., marble objects).
  - Eliminates mapping problems.
  - Objects of arbitrary complexity can receive a texture in a coherent fashion.
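A classic example is marble: a sine stripe pattern along one axis, perturbed by turbulence built from octaves of 3D noise. A sketch, with noise3 as a placeholder for any smooth 3D noise function such as Perlin noise:

```cpp
#include <cmath>

// Turbulence: sum of octaves of |noise|, each octave at double frequency
// and half amplitude of the previous one.
float turbulence(float x, float y, float z, int octaves,
                 float (*noise3)(float, float, float)) {
    float sum = 0.0f, freq = 1.0f, amp = 1.0f;
    for (int i = 0; i < octaves; ++i) {
        sum  += amp * std::fabs(noise3(x * freq, y * freq, z * freq));
        freq *= 2.0f;
        amp  *= 0.5f;
    }
    return sum;
}

// Marble intensity at object-space point (x, y, z): the object's own
// coordinates index the texture field, so no 2D mapping is needed.
float marble(float x, float y, float z,
             float (*noise3)(float, float, float)) {
    return 0.5f * (1.0f + std::sin(4.0f * x
                                   + 2.0f * turbulence(x, y, z, 4, noise3)));
}
```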
3D Texture Mapping - 2
[figure: 3D texture mapping in object space]
3D Texture Mapping - 3
[figure: 3D mapping and 3D noise]