Bump Mapping: Three scales of detail on an object


Bump Mapping -1: Three scales of detail on an object
Macro-features cover many pixels; they are represented by vertices and triangles.
Micro-features are substantially smaller than a pixel; they are encapsulated in the shading (lighting) model, which simulates the light interaction of the surface's microscopic geometry (e.g., shiny objects are microscopically smooth, diffuse surfaces microscopically rough).
Meso-features are a few pixels across and describe everything between the macro and micro scales: detail too complex to render using polygons, but large enough for viewers to see changes in curvature over a few pixels, e.g., wrinkles on faces.
Graphics –U, Chap 4 Texture Mapping

Bump Mapping -2: A large family of small-scale detail representation techniques that work by modifying the per-pixel shading routine. They give a more 3D appearance than plain texture mapping, but less than actual geometry.
Bump mapping: a family of methods used for mesoscale modeling. Shading parameters are adjusted at the pixel level so that the viewer perceives small perturbations away from the base geometry, which actually remains flat. Examples: Blinn's method, normal mapping, parallax mapping, relief mapping.

Bump Mapping -3

Bump Mapping -4 (height field values)

Bump Mapping -5: Blinn's method
Works through variations in the surface normal: the normal is perturbed according to a 2D bump map, which tricks a local reflection model into producing the appearance of bumpy detail on a smooth surface. The geometry itself does not change.
The bump map stores either two signed values (bu, bv), used to scale two vectors perpendicular to the normal, or a height field from which (bu, bv) are derived.
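Deriving (bu, bv) from a height field can be sketched as a finite-difference computation (a minimal sketch; the function name and the central-difference convention are illustrative assumptions):

```python
def bump_values(height, u, v):
    """Return (bu, bv) at texel (u, v) of a height field, using central
    differences with clamped borders. height is a 2D list of floats."""
    h, w = len(height), len(height[0])
    bu = (height[v][min(u + 1, w - 1)] - height[v][max(u - 1, 0)]) / 2.0
    bv = (height[min(v + 1, h - 1)][u] - height[max(v - 1, 0)][u]) / 2.0
    return bu, bv
```

On a ramp rising in u, bu recovers the slope while bv is zero, which is exactly the pair of signed values the bump map stores.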

Bump Mapping -6: A bump map of two signed values (bu, bv); a bump map of height field values.

Bump Mapping -7 (height field values)

Bump Mapping -8: Let P(u, v) be a parametric surface. Perturb the normal N to N', the normal of the offset surface P'. The displacement must lie in the tangent plane of P. Two arrays containing the partial derivatives of B(u, v) can be precomputed, and the last terms can be neglected when B(u, v) is small.
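The equations this slide refers to can be reconstructed as follows (a sketch of Blinn's derivation; the signs of the cross-product terms depend on the orientation convention used):

```latex
\begin{align*}
P'(u,v) &= P(u,v) + B(u,v)\,\frac{N}{|N|}, \qquad N = P_u \times P_v \\
P'_u &\approx P_u + B_u \frac{N}{|N|}
  \quad\text{(the term } B\,\partial_u(N/|N|) \text{ is neglected for small } B\text{)} \\
N' &= P'_u \times P'_v \approx N + \frac{B_u\,(N \times P_v) - B_v\,(N \times P_u)}{|N|}
\end{align*}
```

The two cross products multiplying Bu and Bv are the tangent-plane vectors mentioned on the previous slides, and their Bu, Bv derivative arrays are what gets precomputed.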

Bump Mapping -9

Bump Mapping -10

Bump Mapping -11: On a polygonal mesh
Pu x Pv is not normalized, and it changes at a different rate than D, which is not a desirable feature. A full implementation of the equation in a rasterizer is impractical, so the computation is divided among preprocessing, per-vertex, and per-pixel stages:
Compute Pu x Pv, Pv x N, and N x Pu at the vertices and interpolate them over the polygon interiors (with possible hardware support).
For each pixel inside the triangle, compute and normalize the perturbed normal, with Bu and Bv read from a texture map.
The per-pixel lighting must be computed in tangent space to get stable bump mapping for animated or deformed objects.
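The per-pixel step can be sketched like this (an illustrative sketch, not the slide's exact code; the cross products are assumed to be interpolated from the vertices, and the sign convention matches one common form of Blinn's formula):

```python
import math

def perturbed_normal(n, n_x_pv, n_x_pu, bu, bv):
    """Perturb and renormalize the normal per pixel.
    n, n_x_pv, n_x_pu: interpolated N, N x Pv, N x Pu (3-element lists).
    bu, bv: bump-map values fetched for this pixel."""
    # One sign convention: N' = N + Bu*(N x Pv) - Bv*(N x Pu), then normalize.
    np_ = [n[i] + bu * n_x_pv[i] - bv * n_x_pu[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in np_))
    return [c / length for c in np_]
```

With bu = bv = 0 the base normal is returned unchanged; nonzero values tilt it within the tangent plane before renormalization.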

Bump Mapping -12: On a polygonal mesh; emboss bump mapping
Borrowed from image embossing, this was one of the first methods for real-time bump mapping. It is a cheap implementation, but not really bump mapping: it uses diffuse lighting only, has no specular effect, and suffers from under-sampling artifacts. It is still possible on today's hardware; if it looks OK, do it. Real bump mapping uses per-pixel lighting based on the perturbed normal vectors.
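The emboss idea reduces to differencing two height samples, one shifted toward the light (a toy sketch under that assumption; names and the clamping scheme are illustrative):

```python
def emboss(height, u, v, du, dv):
    """Crude emboss term: the difference between the height at texel (u, v)
    and the height at a texel shifted by (du, dv) toward the light.
    The result is added to the diffuse term; there is no specular effect."""
    h, w = len(height), len(height[0])
    h0 = height[v][u]
    h1 = height[min(max(v + dv, 0), h - 1)][min(max(u + du, 0), w - 1)]
    return h0 - h1
```

Slopes facing the light brighten and slopes facing away darken, which is why the effect resembles bump mapping without using any perturbed normals.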

Bump Mapping -13: On a polygonal mesh; dot product (Dot3) bump mapping
The primary method on modern graphics hardware. The bump map texture stores the actual normal N' to be used, defined in the pixel's tangent space. Per-fragment lighting is computed in the pixel's tangent space: N' . L for the diffuse term and (N' . H)^8 for the specular term.

Dot3 bump map -1: Normal map; bump map image

Dot3 bump map -2: For a light source
The light source's location is transformed into the surface's tangent-space basis at each triangle vertex. The un-normalized light vectors are interpolated, and the resulting vector is normalized and converted to 8-bit values representing the range [-1, 1]. By indexing a specially built cubic environment map with the vector's direction, any vector can be quickly renormalized: the cube map stores normalized vectors, as 8-bit signed values, for each direction.
The normal from the normal map is then combined with the interpolated light vector at each pixel using a dot product, performed by a texture-blending function provided precisely for this purpose. This step yields the diffuse lighting.

Dot3 bump map -3: Specular lighting can also be derived easily. Instead of using the vector to the light, interpolate the half-angle vector H = (L + V) / |L + V|; the Blinn term dot(N, H) approximates the Phong term dot(R, V). Using a pixel shader, the dot(N, H) specular term can be raised to the 8th power.
References (available from NVIDIA Developer):
A Practical and Robust Bump-mapping Technique for Today's GPUs, by Mark J. Kilgard, NVIDIA Corporation.
Hardware Bump Mapping, by D. Sim, in Game Programming Gems.

Dot3 bump map -4: Given a mesh and a normal map
For each polygon: transform the light source's location into the surface's tangent-space basis at each vertex.
For each pixel:
Interpolate the un-normalized light vector and normalize it using the cubic environment map.
Dot the light vector with the normal from the normal map, i.e., dot(N, L), using texture blending.
Interpolate the un-normalized half-angle vector and normalize it using the cubic environment map.
Dot the normal from the normal map with the half-angle vector, i.e., dot(N, H), using texture blending, then raise it to the specular power.
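The per-pixel arithmetic of the steps above can be sketched as follows (a sketch only; the function name and the exponent of 8 from the earlier slide are illustrative assumptions):

```python
import math

def dot3_lighting(n, l, h, spec_power=8):
    """Dot3 terms for one pixel: n is the tangent-space normal fetched from
    the normal map, l and h the interpolated (renormalized) light and
    half-angle vectors. Returns (diffuse, specular)."""
    def norm(v):
        s = math.sqrt(sum(c * c for c in v))
        return [c / s for c in v]
    n, l, h = norm(n), norm(l), norm(h)
    diffuse = max(sum(a * b for a, b in zip(n, l)), 0.0)
    specular = max(sum(a * b for a, b in zip(n, h)), 0.0) ** spec_power
    return diffuse, specular
```

On real hardware the two dot products are the texture-blending (Dot3) operations and the power is applied in the pixel shader.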

Dot3 bump map -5: Vector normalization cube map
Performs per-pixel normalization of a denormalized vector (s, t, r):
Use the component of largest magnitude to determine the pierced cube map face (+x, -x, +y, -y, +z, or -z).
Divide the remaining two components by the component of largest magnitude; this effectively projects them onto the selected face.
Scale and bias the two projected components into the [0, 1] texel range, depending on the selected face, to compute a 2D texture coordinate.
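The face selection and projection can be sketched like this (a sketch assuming one simple face convention; real APIs additionally flip signs per face, which is omitted here):

```python
def cube_map_coords(s, t, r):
    """Select the cube face pierced by (s, t, r) and project the other two
    components into [0, 1] texture coordinates."""
    ax, ay, az = abs(s), abs(t), abs(r)
    m = max(ax, ay, az)
    if m == ax:
        face = '+x' if s > 0 else '-x'
        u, v = t / m, r / m
    elif m == ay:
        face = '+y' if t > 0 else '-y'
        u, v = s / m, r / m
    else:
        face = '+z' if r > 0 else '-z'
        u, v = s / m, t / m
    # Scale and bias the projected components from [-1, 1] into [0, 1].
    return face, (u + 1.0) / 2.0, (v + 1.0) / 2.0
```

The texel fetched at that coordinate holds the precomputed normalized version of every vector pointing in that direction.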

Dot3 bump map -6: Vector normalization cube map

Dot3 bump map -7 [Mark J. Kilgard]
Result: height field texture; color texture modulated by the diffuse contribution computed using a normal (bump) map; specular contribution computed using a normalized version of the normal map.

Dot3 bump map -8: Dot3 bump mapping; height field texture

Dot3 bump map -9: Original texture and bump map (GLBump2 demo)

Dot3 bump map -10: Bump map + shadowing (md2shader demo)

Dot3 bump map -11: Bump map + shadowing (md2shader demo) [Crysis, online game]

Dot3 bump map -12: Gouraud shading; Phong shading; Phong shading with bump map

Dot3 bump map -13: Gouraud shading; Phong shading; Phong shading with bump map. A wavy height field bump map and the resulting bump mapping effect.

Displacement mapping: Sample the height field to find the amount of displacement, then displace each vertex along its normal. Adds surface detail while producing correct silhouettes and no parallax errors; it can even create the model itself.
Modeling example: terrain = a flat plane + a displacement map. Base surfaces: meshes, subdivision surfaces.
References:
Displacement Mapping on the GPU - State of the Art, by Szirmay-Kalos and Umenhoffer, Computer Graphics Forum, Vol. 25, No. 3, 2006.
Displacement Mapping, by Michael Doggett, ATI Research, GDC 2003.
Displacement Mapping, by Tom Forsyth; see GDC2003-DisplacementMappingNotes.pdf from NVIDIA Developer.

Displacement mapping: The displacement map stores a scalar displacement.

Displacement mapping: Example

Displacement mapping: Example

Displacement mapping: Base surface and maps
Low-poly "base" mesh, derived using polygon simplification; a 2D-to-3D mapping obtained by mesh parameterization.
Maps: a height-field (scalar) displacement map and a normal map. Most tools handle both normal and displacement maps.

Displacement mapping: Deriving maps
Ray-casting-based method: given a high-polygon mesh and a parameterization, derive the displacement and normal maps for the low-polygon base mesh. Each texel on the map corresponds to a single position on the low-polygon mesh. From that position, a ray is cast along the interpolated normal and its intersection with the high-polygon mesh is found. The normal of the high-polygon mesh is written to the normal map, and the distance along the ray to the intersection point is written to the displacement map.

Displacement mapping: Rendering
Take sample points and displace them perpendicularly to the macrostructure surface by the distance obtained from the map.
Per-vertex displacement mapping: the sample points are the vertices of the tessellated mesh. On the GPU this uses the vertex shader or geometry shader, and the displaced geometry then goes through the rendering pipeline.
Per-pixel displacement mapping (inverse mapping): the sample points correspond to texel centers, and surface detail is added when color texturing takes place. GPU-based ray tracing in the fragment shader [from 1998].
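The per-vertex variant is one line of vector arithmetic per vertex (a minimal sketch; the function name, tuple layout, and scale parameter are illustrative, and the heights are assumed to be pre-fetched per vertex):

```python
def displace(vertices, normals, heights, scale=1.0):
    """Per-vertex displacement mapping: move each vertex along its normal
    by the sampled height value, scaled by a global factor."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), h in zip(vertices, normals, heights):
        d = scale * h
        out.append((vx + d * nx, vy + d * ny, vz + d * nz))
    return out
```

In a GPU implementation this runs in the vertex or geometry shader, with the height fetched from the displacement map texture instead of being passed in.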

Per-vertex displacement mapping [Displacement Mapping on the GPU: State of the Art]

Per-vertex displacement mapping [Moule/McCool: Efficient Bounded Adaptive Tessellation of Displacement Maps]

Per-vertex displacement mapping [Displacement Mapping on the GPU: State of the Art] [Moule/McCool: Efficient Bounded Adaptive Tessellation of Displacement Maps]

Per-vertex displacement mapping: Problems
The number of vertices can become very high, which defeats the aim of displacement mapping.
GPUs usually have more pixel-processing power than vertex-processing power, and pixel shaders are better equipped to access textures: texture access in the vertex shader is slower than in the fragment shader.
The vertex shader always executes once for every vertex in the model, even for invisible or barely visible parts.

Per-pixel displacement mapping
The vertex shader processes only the base mesh; the surface height map is taken into account when fragments are processed. Ray tracing resolves the visibility problem: a ray is traced into the height field to obtain the texture coordinates of the visible point, which are used to fetch the color and normal vector. For each processed point (u, v, 0) in tangent space, the fragment shader finds the point of the height field actually seen along the ray connecting (u, v, 0) and the pixel center; the ray direction is the tangent-space view vector.
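The height-field ray trace is commonly done as a fixed-step linear search (a sketch under assumed conventions: heights in [0, 1] with the ray entering at the top, and no binary-search refinement, which a real shader would add):

```python
def march_height_field(height_at, u, v, view_dir, steps=32, depth=1.0):
    """Linear search along the tangent-space view ray for the first sample
    at or below the height field; returns the (u, v) of the hit.
    height_at(u, v) returns a height in [0, 1]; view_dir = (dx, dy, dz)
    with dz < 0 pointing into the surface."""
    dx, dy, dz = view_dir
    for i in range(1, steps + 1):
        t = i / steps
        ray_h = 1.0 + t * dz * depth          # ray starts at the top (h = 1)
        su, sv = u + t * dx, v + t * dy       # sample position along the ray
        if ray_h <= height_at(su, sv):        # ray went below the surface
            return su, sv
    return u + dx, v + dy                     # no hit: return the exit point
```

Looking straight down at a flat height field, the search terminates at the original texel, as expected; oblique view rays shift the returned coordinates, which is what produces the parallax effect.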

Per-pixel displacement mapping [Displacement Mapping on the GPU: State of the Art]

Per-pixel displacement mapping [Displacement Mapping on the GPU: State of the Art]

Multitexturing -1
Two or more textures are applied in a single rendering pass; hardware support allows multiple textures to be fetched while processing a single fragment. An object can be drawn once with several textures, some of which are dynamically modified: e.g., the texture map may remain unchanged from frame to frame while a light map is updated by a dynamic light and a fog map changes with the moving camera. The texture-coordinate-to-vertex correspondence can differ for each texture map.
References: Real-Time Rendering, 2nd ed., p. 146; JL Mitchell's page at http://www.pixelmaven.com/jason/

Multitexturing -2
Multitexturing saves rendering passes and allows more complex shading models than applying a single texture per pass.
Example: compute a lighting model with the expression AB + CD, where each variable represents a different color texture's value. This is impossible to evaluate without multitexturing or off-screen rendering (which stores images that can be combined later). A multipass algorithm could combine A and B in two passes and add C in the next, but it could not fold in D, because C would already have been added to AB: there is no place to keep C separate from AB, since only one color can be stored in the frame buffer.
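What a single multitexturing pass computes per texel can be written out directly (a toy sketch; the function name and single-channel color representation are illustrative, with the clamp mimicking frame-buffer saturation):

```python
def combine_ab_plus_cd(a, b, c, d):
    """Single-pass evaluation of the AB + CD shading expression per texel.
    a, b, c, d are lists of color values in [0, 1]; the sum is clamped as a
    frame buffer would. With only one stored color, a multipass renderer
    could form AB but could not keep C separate to multiply it by D."""
    return [min(ai * bi + ci * di, 1.0)
            for ai, bi, ci, di in zip(a, b, c, d)]
```

The point of the example is that both products exist simultaneously inside the texture blending cascade, something a single frame-buffer color cannot provide across passes.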

Multitexturing -3
To combine the results of these texture accesses, a texture blending cascade is defined, made up of a series of texture stages.

Multitexturing -3: Two maps, a color map and a light map; texture map

3D Texture Mapping - 1
The texture is determined by the intersection of the surface with a predefined 3D texture field, obtained by procedural generation: vertex coordinates index a procedure (e.g., a 3D noise function) that defines the 3D texture field at that point.
Advantages:
Successful at simulating turbulence (e.g., marble objects).
Eliminates mapping problems.
Objects of arbitrary complexity can receive a texture in a coherent fashion.
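A 3D noise function of the kind indexed by vertex coordinates can be sketched as value noise over an integer lattice (a toy sketch standing in for Perlin-style noise; the hash constants are illustrative):

```python
import math

def noise3(x, y, z):
    """Toy 3D value noise: hash the eight surrounding lattice points and
    trilinearly interpolate. Returns a deterministic value in [0, 1]."""
    def hash3(ix, iy, iz):
        n = (ix * 73856093) ^ (iy * 19349663) ^ (iz * 83492791)
        return (n % 1024) / 1023.0            # pseudo-random value in [0, 1]
    x0, y0, z0 = math.floor(x), math.floor(y), math.floor(z)
    fx, fy, fz = x - x0, y - y0, z - z0
    val = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((fx if dx else 1 - fx) *
                     (fy if dy else 1 - fy) *
                     (fz if dz else 1 - fz))
                val += w * hash3(x0 + dx, y0 + dy, z0 + dz)
    return val
```

Because the texture value depends only on the 3D point, any surface cutting through the field picks up a coherent pattern with no 2D parameterization needed, which is exactly the advantage the slide lists.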

3D Texture Mapping - 2: 3D texture mapping in object space

3D Texture Mapping - 3: 3D mapping and 3D noise

3D Texture Mapping - 4