Chapter XIV Normal Mapping


Bumpy Surfaces A bumpy surface can be rendered with a high-frequency polygon mesh. Unfortunately, it is not cheap to process such a high-resolution mesh.

Bumpy Surfaces (cont’d) In contrast, a low-resolution mesh such as a quad is cheap to process. However, it does not properly expose the bumpy features.

Normal Mapping A way out of this dilemma is to pre-compute the normals of the high-frequency surface and store them in a special texture named the normal map. At run time, a lower-resolution mesh, which we call the base surface, is rendered, and the normals needed for lighting are fetched from the normal map. (Figures: a normal map; the normal-mapped base surface)

Height Field A popular method to represent a high-frequency surface is a height field. It is a function h(x,y) that returns a height, or z value, for given (x,y) coordinates. The height field is sampled with a 2D array of regularly spaced (x,y) coordinates, and the height values are stored in a texture named the height map. The height map can be drawn in gray scale. If the height is in the range [0,255], the lowest height, 0, is colored black, and the highest, 255, is colored white.
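The sampling described above can be sketched in Python/NumPy. This is a minimal illustration only; the bump function and the 64x64 resolution are hypothetical choices, not taken from the slides.

```python
import numpy as np

def sample_height_map(h, size):
    """Sample the height field h(x, y) over a regular grid and
    quantize the heights to the gray-scale range [0, 255]."""
    xs = np.linspace(0.0, 1.0, size)
    ys = np.linspace(0.0, 1.0, size)
    grid = np.array([[h(x, y) for x in xs] for y in ys])
    # Normalize: lowest height -> 0 (black), highest -> 255 (white).
    grid = (grid - grid.min()) / (grid.max() - grid.min())
    return np.round(grid * 255).astype(np.uint8)

# Example height function: a single smooth bump in the center.
bump = lambda x, y: np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.05)
height_map = sample_height_map(bump, 64)
```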

Normal Map Simple image-editing operations can create a gray-scale height map (b) from an image texture (a). The next step, from (b) to the normal map (c), is done automatically.

Normal Map (cont’d) Creation of a normal map from a height map Visualization of a normal map
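The automatic step, creating a normal map from a height map, can be sketched as follows. Treating the height field as a surface z = h(x, y), an unnormalized normal is (-dh/dx, -dh/dy, 1); the derivatives are approximated with central differences, and each normalized component in [-1, 1] is encoded into an RGB channel in [0, 255]. The wrap-around `np.roll` boundary handling and the `scale` parameter are simplifying assumptions of this sketch.

```python
import numpy as np

def height_to_normal_map(height, scale=1.0):
    """Compute per-texel normals from a height map via central
    differences and encode them as RGB, as in a stored normal map."""
    h = height.astype(np.float32)
    dhdx = (np.roll(h, -1, axis=1) - np.roll(h, 1, axis=1)) * 0.5 * scale
    dhdy = (np.roll(h, -1, axis=0) - np.roll(h, 1, axis=0)) * 0.5 * scale
    n = np.dstack([-dhdx, -dhdy, np.ones_like(h)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    # Map components from [-1, 1] into the texel range [0, 255].
    return np.round((n * 0.5 + 0.5) * 255).astype(np.uint8)

flat = np.zeros((8, 8))
nm = height_to_normal_map(flat)
# A flat height field yields the unperturbed normal (0, 0, 1) everywhere,
# which encodes to the bluish texel (128, 128, 255) -- this is why
# visualized normal maps look predominantly blue.
```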

Normal Mapping (cont’d) The polygon mesh is rasterized, and the texture coordinates (s,t) are used to access the normal map. The normal at (s,t) is obtained by filtering the normal map. Consider the diffuse reflection term, max(n·l, 0)s_d⊗m_d. The normal n is fetched from the normal map, and the diffuse reflectance m_d is fetched from the image texture.
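The diffuse term can be sketched numerically. The specific colors below are hypothetical stand-ins for the values fetched from the textures; s_d is the light source's diffuse color and m_d the material's diffuse reflectance, modulated component-wise (⊗).

```python
import numpy as np

def diffuse(n, l, s_d, m_d):
    """Diffuse reflection term max(n.l, 0) * s_d (x) m_d,
    where (x) is component-wise color modulation."""
    n = n / np.linalg.norm(n)
    l = l / np.linalg.norm(l)
    return max(np.dot(n, l), 0.0) * np.asarray(s_d) * np.asarray(m_d)

# Hypothetical values: normal fetched from the normal map, white light,
# reddish diffuse reflectance fetched from the image texture.
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.0, 0.0, 1.0])   # light direction along the normal
color = diffuse(n, l, s_d=(1.0, 1.0, 1.0), m_d=(0.8, 0.2, 0.2))
```

Note the max(..., 0) clamp: a light behind the surface (negative n·l) contributes nothing rather than subtracting color.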

Normal Mapping (cont’d) Vertex shader

Normal Mapping (cont’d) Fragment shader

Normal Mapping (cont’d) Fragment shader

Normal Mapping (cont’d)

Tangent-space Normal Mapping Recall that texturing is described as wrapping a texture onto an object surface. A normal map should likewise be applicable to various surfaces. For this, we need an additional process that is not performed during ordinary image texturing.

Tangent-space Normal Mapping (cont’d) For a surface point p, consider a tangent space defined by three orthonormal vectors: T (tangent), B (bitangent), and N (normal). The tangent spaces vary across the object surface. The normal fetched from the normal map using p's texture coordinates (s_p,t_p) is denoted n(s_p,t_p). Without normal mapping, N_p would be used for lighting. In normal mapping, however, n(s_p,t_p) replaces N_p, which is (0,0,1) in the tangent space of p. This implies that n(s_p,t_p), as a ‘perturbed’ instance of (0,0,1), can be considered to be defined in the tangent space of p. Whatever surface point is normal-mapped, the normal fetched from the normal map is considered to be defined in the tangent space of that point.
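Decoding a fetched texel back into such a perturbed tangent-space normal is the inverse of the RGB encoding, as sketched below. The texel value is a hypothetical example.

```python
import numpy as np

def decode_normal(rgb):
    """Decode a normal-map texel: RGB in [0, 255] -> vector in [-1, 1].
    The unperturbed tangent-space normal (0, 0, 1) corresponds to the
    typical bluish texel (128, 128, 255)."""
    n = np.asarray(rgb, dtype=np.float32) / 255.0 * 2.0 - 1.0
    return n / np.linalg.norm(n)

n = decode_normal((128, 128, 255))   # an (almost) unperturbed normal
```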

Tangent Space Consider dot(normal, light) in the fragment shader presented earlier. In general it does not work, because normal is a tangent-space vector whereas light is a world-space vector, and two vectors defined in different spaces may not be combined through a dot product. In the example, dot(normal, light) worked only because the basis of the world space happened to be identical to that of the tangent space at every point of the base surface, the quad.

Tangent Space (cont’d) The basis of the tangent space is {T,B,N}. The vertex normal N is defined per vertex at the modeling stage. The tangent T and bitangent B need to be computed. There are many utility functions that compute T and B at each vertex of a polygon mesh.
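The per-face derivation underlying such utilities can be sketched as follows: the tangent and bitangent are the directions in which the texture coordinates s and t increase across a triangle, solved from its edge vectors and UV deltas. (Per-vertex utilities then typically average and orthonormalize these per-face results over adjacent triangles; that step is omitted here.)

```python
import numpy as np

def triangle_tbn(p0, p1, p2, uv0, uv1, uv2):
    """Compute T, B, and the face normal N for one triangle from its
    vertex positions and texture coordinates."""
    e1, e2 = np.subtract(p1, p0), np.subtract(p2, p0)
    du1, dv1 = np.subtract(uv1, uv0)
    du2, dv2 = np.subtract(uv2, uv0)
    det = du1 * dv2 - du2 * dv1          # assumed non-degenerate UVs
    t = (dv2 * e1 - dv1 * e2) / det      # direction of increasing s
    b = (-du2 * e1 + du1 * e2) / det     # direction of increasing t
    n = np.cross(e1, e2)
    return (t / np.linalg.norm(t),
            b / np.linalg.norm(b),
            n / np.linalg.norm(n))

# A quad's lower-left triangle in the xy-plane with standard UVs:
t, b, n = triangle_tbn((0, 0, 0), (1, 0, 0), (0, 1, 0),
                       (0, 0), (1, 0), (0, 1))
```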

Tangent-space Normal Mapping (cont’d) Consider the diffuse term of the Phong lighting model. A light source is defined in the world space, and so is l. In contrast, n fetched from the normal map is defined in the tangent space. To resolve this inconsistency, either n has to be transformed into the world space, or l has to be transformed into the tangent space. Typically, the per-vertex TBN basis is pre-computed, stored in the vertex array, and passed to the vertex shader. The vertex shader first transforms T, B, and N into the world space and then constructs a matrix that rotates the world-space light vector into the per-vertex tangent space.
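That rotation can be sketched numerically: with the world-space T, B, and N as the rows of a matrix, multiplying a world-space vector by it yields its tangent-space coordinates (each component is a dot product with one basis vector). The basis below is a hypothetical example.

```python
import numpy as np

def world_to_tangent(v, T, B, N):
    """Rotate a world-space vector into the tangent space whose
    orthonormal basis {T, B, N} is expressed in world space."""
    tbn = np.array([T, B, N])            # rows: world-space T, B, N
    return tbn @ np.asarray(v, dtype=np.float32)

# Hypothetical basis: tangent space aligned with the world axes,
# so the light vector is unchanged by the transform.
T, B, N = np.eye(3)
light_ts = world_to_tangent([0.0, 0.0, 1.0], T, B, N)
# With light_ts in hand, the fragment shader can evaluate the diffuse
# term directly against the fetched tangent-space normal.
```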

Tangent-space Normal Mapping (cont’d) Observe that normal is used for defining the transform into the tangent space but is not itself output by the vertex shader.

Tangent-space Normal Mapping (cont’d)

Tangent-space Normal Mapping (cont’d)

Authoring Normal Maps Sculpting in ZBrush The original low-resolution model in (a) is taken as the base surface, and the high-frequency model in (e) is taken as the reference surface.

Authoring Normal Maps (cont’d) Normal map creation (in ZBrush)

Authoring Normal Maps (cont’d) Normal map creation (in ZBrush)