Visual Appearance (Shading & Texture Mapping)

1 Visual Appearance (Shading & Texture mapping) Course Note Credit: Some of the slides are extracted from the course notes of Prof. J. Lee (SNU), Prof. M. Desbrun (USC), Prof. H.-W. Shen (OSU), and others.

2 Shading (Local illumination)

3 Photorealism in Computer Graphics  Photorealism in computer graphics involves  Accurate representations of surface properties, and  Good physical descriptions of the lighting effects  Modeling the lighting effects that we see on an object is a complex process, involving principles of both physics and psychology  Physical illumination models involve  Material properties, object position relative to light sources and other objects, the features of the light sources, and so on

4 Illumination and Rendering  An illumination model in computer graphics  also called a lighting model or a shading model  used to calculate the color of an illuminated position on the surface of an object  Approximations of the physical laws  A surface-rendering method determines the pixel colors for all projected positions in a scene

5 Light Sources  Point light sources  Emitting radiant energy at a single point  Specified with its position and the color of the emitted light  Infinitely distant light sources  A large light source, such as the sun, that is very far from a scene  Little variation in its directional effects  Specified with its color value and a fixed direction for the light rays

6 Light Sources  Directional light sources  Produces a directional beam of light  Spotlight effects  Area light sources

7 Light Sources  Radial intensity attenuation  As radiant energy travels, its amplitude is attenuated by the factor 1/d^2, where d is the distance from the source  In practice, more realistic attenuation effects can be obtained with a generalized inverse quadratic function of distance, f(d) = 1/(a0 + a1*d + a2*d^2)  The intensity attenuation is not applied to light sources at infinity because all points in the scene are at a nearly equal distance from a far-off source
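The attenuation choices above can be sketched in C. This is an illustrative helper, not code from the slides; the coefficients a0, a1, a2 are scene-tuned values:

```c
#include <math.h>

/* Generalized inverse-quadratic radial attenuation:
   f(d) = 1 / (a0 + a1*d + a2*d^2).
   With a0 = a1 = 0 and a2 = 1 this reduces to the pure 1/d^2 falloff. */
double radial_atten(double a0, double a1, double a2, double d) {
    return 1.0 / (a0 + a1 * d + a2 * d * d);
}
```

A nonzero a0 keeps the factor bounded for lights very close to the surface, which is the usual reason for preferring the generalized form over plain 1/d^2.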

8 Light Sources  Angular intensity attenuation  For a directional light, we can attenuate the light intensity angularly as well as radially

9 Surface Lighting Effects  An illumination model computes the lighting effects for a surface using the various optical properties  Degree of transparency, color reflectance, surface texture  The reflection (Phong illumination) model describes the way incident light reflects from an opaque surface  Diffuse, ambient, specular reflections  Simple approximation of actual physical models

10 Diffuse Reflection  Incident light is scattered with equal intensity in all directions  Such surfaces are called ideal diffuse reflectors (also referred to as Lambertian reflectors)

11 Lambert’s Cosine Law  The reflected luminous intensity in any direction from a perfectly diffusing surface varies as the cosine of the angle between the direction of incident light and the normal vector of the surface.  Intuitively: cross-sectional area of the “beam” intersecting an element of surface area is smaller for greater angles with the normal.

12 Diffuse Reflection  I_diff = k_d I_l (N · L) = k_d I_l cos θ  I_l: the intensity of the light source  k_d: diffuse reflection coefficient, 0 ≤ k_d ≤ 1  N: the surface normal (unit vector)  L: the direction of the light source (unit vector)  Ideally diffuse surfaces obey the cosine law and are often called Lambertian surfaces.
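The diffuse term can be sketched in a few lines of C (an illustrative helper, assuming N and L are already unit vectors):

```c
#include <math.h>

/* Lambertian diffuse term I_diff = k_d * I_l * max(0, N.L).
   The max() clamp keeps surfaces facing away from the light unlit. */
static double dot3(const double a[3], const double b[3]) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

double diffuse_intensity(double kd, double Il,
                         const double N[3], const double L[3]) {
    double ndotl = dot3(N, L);
    if (ndotl < 0.0) ndotl = 0.0;   /* light is behind the surface */
    return kd * Il * ndotl;
}
```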

13 Ambient Light  Multiple reflections from nearby (light-reflecting) objects yield a uniform illumination  A form of diffuse reflection independent of the viewing direction and the spatial orientation of a surface  Ambient illumination is constant for an object: I_amb = k_a I_a  I_a: the incident ambient intensity  k_a: ambient reflection coefficient, the proportion reflected away from the surface

14 Ambient + Diffuse

15 Specular Reflection  A perfect reflector (mirror) reflects all light in the direction where the angle of reflection equals the angle of incidence  Specular reflection accounts for the highlight

16 Specular Reflection  Phong specular-reflection model: I_spec = k_s I_l (V · R)^n = k_s I_l cos^n φ  Note that N, L, and R are coplanar, but V may not be coplanar to the others  I_l: intensity of the incident light  k_s: color-independent specular coefficient  n: the gloss of the surface

17 Specular Reflection  Glossiness of surfaces

18 Specular Reflection  Specular-reflection coefficient k_s is a material property  For some materials, k_s varies depending on the angle of incidence θ  k_s = 1 if θ = 90°  Calculating the reflection vector R: R = 2 (N · L) N - L

19 Specular Reflection  Simplified Phong model using the halfway vector H = (L + V) / |L + V|, replacing (V · R)^n with (N · H)^n  H is constant if both the viewer and the light source are sufficiently far from the surface

20 Ambient+Diffuse+Specular Reflections  Single light source: I = k_a I_a + I_l [k_d (N · L) + k_s (V · R)^n]  Multiple light sources: I = k_a I_a + Σ_i I_l,i [k_d (N · L_i) + k_s (V · R_i)^n]

21 Parameter Choosing Tips  For an RGB color description, each intensity and reflectance specification is a three-element vector  The sum of the reflectance coefficients is usually smaller than one  Try n in the range [0, 100]  Use a small k_a (~0.1)  Example  Metal: n=90, k_a=0.1, k_d=0.2, k_s=0.5


23 Atmospheric Effects  A hazy atmosphere makes colors fade and objects appear dimmer  The hazy-atmosphere effect is often simulated with an exponential attenuation function such as f(d) = e^(-ρd)  Higher values for ρ produce a denser atmosphere

24 Polygon Rendering Methods  We could use an illumination model to determine the surface intensity at every projected pixel position  Or, we could apply the illumination model to a few selected points and approximate the intensity at the other surface positions  Curved surfaces are often approximated by polygonal surfaces  So, polygonal (piecewise planar) surfaces often need to be rendered as if they are smooth

25 Constant-Intensity Surface Rendering  Constant (flat) shading  Appropriate when each polygon is one face of a polyhedron and is not a section of a curved-surface approximation mesh

26 Intensity-Interpolation Surface Rendering  Gouraud shading  Rendering a curved surface that is approximated with a polygon mesh  Interpolate intensities at polygon vertices  Procedure 1. Determine the average unit normal vector at each vertex 2. Apply an illumination model at each polygon vertex to obtain the light intensity at that position 3. Linearly interpolate the vertex intensities over the projected area of the polygon

27 Intensity-Interpolation Surface Rendering  Normal vectors at vertices  Averaging the normal vectors of each polygon sharing that vertex: N_v = (N_1 + N_2 + … + N_n) / |N_1 + N_2 + … + N_n|

28 Intensity-Interpolation Surface Rendering  Intensity interpolation along scan lines  I_4 and I_5 are interpolated along the y-axis (the polygon edges), and then I_p is interpolated along the x-axis (the scan line)  Incremental calculation is also possible
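The two interpolation steps can be sketched as plain linear interpolation (illustrative names; I_4 and I_5 are the edge intensities at the current scan line):

```c
/* Gouraud scan-line interpolation: first along polygon edges (y),
   then along the scan line (x). */
double lerp(double a, double b, double t) { return a + t * (b - a); }

/* Edge step: intensity at scan line ys on the edge (y1,I1)-(y2,I2). */
double edge_intensity(double I1, double y1, double I2, double y2, double ys) {
    return lerp(I1, I2, (ys - y1) / (y2 - y1));
}

/* Scan-line step: intensity at xp between the edge intensities
   I4 (at x4) and I5 (at x5). */
double scanline_intensity(double I4, double x4,
                          double I5, double x5, double xp) {
    return lerp(I4, I5, (xp - x4) / (x5 - x4));
}
```

Because t advances by a constant amount per pixel, both steps can be turned into the incremental (add-a-delta) form the slide mentions.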

29 Intensity-Interpolation Surface Rendering

30 Gouraud Shading Problems  Lighting in the polygon interior is inaccurate  Mach band effect  Optical illusion

31 Mach Band Effect  These "Mach bands" are not physically there; they are illusions due to excitation and inhibition in our neural processing  The bright bands at 45 degrees (and 135 degrees) are illusory  The intensity of each square is the same

32 Mach Band Artifact (Gouraud Shading)  Mach bands:  Caused by interaction of neighboring retinal neurons  Act as a sort of high-pass filter, accentuating discontinuities in the first derivative  Linear interpolation causes first-derivative discontinuities at polygon edges

33 Normal-Vector Interpolation for Surface Rendering  Phong shading  Interpolate normal vectors at polygon vertices  Procedure 1. Determine the average unit normal vector at each vertex 2. Linearly interpolate the vertex normals over the projected area of the polygon 3. Apply an illumination model at positions along scan lines to calculate pixel intensities

34 Gouraud versus Phong Shading  Gouraud shading is faster than Phong shading  OpenGL supports Gouraud shading  Phong shading is more accurate

35 Texture Mapping

36 Can you do this …

37 Visual Realism

38 Texture Mapping  Surfaces in real world are very complex  Objects have properties that vary across surface  Cannot model all the fine variations  We need to find ways to add surface detail  How?

39 Texture Mapping  Of course, one can model the exact micro-geometry + material property to control the look and feel of a surface  But it may get extremely costly  So graphics uses a more practical approach – texture mapping

40 Texture Mapping  Texture Mapping  Want a function that assigns a color to each point  The surface is a 2D domain, so that function is essentially an image  Can represent it using any image representation  Raster texture images are very popular

41 Photo-textures

42 OpenGL functions - demo  Usually a texture is a (2D/3D) image.  During initialization, read in or create the texture image and place it into the OpenGL state. glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB, imageWidth, imageHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, imageData);  Before rendering your textured object, enable texture mapping and tell the system to use this particular texture. glEnable (GL_TEXTURE_2D); glBindTexture (GL_TEXTURE_2D, 13);

43 OpenGL functions  During rendering, give the cartesian coordinates and the texture coordinates for each vertex. glBegin (GL_QUADS); glTexCoord2f (0.0, 0.0); glVertex3f (0.0, 0.0, 0.0); glTexCoord2f (1.0, 0.0); glVertex3f (10.0, 0.0, 0.0); glTexCoord2f (1.0, 1.0); glVertex3f (10.0, 10.0, 0.0); glTexCoord2f (0.0, 1.0); glVertex3f (0.0, 10.0, 0.0); glEnd ();

44 Texture and Texel  Each pixel in a texture map is called a Texel  Each Texel is associated with a (u,v) 2D texture coordinate  The range of u, v is [0.0,1.0] due to normalization

45 (u,v) tuple  For any (u,v) in [0,1] x [0,1], multiplying by the texture image width and height gives the corresponding value in the texture map

46 How do we get F(u,v)?  We are given a discrete set of values:  F[i,j] for i = 0,…,N, j = 0,…,M  Nearest neighbor:  F(u,v) = F[ round(N*u), round(M*v) ]  Linear interpolation:  i = floor(N*u), j = floor(M*v)  interpolate from F[i,j], F[i+1,j], F[i,j+1], F[i+1,j+1]  Filtering in general!

47 Interpolation  Nearest neighbor vs. linear interpolation

48 Specifying texture coordinates  Texture coordinates needed at every vertex  Hard to specify by hand  Difficult to wrap a 2D texture around a 3D object

49 Texture Filtering  Resampling using mip mapping  Magnification (one texel covers many pixels): interpolation  Minification (many texels per pixel): averaging  We would like a constant cost per pixel

50 Mip Mapping  MIP = Multum in Parvo = many things in a small place  Constructs an image pyramid: each level is a prefiltered version of the level below, resampled at half the frequency  While rasterizing, use the level with the sampling rate closest to the desired sampling rate.

51 Mip Mapping  Used with bilinear/trilinear interpolation  (Figure: the RGB mip-map pyramid layout; trilinear interpolation blends between levels)

52 Mip Mapping - Example (courtesy of John Hart)  Mip mapping is used to save some of the filtering work needed during texture minification

53 Texture Filtering Methods  Nearest neighbor interpolation  Bilinear interpolation  Trilinear interpolation  Anisotropic filtering  Precomputed rectangular (trapezoidal) maps

54 Example  No mip mapping vs. with mip mapping

55 Example  Bilinear mip mapping vs. trilinear mip mapping

56 Specifying texture coordinates  Texture coordinates needed at every vertex  Hard to specify by hand  Difficult to wrap a 2D texture around a 3D object

57 Planar mapping  Compute texture coordinates at each vertex by projecting the map coordinates onto the model

58 Cylindrical Mapping

59 Spherical Mapping

60 Cube Mapping

61 Modelling Surface Properties  Can use a texture to supply any parameter of the illumination model  Ambient, diffuse, and specular color  Specular exponent  Roughness

62 Environment Maps  Use texture to represent reflected color  Texture indexed by reflection vector  Approximation works when objects are far away from the reflective object

63 Environment Maps  Using a spherical environment map  Spatially variant resolution

64 Environment Maps  Using a cubical environment map

65 Environment Mapping  Environment mapping produces reflections on shiny objects  Texture is transferred in the direction of the reflected ray from the environment map onto the object  Reflected ray: R = 2 (N · V) N - V  (Figure: object, viewer, reflected ray, environment map)

66 Approximations Made  The map should contain a view of the world with the point of interest on the object as the eye  We can’t store a separate map for each point, so one map is used with the eye at the center of the object  Introduces distortions in the reflection, but the eye doesn’t notice  Distortions are minimized for a small object in a large room  The mapping can be computed at each pixel, or only at the vertices

67 Example

68 Refraction Maps  Use texture to represent refraction

69 Opacity Maps  Use texture to represent opacity  (Figure: RGB channels and alpha channel)  Use the alpha channel to make portions of the texture transparent  Cheaper than explicit modelling

70 Illumination Maps  Use texture to represent an illumination footprint

71 Illumination Maps  Quake light maps

72 Bump Mapping  Use texture to perturb normals  Textures are treated as a height field  Creates a bump-like effect  (Figure: original surface + bump map = modified surface)  Does not change silhouette edges

73 Bump mapping

74 Normal Mapping (bump mapping)  Replace normals

75 Bump Mapping  Many textures are the result of small perturbations in the surface geometry  Modeling these changes would result in an explosion in the number of geometric primitives.  Bump mapping attempts to alter the lighting across a polygon to provide the illusion of texture.

76 Displacement Mapping  Use texture to displace the surface geometry  (Figure: original surface + displacement map = displaced surface)  Bump mapping only affects the normals; displacement mapping changes the entire surface (including the silhouette)

77 Displacement Mapping

78 3D Textures  Use a 3D mapping  Can simulate an object carved from a material, e.g., marble or wood  The object is "carved" out of the solid texture

79 Texture Synthesis

80 Transparency  Alpha Blending: C_o = α_a C_a + (1 - α_a) C_b  a: foreground object, b: background object, o: output  α: opacity, C: color
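Per color channel, the blend is a single weighted sum (an illustrative sketch of the "over" operator, assuming an opaque background):

```c
/* "Over" compositing of a foreground sample (alpha_a, Ca) onto an
   opaque background color Cb, applied per channel:
   Co = alpha_a * Ca + (1 - alpha_a) * Cb. */
double blend_over(double alpha_a, double Ca, double Cb) {
    return alpha_a * Ca + (1.0 - alpha_a) * Cb;
}
```

With alpha_a = 1 the foreground fully covers the background; with alpha_a = 0 it is invisible.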
