Visual Appearance (Shading & Texture Mapping): Course Note
Credit: Some slides are extracted from the course notes of Prof. J. Lee (SNU), Prof. M. Desbrun (USC), Prof. H.-W. Shen (OSU), and others.
Photorealism in Computer Graphics
- Photorealism in computer graphics involves accurate representations of surface properties and good physical descriptions of the lighting effects.
- Modeling the lighting effects that we see on an object is a complex process, involving principles of both physics and psychology.
- Physical illumination models involve material properties, object position relative to light sources and other objects, the features of the light sources, and so on.
Illumination and Rendering
- An illumination model in computer graphics (also called a lighting model or a shading model) is used to calculate the color of an illuminated position on the surface of an object; it is an approximation of the physical laws.
- A surface-rendering method determines the pixel colors for all projected positions in a scene.
Light Sources
- Point light source: emits radiant energy from a single point; specified by its position and the color of the emitted light.
- Infinitely distant light source: a large light source, such as the sun, that is very far from the scene, so there is little variation in its directional effects; specified by its color value and a fixed direction for the light rays.
Light Sources (continued)
- Directional light source: produces a directional beam of light (spotlight effects).
- Area light source: emits light from a finite surface area rather than a single point.
Light Sources: Radial Intensity Attenuation
- As radiant energy travels away from a point source, its amplitude at distance d is attenuated by the factor 1/d^2.
- In practice, more realistic attenuation effects are often obtained with an inverse quadratic function of distance, f_radatten(d) = min(1, 1/(a0 + a1*d + a2*d^2)), where the coefficients a0, a1, a2 can be adjusted per light source.
- Intensity attenuation is not applied to light sources at infinity, because all points in the scene are at a nearly equal distance from a far-off source.
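The clamped inverse-quadratic attenuation above can be sketched as a small helper. This is a minimal illustration, assuming the min(1, ...) form of the attenuation function; the function name and coefficient values are ours, not from any particular API.

```c
#include <assert.h>
#include <math.h>

/* Inverse-quadratic radial attenuation, clamped to 1 near the source.
 * a0, a1, a2 are user-chosen per-light coefficients. */
double rad_atten(double d, double a0, double a1, double a2)
{
    double f = 1.0 / (a0 + a1 * d + a2 * d * d);
    return f < 1.0 ? f : 1.0;   /* never brighten beyond the source intensity */
}
```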
Light Sources: Angular Intensity Attenuation
- For a directional (spot) light, we can attenuate the light intensity angularly as well as radially, e.g. by a factor of cos^a(phi), where phi is the angle from the spotlight axis.
Surface Lighting Effects
- An illumination model computes the lighting effects for a surface using the various optical properties: degree of transparency, color reflectance, surface texture.
- The reflection (Phong illumination) model describes the way incident light reflects from an opaque surface: diffuse, ambient, and specular reflections.
- It is a simple approximation of actual physical models.
Diffuse Reflection
- Incident light is scattered with equal intensity in all directions.
- Such surfaces are called ideal diffuse reflectors (also referred to as Lambertian reflectors).
Lambert's Cosine Law
- The reflected luminous intensity in any direction from a perfectly diffusing surface varies as the cosine of the angle between the direction of incident light and the normal vector of the surface.
- Intuitively: the cross-sectional area of the "beam" intersecting an element of surface area is smaller for greater angles with the normal.
Diffuse Reflection
- Ideally diffuse surfaces obey the cosine law; they are often called Lambertian surfaces.
- The diffuse intensity is I_diff = k_d * I_l * (N . L), where
  - I_l: the intensity of the light source
  - k_d: diffuse reflection coefficient
  - N: the surface normal (unit vector)
  - L: the direction of the light source (unit vector)
Ambient Light
- Multiple reflections from nearby (light-reflecting) objects yield a uniform illumination.
- A form of diffuse reflection, independent of the viewing direction and the spatial orientation of the surface.
- Ambient illumination is constant for an object: I_amb = k_a * I_a, where
  - I_a: the incident ambient intensity
  - k_a: ambient reflection coefficient, the proportion reflected away from the surface
Specular Reflection
- A perfect reflector (mirror) reflects all light in the direction where the angle of reflection is identical to the angle of incidence.
- Specular reflection accounts for the highlight.
Specular Reflection: Phong Model
- Phong specular-reflection model: I_spec = k_s * I_l * (V . R)^n_s, where
  - I_l: intensity of the incident light
  - k_s: color-independent specular coefficient
  - n_s: specular-reflection exponent (the gloss of the surface)
- Note that N, L, and R are coplanar, but V may not be coplanar with the others.
Specular Reflection (continued)
- The specular-reflection coefficient k_s is a material property.
- For some materials, k_s varies depending on the angle of incidence theta; k_s = 1 if theta = 90 degrees.
- Calculating the reflection vector: R = (2 N . L) N - L.
Specular Reflection: Halfway Vector
- Simplified Phong model using the halfway vector H = (L + V) / |L + V|: I_spec = k_s * I_l * (N . H)^n_s.
- H is constant if both the viewer and the light source are sufficiently far from the surface.
Ambient + Diffuse + Specular Reflections
- Single light source: I = k_a * I_a + I_l * [ k_d (N . L) + k_s (V . R)^n_s ]
- Multiple light sources: I = k_a * I_a + sum over lights i of I_li * [ k_d (N . L_i) + k_s (V . R_i)^n_s ]
Parameter-Choosing Tips
- For an RGB color description, each intensity and reflectance specification is a three-element vector.
- The sum of the reflectance coefficients is usually smaller than one.
- Try the specular exponent n_s in the range [0, 100].
- Use a small k_a (about 0.1).
- Example for metal: n_s = 90, k_a = 0.1, k_d = 0.2, k_s = 0.5.
Atmospheric Effects
- A hazy atmosphere makes colors fade and objects appear dimmer.
- The hazy-atmosphere effect is often simulated with an exponential attenuation function of distance d, such as f_atmo(d) = e^(-rho*d).
- Higher values of rho produce a denser atmosphere.
Polygon Rendering Methods
- We could use an illumination model to determine the surface intensity at every projected pixel position.
- Or, we could apply the illumination model to a few selected points and approximate the intensity at the other surface positions.
- Curved surfaces are often approximated by polygonal surfaces, so polygonal (piecewise planar) surfaces often need to be rendered as if they are smooth.
Constant-Intensity Surface Rendering
- Constant (flat) shading assigns a single intensity to each polygon.
- This is accurate when each polygon is one face of a polyhedron and is not a section of a curved-surface approximation mesh.
Intensity-Interpolation Surface Rendering
- Gouraud shading: rendering a curved surface that is approximated with a polygon mesh by interpolating intensities at polygon vertices.
- Procedure:
  1. Determine the average unit normal vector at each vertex.
  2. Apply an illumination model at each polygon vertex to obtain the light intensity at that position.
  3. Linearly interpolate the vertex intensities over the projected area of the polygon.
Intensity-Interpolation Surface Rendering (continued)
- Normal vectors at vertices: average the normal vectors of all polygons sharing that vertex, e.g. N_v = (N_1 + N_2 + N_3 + N_4) / |N_1 + N_2 + N_3 + N_4|.
Intensity-Interpolation Surface Rendering (continued)
- Intensity interpolation along scan lines: the edge intensities I_4 and I_5 are first interpolated along the y-axis from the vertex intensities, and then the interior intensity I_p is interpolated along the x-axis between I_4 and I_5.
- Incremental calculation along edges and scan lines is also possible.
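The per-scan-line step above is ordinary linear interpolation. A minimal sketch, with our own illustrative function names:

```c
#include <assert.h>

/* Linear blend between two values: a at t=0, b at t=1. */
double lerp(double a, double b, double t) { return a + (b - a) * t; }

/* Intensity at x on a scan line whose left and right edge intersections
 * carry intensities Il at xl and Ir at xr (Gouraud interior step). */
double scanline_intensity(double x, double xl, double Il, double xr, double Ir)
{
    return lerp(Il, Ir, (x - xl) / (xr - xl));
}
```

The same `lerp` applied along the polygon edges (in y) produces the edge intensities fed into this function; an incremental version would add a constant delta per pixel instead of recomputing the blend.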
Gouraud Shading Problems
- Lighting in the polygon interior is inaccurate (e.g. a highlight that falls inside a polygon can be missed).
- Mach band effect: an optical illusion at polygon edges.
Mach Band Effect
- These "Mach bands" are not physically there; they are illusions due to excitation and inhibition in our neural processing.
- The bright bands at 45 degrees (and 135 degrees) are illusory.
- The intensity of each square is the same.
Mach Band Artifact (Gouraud Shading)
- Mach bands are caused by the interaction of neighboring retinal neurons.
- This acts as a sort of high-pass filter, accentuating discontinuities in the first derivative of intensity.
- Linear interpolation causes first-derivative discontinuities at polygon edges.
Normal-Vector Interpolation for Surface Rendering
- Phong shading: interpolate normal vectors at polygon vertices.
- Procedure:
  1. Determine the average unit normal vector at each vertex.
  2. Linearly interpolate the vertex normals over the projected area of the polygon.
  3. Apply an illumination model at positions along scan lines to calculate pixel intensities.
Gouraud versus Phong Shading
- Gouraud shading is faster than Phong shading.
- OpenGL (fixed-function) supports Gouraud shading.
- Phong shading is more accurate.
Texture Mapping
- Surfaces in the real world are very complex: objects have properties that vary across the surface.
- We cannot model all the fine variations, so we need to find ways to add surface detail. How?
Texture Mapping
- Of course, one can model the exact micro-geometry and material properties to control the look and feel of a surface, but it may get extremely costly.
- So graphics uses a more practical approach: texture mapping.
Texture Mapping
- We want a function that assigns a color to each point on the surface.
- The surface is a 2D domain, so that function is essentially an image, and it can be represented using any image representation.
- Raster texture images are very popular.
OpenGL Functions (demo)
- Usually a texture is a (2D/3D) image.
- During initialization, read in or create the texture image and place it into the OpenGL state:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, imageWidth, imageHeight,
                 0, GL_RGB, GL_UNSIGNED_BYTE, imageData);

- Before rendering your textured object, enable texture mapping and tell the system to use this particular texture:

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, 13);
OpenGL Functions (continued)
- During rendering, give the Cartesian coordinates and the texture coordinates for each vertex:

    glBegin(GL_QUADS);
      glTexCoord2f(0.0, 0.0); glVertex3f( 0.0,  0.0, 0.0);
      glTexCoord2f(1.0, 0.0); glVertex3f(10.0,  0.0, 0.0);
      glTexCoord2f(1.0, 1.0); glVertex3f(10.0, 10.0, 0.0);
      glTexCoord2f(0.0, 1.0); glVertex3f( 0.0, 10.0, 0.0);
    glEnd();
Texture and Texel
- Each pixel in a texture map is called a texel.
- Each texel is associated with a 2D texture coordinate (u, v).
- The range of u and v is [0.0, 1.0] due to normalization.
The (u, v) Tuple
- For any (u, v) in the range (0-1, 0-1), multiplying by the texture image width and height gives the corresponding location in the texture map.
How Do We Get F(u, v)?
- We are given a discrete set of values F[i, j] for i = 0, ..., N and j = 0, ..., M.
- Nearest neighbor: F(u, v) = F[round(N*u), round(M*v)].
- Linear interpolation: let i = floor(N*u) and j = floor(M*v); interpolate from F[i, j], F[i+1, j], F[i, j+1], and F[i+1, j+1].
- This is filtering in general!
Specifying Texture Coordinates
- Texture coordinates are needed at every vertex.
- They are hard to specify by hand: it is difficult to wrap a 2D texture around a 3D object.
Texture Filtering
- Resampling using mip mapping.
- Magnification (one texel covers many pixels): interpolation.
- Minification (many texels map to one pixel): averaging.
- We would like a constant cost per pixel.
Mip Mapping
- MIP = "multum in parvo" = many things in a small place.
- Constructs an image pyramid: each level is a prefiltered version of the level below, resampled at half the frequency.
- While rasterizing, use the level with the sampling rate closest to the desired sampling rate.
Mip Mapping: Trilinear Interpolation
- Mip mapping is used with bilinear/trilinear interpolation.
- Trilinear interpolation: bilinearly interpolate within each of the two nearest mip levels, then linearly interpolate between the two results.
Mip Mapping: Example
- (Image courtesy of John Hart.)
- Mip mapping is used to save some of the filtering work needed during texture minification.
Environment Mapping
- Environment mapping produces reflections on shiny objects.
- Texture is transferred onto the object from the environment map in the direction of the reflected ray.
- Reflected ray: R = 2(N . V)N - V.
- (Figure: viewer, object, reflected ray, environment map.)
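The reflected-ray formula above is a one-liner in code. A minimal sketch with our own `vec3` and function name; N is a unit normal and V the direction toward the viewer, and R is then used to index the environment map.

```c
#include <assert.h>

typedef struct { double x, y, z; } vec3;

/* Reflection direction for environment-map lookup: R = 2(N.V)N - V. */
vec3 reflect_dir(vec3 N, vec3 V)
{
    double d = N.x*V.x + N.y*V.y + N.z*V.z;   /* N . V */
    vec3 R = { 2.0*d*N.x - V.x, 2.0*d*N.y - V.y, 2.0*d*N.z - V.z };
    return R;
}
```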
Approximations Made
- The map should contain a view of the world with the point of interest on the object as the eye.
- We can't store a separate map for each point, so one map is used with the eye at the center of the object.
- This introduces distortions in the reflection, but the eye doesn't notice.
- Distortions are minimized for a small object in a large room.
- The mapping can be computed at each pixel, or only at the vertices.
Refraction Maps
- Use a texture to represent refraction.
Opacity Maps
- Use a texture to represent opacity: store color in the RGB channels and use the alpha channel to make portions of the texture transparent.
- Cheaper than explicit modelling.
Illumination Maps
- Use a texture to represent an illumination footprint.
Bump Mapping
- Use a texture to perturb the surface normals; the texture is treated as a height field.
- Creates a bump-like effect: original surface + bump map = modified surface.
- Does not change silhouette edges.
Bump Mapping
- Many textures are the result of small perturbations in the surface geometry.
- Modeling these changes would result in an explosion in the number of geometric primitives.
- Bump mapping attempts to alter the lighting across a polygon to provide the illusion of texture.
Displacement Mapping
- Use a texture to displace the surface geometry: original surface + displacement map = displaced surface.
- Bump mapping only affects the normals; displacement mapping changes the entire surface (including the silhouette).