1 CSc4820/6820 Computer Graphics Algorithms Ying Zhu Georgia State University Texture Mapping

2 Outline
Texture mapping concepts
Creating textures
Loading textures
Texture objects
Texture parameterization
Texture filtering, wrapping, etc.
Advanced texture mapping techniques
– Multi-texturing
– Bump mapping
– Environment mapping
– Light maps

3 What is texture mapping?
Texture mapping is a method of taking a flat 2D image of what an object's surface looks like and applying that flat image to a 3D computer-generated object.
– Much in the same way that you would hang wallpaper on a blank wall.
Texture mapping brought computer graphics to a new level of realism.
Makes a surface look textured even though geometrically it isn't.

4 Different types of texture mapping
Texture mapping
– Uses images to assign or modulate colors for pixels
Environment mapping (reflection mapping)
– Uses a picture of the environment for texture maps
– Allows simulation of highly specular surfaces
Bump mapping
– Actually a per-pixel lighting technique
– Saves normals for each pixel in the form of a texture image (called a normal map)
– At runtime, fetch the normal for each pixel from the normal map
– Perform the lighting calculation for each pixel

5 Texture Mapping (figure: a geometric model and the same model texture mapped)

6 Texture mapping

7 Environment Mapping

8 Bump Mapping

9 Where does texture mapping fit into the 3D pipeline?
Mapping techniques are implemented at the rasterizer stage of the rendering pipeline
– After scan conversion and before the scissor test, alpha test, depth test, etc.
– Texture mapping is performed per pixel
– Efficient because relatively few polygons pass down the geometric pipeline

10 Where does texture mapping fit into the 3D pipeline?

11 Texture Mapping Steps
Creating the texture
Loading the texture
Declare the texture image (OpenGL)
Enable texturing (OpenGL)
Specify texture parameters (OpenGL)
– wrapping, filtering
Assign texture coordinates to vertices (OpenGL)
– The proper mapping function is left to the application
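
A minimal fixed-function OpenGL sketch of these steps, assuming the image has already been loaded into memory as tightly packed RGB bytes (how to load it is covered on the next slides):

    #include <GL/gl.h>

    GLuint texId;

    void setupTexture(const unsigned char *pixels, int width, int height)
    {
        /* Create and bind a texture object */
        glGenTextures(1, &texId);
        glBindTexture(GL_TEXTURE_2D, texId);

        /* Declare the texture image: level 0, RGB, no border */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, pixels);

        /* Specify wrapping and filtering parameters */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* Enable texturing; texture coordinates are assigned per vertex at draw time */
        glEnable(GL_TEXTURE_2D);
    }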

12 Photo Textures
There are lots of free textures on the web:
– http://astronomy.swin.edu.au/~pbourke/texture/
– http://www.3dlinks.com/textures_free.cfm

13 Other Methods
Use the frame buffer as the source of the texture
– Uses the current frame buffer as a source image
– glCopyTexImage2D()
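
A brief sketch of this call; the region size and position here are illustrative values, not anything prescribed by the slide:

    /* Copy a 256x256 region of the current read buffer, starting at (0, 0),
       into level 0 of the currently bound 2D texture. */
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, 256, 256, 0);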

14 Loading Textures
OpenGL does not provide any functions to read image files. You'll have to write your own texture loader.
– Load an image file (e.g. JPEG) into memory
– Define a pointer to the image data
– Pass it to OpenGL using glTexImage2D()
Or use an open source library
– http://openil.sourceforge.net/
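
As one possible loader (a sketch only, not the slide's code): this assumes the single-header stb_image library is available; DevIL/OpenIL from the link above would work along the same lines. Error handling is minimal.

    #include <GL/gl.h>
    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"

    GLuint loadTexture(const char *filename)
    {
        int w, h, channels;
        /* Decode the file into memory, forcing 3 channels (RGB) */
        unsigned char *pixels = stbi_load(filename, &w, &h, &channels, 3);
        if (!pixels) return 0;

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        /* Rows are tightly packed, so relax the default 4-byte row alignment */
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

        /* Pass the pointer to the image data to OpenGL */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, pixels);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

        stbi_image_free(pixels);   /* OpenGL keeps its own copy of the data */
        return tex;
    }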

15 Parameterization
Determine the texture coordinates for each vertex
– The process of mapping the texture image onto geometry

16 Mapping Functions
Consider mapping from texture coordinates to a point on a surface
Given a texel (s, t) in the texture image, which vertex should this texel be assigned to?
– How to map (s, t) to a point (x, y, z)?
Appears to need three functions:
x = X(s, t)
y = Y(s, t)
z = Z(s, t)
This is usually difficult
(figure: a texel (s, t) in texture space mapped to a point (x, y, z) on the surface)

17 Backward Mapping
We usually reverse the question
Given a vertex (x, y, z) on the 3D model, how do we find its texture coordinates (s, t)?
– How to map a given (x, y, z) to an (s, t)?
Need to find two functions:
s = S(x, y, z)
t = T(x, y, z)
Such functions are difficult to find in general
– Much research has been devoted to this subject
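
As one illustrative special case (a sketch, not a general solution): for a point on a unit sphere centered at the origin, (s, t) can be derived from the point's spherical angles.

    #include <math.h>

    /* Map a point (x, y, z) on a unit sphere to texture coordinates (s, t). */
    void sphereMap(float x, float y, float z, float *s, float *t)
    {
        const float PI = 3.14159265f;

        float theta = atan2f(z, x);   /* longitude in [-pi, pi]      */
        float phi   = asinf(y);       /* latitude  in [-pi/2, pi/2]  */

        *s = (theta + PI) / (2.0f * PI);       /* 0..1 around the equator   */
        *t = (phi + PI / 2.0f) / PI;           /* 0..1 from pole to pole    */
    }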

18 Parameterization
Automatic texture coordinate generation is still a difficult problem
OpenGL provides some texture coordinate generation functions
– Helpful in some simple cases, but far from a general solution
– Developers usually have to figure out texture coordinates themselves
Can use tools like Maya or 3DS Max to apply textures to objects
– Optimal texture coordinate assignments are obtained by trial and error
– Texture coordinates are stored in the Maya or 3DS files

19 Specify texture coordinates
Call glTexCoord*() to specify texture coordinates at each vertex
(figure: a triangle abc in texture space, with (s, t) coordinates (0.2, 0.8), (0.4, 0.2), and (0.8, 0.4), mapped to triangle ABC in object space)
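
In fixed-function OpenGL this interleaves glTexCoord2f with the vertex calls; the quad and its coordinates below are illustrative, not taken from the slide's figure:

    glBindTexture(GL_TEXTURE_2D, texId);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
    glEnd();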

20 Texture Parameters
OpenGL has a variety of parameters that determine how a texture is applied
– Wrapping parameters determine what happens if s and t are outside the [0, 1] range
– Filter modes allow us to use area averaging instead of point samples
– Mipmapping allows us to use textures at multiple resolutions
– Environment parameters determine how texture mapping interacts with shading

21 Wrapping Mode
Clamping: if s, t > 1 use 1; if s, t < 0 use 0
Wrapping: use s, t modulo 1 (integer part ignored)
(figure: a texture applied with GL_CLAMP wrapping vs. GL_REPEAT wrapping)
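
The wrapping mode is set independently for each coordinate with glTexParameteri, for example:

    /* Repeat in s, clamp in t */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);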

22 Why texture filtering?
Eventually texture images are mapped to screen pixel regions.
A texture image may have more texels than there are pixels in the screen region
– A single pixel can map to more than one texel
A texture image may have fewer texels than there are pixels in the screen region
– Multiple pixels map to a single texel

23 Why texture filtering?
If the texture map is too small compared to the pixel area being mapped, the same texel is mapped to adjacent pixels.
– Causes a blockiness effect
If the texture image has more samples than the pixel area it is applied to, multiple texels can map to the same pixel.
– Then the program has to pick one texel.
– Which texel is picked is algorithm-dependent, and poor choices result in artifacts such as texture swimming and pixel popping.

24 Aliasing and Anti-aliasing
The problem we just discussed is part of a bigger problem called aliasing.
Aliasing is a fundamental issue in computer graphics. For example, smooth curves and other lines become jagged because the resolution of the graphics device or file is not high enough to represent a smooth curve.
Anti-aliasing refers to the techniques that reduce the aliasing effect.

25 Why does aliasing occur?
A computer screen is a discrete pixel grid. We need to discretely sample a continuous signal/function/object (e.g. an image) and reconstruct the information on the discrete pixel grid.
Aliasing happens when we do not sample the continuous signal/function/object (e.g. an image) densely enough: we are not capturing the image details with our discrete pixel grid.

26 When does aliasing occur?
Motion - popping/flashing/strobing
Edges - jaggies
Textures - crawling / Moiré patterns
Missing details in textures and geometry

27 Aliasing Example

28 Aliasing Example

29 Aliasing and Anti-aliasing Example (figures: unfiltered texture mapping vs. filtered texture mapping)

30 Texture Filtering
Texture filtering is one of the techniques used to minimize the aliasing caused by an insufficient texture sampling rate.
Different texture filtering methods:
– Point sampling
– Bilinear filtering
– Trilinear MIP-mapping
– Anisotropic filtering

31 Magnification
Magnification is used when a single texel covers multiple pixels; the filter decides how that texel is spread across those pixels.
This happens when you zoom in very close on a texture-mapped polygon, or due to perspective projection.

32 Minification
Minification is used when multiple texels can map to a single pixel; the filter selects the best-fit value from among the group of texels that could map to the pixel.
This happens when you zoom out, or due to perspective foreshortening.

33 Magnification and Minification
More than one texel can cover a pixel (minification), or more than one pixel can cover a texel (magnification)
Can use point sampling (nearest texel) or linear filtering (2 x 2 filter) to obtain texture values
(figure: texture vs. polygon footprint in the magnification and minification cases)

34 Nearest Neighbor Interpolation
A kind of point sampling. For each pixel, grab the texture sample whose (u, v) coordinates map nearest to the pixel's coordinates (the pixel center), and apply it to the pixel.
Pro: requires the least memory bandwidth in terms of the number of texels that need to be read from texture memory (one per pixel).
Con: often causes the artifacts discussed above, because too few samples (screen pixels) are used to describe the texture.
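
A CPU-side sketch of the idea, assuming the texture is stored as tightly packed RGB bytes, row by row:

    /* Nearest-neighbor sample of a width x height RGB texture at (s, t) in [0, 1]. */
    const unsigned char *sampleNearest(const unsigned char *texels,
                                       int width, int height, float s, float t)
    {
        int i = (int)(s * width);              /* texel column whose center is nearest */
        int j = (int)(t * height);             /* texel row whose center is nearest    */
        if (i > width - 1)  i = width - 1;     /* clamp the s = 1, t = 1 edge cases    */
        if (j > height - 1) j = height - 1;
        return &texels[(j * width + i) * 3];   /* pointer to the chosen RGB triple     */
    }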

35 Bi-linear Filtering
Bilinear filtering reads the four texels nearest to the pixel center and uses a weighted average of their color values as the final texture color. The weights are based on the distance from the pixel center to the four texel centers.
Pro: blurs out a good deal of the texture artifacts seen with point sampling.
Con: because it is only a four-tap filter working with a single texture map, its effectiveness is limited.
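
A single-channel sketch of the weighting, assuming (s, t) lies far enough inside [0, 1] that the four neighboring texels exist (edge clamping omitted for brevity):

    #include <math.h>

    /* Bilinearly sample a single-channel width x height texture at (s, t). */
    float sampleBilinear(const float *texels, int width, int height, float s, float t)
    {
        float x = s * width  - 0.5f;           /* shift so texel centers sit at integers */
        float y = t * height - 0.5f;
        int   i = (int)floorf(x), j = (int)floorf(y);
        float fx = x - i, fy = y - j;          /* fractional distances = blend weights   */

        float c00 = texels[ j      * width + i    ];
        float c10 = texels[ j      * width + i + 1];
        float c01 = texels[(j + 1) * width + i    ];
        float c11 = texels[(j + 1) * width + i + 1];

        /* Weighted average of the four nearest texels */
        return (1 - fx) * (1 - fy) * c00 + fx * (1 - fy) * c10
             + (1 - fx) *      fy  * c01 + fx *      fy  * c11;
    }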

36 Bi-linear Filtering
OpenGL provides bi-linear filtering.
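
It is selected through the texture filter parameters, for example:

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);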

37 MIP-Mapping
Performing filtering during texture mapping can be expensive.
– Averaging all covered texels can be very expensive
– For every pixel, we might have to visit O(n) texels
– Can hurt performance
A solution is to pre-filter the texture before rendering. MIP-mapping is such a pre-filtering technique.

38 MIP-Mapping
The basic idea is to make multiple copies of the original texture, where each successive MIP-map level is exactly half the resolution of the previous one.
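
The chain can be uploaded by hand with one glTexImage2D call per level, or generated automatically. A sketch using the GLU helper (glGenerateMipmap is the newer alternative in OpenGL 3.0+); width, height, and pixels are assumed to be the image loaded earlier:

    #include <GL/glu.h>

    /* Builds and uploads the full chain: level 0 = width x height,
       level 1 = width/2 x height/2, ... down to 1 x 1. */
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height,
                      GL_RGB, GL_UNSIGNED_BYTE, pixels);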

39 MIP-Mapping
Consider this as a kind of 3D texture: you have the usual two coordinates, (u, v), but now a third coordinate, d, is used to select which MIP-map (or maps) to use, based on which map resolution most closely matches the pixel area to be mapped.
As the d coordinate increases, smaller and smaller MIP-maps are used.

40 How to select d?
The derivation of the d coordinate is a bit complicated and implementation dependent.
To put it simply, the MIP-map with the smallest amount of texture magnification and minification is selected.
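
As a rough sketch of the usual rule (not the exact formula any particular implementation uses): d grows with how fast the texture coordinates change, in texels, from one pixel to the next.

    #include <math.h>

    /* Rough level-of-detail estimate from the texture-coordinate derivatives
       (in texels) across one pixel horizontally and vertically. Negative
       results would be clamped to level 0, the full-resolution map. */
    float mipLevel(float du_dx, float dv_dx, float du_dy, float dv_dy)
    {
        float rho = fmaxf(sqrtf(du_dx * du_dx + dv_dx * dv_dx),
                          sqrtf(du_dy * du_dy + dv_dy * dv_dy));
        return log2f(rho);
    }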

41 Storing MipMaps
One convenient way to store mipmaps is to pack them all into one big image.

42 MIP-Mapping Methods
Bilinear MIP-mapping
– Applies bilinear filtering on the selected MIP-map
Trilinear MIP-mapping
– Uses a weighted average of two bilinear samples from the two MIP-maps nearest to the pixel
Anisotropic filtering
– Supported by newer graphics cards
– A more sophisticated method that takes up to 16 samples
– Best image quality, but uses a lot of memory bandwidth
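
The corresponding fixed-function minification settings; the last call assumes the EXT_texture_filter_anisotropic extension (and its header definitions) is available:

    /* Bilinear filtering within the single nearest mipmap level */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);

    /* Trilinear: bilinear in each of the two nearest levels, then blend */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

    /* Anisotropic filtering (extension), up to 16 samples */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);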

43 Example (figures: point sampling vs. mipmapped point sampling, and linear filtering vs. mipmapped linear filtering)

44 Multi-texturing
Apply multiple texture images to one object.
OpenGL and D3D allow a single vertex to store two or more sets of texture coordinates.
Can be used to create effects such as light maps, bump mapping, etc.
Hardware support
– Parallel pixel pipelines
– Processing multiple texels per pixel per clock
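
A fixed-function sketch with two texture units (glActiveTexture and glMultiTexCoord2f are core since OpenGL 1.3; older drivers use the ARB-suffixed variants). baseTex and detailTex are assumed texture object names:

    /* Bind a different texture to each unit */
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, baseTex);
    glEnable(GL_TEXTURE_2D);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, detailTex);
    glEnable(GL_TEXTURE_2D);

    /* Each vertex now carries one texture coordinate pair per unit */
    glBegin(GL_TRIANGLES);
        glMultiTexCoord2f(GL_TEXTURE0, 0.0f, 0.0f);
        glMultiTexCoord2f(GL_TEXTURE1, 0.0f, 0.0f);
        glVertex3f(0.0f, 0.0f, 0.0f);
        /* ... remaining vertices ... */
    glEnd();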

45 Example of Multitexturing
Keep one texture of the jacket and one texture of a single bullet hole.
With multitexturing, you can load a single instance of the bullet hole and then place it multiple times on the main jacket texture.

46 Light Maps
Another lighting trick that uses multitexturing (first used in Quake).
Combine a base texture and a light map to create elaborate lighting effects.
Avoids doing the actual lighting calculations for all the lights in a scene.
Used when lights and objects are fixed in space.
(figure: base texture + light map = lit result)
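
Conceptually the combine is a per-texel multiply (modulate). In fixed-function OpenGL the second texture unit's environment mode can do this, as in the sketch below, where unit 1 holds the light map and the incoming color already carries the base texture from unit 0:

    glActiveTexture(GL_TEXTURE1);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);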

47 Environment Mapping
Allows the surrounding environment to be reflected on an object without modeling the physics.
Map the world surrounding an object onto a cube or sphere; this is called an environment map.
Project the cube/sphere onto the object.

48 Environment Mapping
During the shading calculation, use the view reflection vector as an index into the texture map
– Bounce a ray from the viewer off the object (at point P)
– Intersect the ray with the environment map (the cube) at point E
– Get the environment map's color at E and illuminate P as if there were a virtual light source at position E
– You see an image of the environment reflected on shiny surfaces
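
Fixed-function OpenGL can generate these reflection-based texture coordinates automatically; a sphere-mapping sketch, where envMapTex is an assumed texture object holding the environment image:

    /* Generate (s, t) per vertex from the reflection vector */
    glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
    glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_SPHERE_MAP);
    glEnable(GL_TEXTURE_GEN_S);
    glEnable(GL_TEXTURE_GEN_T);
    glBindTexture(GL_TEXTURE_2D, envMapTex);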

49 Environment Mapping Example

50 Bump Mapping
How do you make a surface look rough?
– Option 1: model the surface with many small polygons.
– Option 2: perturb the normal vectors before the shading calculation.

51 Bump Mapping
The surface doesn't actually change, but shading makes it look that way.
A bump map fakes small displacements above or below the true surface.
Takes advantage of multitexturing: one base texture image + one normal map.
For the math behind it all, see section 7.8 of Ed Angel's book.
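
A CPU-side sketch of the per-pixel idea (not the book's derivation): fetch the normal stored in the normal-map texel, decode it from [0, 255] back to [-1, 1], and use it in the diffuse lighting term.

    #include <math.h>

    /* Diffuse intensity at one pixel, given its RGB-encoded normal-map texel
       and a unit light direction (lx, ly, lz). */
    float bumpDiffuse(const unsigned char nrm[3], float lx, float ly, float lz)
    {
        /* Decode the stored normal components */
        float nx = nrm[0] / 127.5f - 1.0f;
        float ny = nrm[1] / 127.5f - 1.0f;
        float nz = nrm[2] / 127.5f - 1.0f;
        float len = sqrtf(nx * nx + ny * ny + nz * nz);

        /* Lambertian term with the perturbed normal */
        float d = (nx * lx + ny * ly + nz * lz) / len;
        return d > 0.0f ? d : 0.0f;
    }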

52 Bump Mapping Example (figures: front view and side view)

53 Summary
Texture mapping
– Texture image
– Texture objects
– Texture coordinates
– Texture filtering, wrapping, environments
Multi-texturing
Texture mapping tricks
– Light maps
– Environment mapping
– Bump mapping

