
1 Week 8 - Monday

2
 What did we talk about last time?
 Workday
 Before that:
 Image texturing
 ▪ Magnification
 ▪ Minification
 Mipmapping
 Summed area tables


6
 Typically a chain of mipmaps is created, each half the size of the previous
 That's why graphics cards like square power-of-2 textures
 Often the filtered version is made with a box filter, but better filters exist
 The trick is figuring out which mipmap level to use
 The level d can be computed based on the change in u relative to a change in x (see the sketch below)
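A minimal sketch of how the level d can come out of those derivatives, in C++ rather than shader code; the derivative values and texture dimensions are assumed to be supplied by the caller:

```cpp
#include <algorithm>
#include <cmath>

// Sketch: choosing a mipmap level d from how fast the texture
// coordinates change per screen pixel. A real GPU gets these
// derivatives per quad (ddx/ddy); here they are parameters.
float MipLevel(float du_dx, float dv_dx, float du_dy, float dv_dy,
               float texWidth, float texHeight)
{
    // Footprint of one screen pixel in texel units, per screen axis.
    float lenX = std::hypot(du_dx * texWidth, dv_dx * texHeight);
    float lenY = std::hypot(du_dy * texWidth, dv_dy * texHeight);

    // The larger footprint drives the level: each mip level halves
    // the resolution, hence the log base 2.
    float rho = std::max(lenX, lenY);
    return std::max(0.0f, std::log2(rho));
}
```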

7
 One way to improve quality is to interpolate between u and v texels from the nearest two d levels (trilinear interpolation)
 Picking d can be affected by a level of detail bias term, which may vary with the kind of texture being used

8
 Sometimes we are magnifying in one axis of the texture and minifying in the other
 Summed area tables are another method to reduce the resulting overblurring
 They sum up the relevant texel values in the texture
 They work by precomputing the sums of all possible rectangles (see the sketch below)
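A small C++ sketch of the idea, using an illustrative SummedAreaTable type; the padding row and column are an implementation convenience, not part of the technique:

```cpp
#include <vector>

// Sketch: a summed area table over a grayscale texture. Each entry
// holds the sum of all texels from (0,0) up to that point, so any
// axis-aligned rectangle's sum (and thus its average) costs four
// lookups regardless of its size.
struct SummedAreaTable {
    int w, h;
    std::vector<double> t; // (w+1) x (h+1), padded with a zero row/column

    SummedAreaTable(const std::vector<float>& texels, int width, int height)
        : w(width), h(height), t((width + 1) * (height + 1), 0.0)
    {
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                t[(y + 1) * (w + 1) + (x + 1)] =
                    texels[y * w + x]
                    + t[y * (w + 1) + (x + 1)]   // sum above
                    + t[(y + 1) * (w + 1) + x]   // sum to the left
                    - t[y * (w + 1) + x];        // double-counted corner
    }

    // Average over the texel rectangle [x0, x1) x [y0, y1).
    double Average(int x0, int y0, int x1, int y1) const {
        double sum = t[y1 * (w + 1) + x1] - t[y0 * (w + 1) + x1]
                   - t[y1 * (w + 1) + x0] + t[y0 * (w + 1) + x0];
        return sum / double((x1 - x0) * (y1 - y0));
    }
};
```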

9
 Summed area tables work poorly for non-rectangular projections into texture space
 Modern hardware uses unconstrained anisotropic filtering
 The shorter side of the projected area determines d, the mipmap index
 The longer side of the projected area is the line of anisotropy
 Multiple samples are taken along this line (sketched below)
 Memory requirements are no greater than regular mipmapping
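A rough sketch of the sampling loop, assuming a sampleMip helper (a stand-in for an ordinary trilinear fetch) and an illustrative tap count:

```cpp
#include <functional>

struct Color { float r, g, b; };

// Sketch: unconstrained anisotropic filtering. The short axis of the
// pixel's footprint picks the mip level d; several samples are then
// averaged along the long axis (the line of anisotropy).
Color SampleAniso(const std::function<Color(float, float, float)>& sampleMip,
                  float u, float v,
                  float longU, float longV,  // long-axis extent in uv space
                  float d,                   // level chosen from the short axis
                  int taps)
{
    Color sum{0, 0, 0};
    for (int i = 0; i < taps; ++i) {
        // Place taps evenly along the line of anisotropy, centered on (u, v).
        float s = (i + 0.5f) / taps - 0.5f;
        Color c = sampleMip(u + s * longU, v + s * longV, d);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    sum.r /= taps; sum.g /= taps; sum.b /= taps;
    return sum;
}
```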


11
 Image textures are the most common, but 3D volume textures can be used
 These textures store data in a (u, v, w) coordinate space
 Even volume textures can be mipmapped
 Quadrilinear interpolation!
 In practice, volume textures are usually used for fog, smoke, or explosions
 3D effects that vary throughout the volume

12
 A cube map is a kind of texture map with 6 faces
 Cube maps are used to texture surfaces based on direction
 They are commonly used in environment mapping
 A ray is made from the center of the cube out to the surface
 The component with the largest magnitude selects which of the 6 faces to use
 The other components are used for (u, v) coordinates (see the sketch below)
 Cube maps can cause awkward seams when jumping between faces
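A sketch of the face-selection rule in C++. The face numbering and sign conventions below follow one common (OpenGL-style) layout, but APIs differ:

```cpp
#include <cmath>

// Sketch: selecting a cube map face and (u, v) from a direction
// vector. The component with the largest magnitude picks one of the
// 6 faces, and the other two components become texture coordinates.
int CubeMapLookup(float x, float y, float z, float& u, float& v)
{
    float ax = std::fabs(x), ay = std::fabs(y), az = std::fabs(z);
    int face;
    float ma, uc, vc;
    if (ax >= ay && ax >= az) {            // +/- X faces
        face = (x > 0) ? 0 : 1;
        ma = ax; uc = -z * ((x > 0) ? 1.f : -1.f); vc = -y;
    } else if (ay >= az) {                 // +/- Y faces
        face = (y > 0) ? 2 : 3;
        ma = ay; uc = x; vc = z * ((y > 0) ? 1.f : -1.f);
    } else {                               // +/- Z faces
        face = (z > 0) ? 4 : 5;
        ma = az; uc = x * ((z > 0) ? 1.f : -1.f); vc = -y;
    }
    // Map the remaining components from [-1, 1] onto [0, 1].
    u = 0.5f * (uc / ma + 1.0f);
    v = 0.5f * (vc / ma + 1.0f);
    return face;
}
```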

13
 You will never need to worry about this in this class, but texture memory space is a huge problem
 There are many different caching strategies, similar to the ones used for RAM:
 ▪ Least Recently Used (LRU): Swap out the least recently used texture; very commonly used (sketched below)
 ▪ Most Recently Used (MRU): Swap out the most recently used texture; use only during thrashing
 Prefetching can be useful to maintain consistent frame rates
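A sketch of what an LRU texture cache might look like; TextureId and the capacity policy are illustrative, not any particular API's types:

```cpp
#include <cstddef>
#include <list>
#include <unordered_map>

// Sketch: an LRU cache for textures. A hit moves the texture to the
// front of the recency list; a miss evicts from the back (the least
// recently used entry) when the cache is full.
class LruTextureCache {
    using TextureId = int;
    std::list<TextureId> order_;   // front = most recently used
    std::unordered_map<TextureId,
                       std::list<TextureId>::iterator> where_;
    std::size_t capacity_;

public:
    explicit LruTextureCache(std::size_t capacity) : capacity_(capacity) {}

    // Record a use of texture `id`; returns the evicted texture, or -1.
    TextureId Touch(TextureId id) {
        auto it = where_.find(id);
        if (it != where_.end()) {              // hit: refresh recency
            order_.splice(order_.begin(), order_, it->second);
            return -1;
        }
        TextureId evicted = -1;
        if (order_.size() == capacity_) {      // miss on a full cache: evict
            evicted = order_.back();
            where_.erase(evicted);
            order_.pop_back();
        }
        order_.push_front(id);
        where_[id] = order_.begin();
        return evicted;
    }
};
```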

14
 JPEG and PNG are common compression techniques for regular images
 In graphics hardware, these are too complicated to be decoded on the fly
 That's why the finished SharpDX projects have pre-processed .tkb files
 Most DirectX texture compression divides textures into 4 x 4 tiles
 Two 16-bit RGB values are recorded for each tile
 Each texel uses 2 bits to select one of the two colors or two interpolated values between them (see the decoding sketch below)
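A sketch of decoding one such 4 x 4 tile (BC1/DXT1-style); the format's 1-bit alpha mode is ignored for brevity:

```cpp
#include <cstdint>

// Sketch: decoding one 4x4 BC1/DXT1-style block. Two 16-bit RGB565
// endpoints expand to a 4-entry palette (the endpoints plus two
// interpolated colors), and each of the 16 texels picks a palette
// entry with 2 bits.
struct Rgb { uint8_t r, g, b; };

static Rgb Expand565(uint16_t c) {
    return { uint8_t((c >> 11) << 3),
             uint8_t(((c >> 5) & 0x3F) << 2),
             uint8_t((c & 0x1F) << 3) };
}

void DecodeBc1Block(uint16_t c0, uint16_t c1, uint32_t indices, Rgb out[16])
{
    Rgb p[4];
    p[0] = Expand565(c0);
    p[1] = Expand565(c1);
    // Two interpolated colors at 1/3 and 2/3 between the endpoints.
    p[2] = { uint8_t((2 * p[0].r + p[1].r) / 3),
             uint8_t((2 * p[0].g + p[1].g) / 3),
             uint8_t((2 * p[0].b + p[1].b) / 3) };
    p[3] = { uint8_t((p[0].r + 2 * p[1].r) / 3),
             uint8_t((p[0].g + 2 * p[1].g) / 3),
             uint8_t((p[0].b + 2 * p[1].b) / 3) };
    for (int i = 0; i < 16; ++i)
        out[i] = p[(indices >> (2 * i)) & 0x3];   // 2 bits per texel
}
```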

15
 Ericsson texture compression (ETC) is used in OpenGL
 It breaks texels into 2 x 4 blocks with a single color
 It uses per-pixel luminance information to add detail to the blocks
 Normal maps (normals stored as textures) allow for interesting compression approaches
 Only x and y components are needed, since the z component can be calculated (see the sketch below)
 The x and y can then be stored using the BC5 format for two channels of color data
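A sketch of the z reconstruction; the clamp guards against compression error pushing x² + y² past 1:

```cpp
#include <algorithm>
#include <cmath>

// Sketch: rebuilding a unit normal from a two-channel (BC5-style)
// normal map. Only x and y are stored; since tangent-space normals
// point out of the surface, z is recovered as the positive root.
float ReconstructNormalZ(float x, float y)
{
    return std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
}
```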

16
 A procedural texture is made by computing a function of u and v instead of looking up a texel in an image
 Noise functions are often used to give an appearance of randomness (sketched below)
 Volume textures can be generated on the fly
 Values can be returned based on distance to certain feature points (redder colors near heat, for example)
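A tiny sketch of a procedural texture; the hash-based value noise is illustrative, and production code would more likely use Perlin or simplex noise:

```cpp
#include <cmath>

// Sketch: a procedural texture as a pure function of (u, v) rather
// than an image lookup.
static float Hash(int x, int y) {
    // Integer hash producing a repeatable pseudo-random value in [0, 1].
    unsigned n = unsigned(x) * 374761393u + unsigned(y) * 668265263u;
    n = (n ^ (n >> 13)) * 1274126177u;
    return float(n & 0xFFFF) / 65535.0f;
}

float ValueNoise(float u, float v) {
    int xi = int(std::floor(u)), yi = int(std::floor(v));
    float fx = u - xi, fy = v - yi;
    // Smoothstep fade, then a bilinear blend of the four corner values.
    fx = fx * fx * (3 - 2 * fx);
    fy = fy * fy * (3 - 2 * fy);
    float a = Hash(xi, yi),     b = Hash(xi + 1, yi);
    float c = Hash(xi, yi + 1), d = Hash(xi + 1, yi + 1);
    float top = a + (b - a) * fx, bot = c + (d - c) * fx;
    return top + (bot - top) * fy;
}

// A marble-like texture: stripes in u, perturbed by the noise.
float MarbleTexture(float u, float v) {
    return 0.5f + 0.5f * std::sin(20.0f * u + 8.0f * ValueNoise(4 * u, 4 * v));
}
```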

17
 Textures don't have to be static
 The application can alter them over time
 Alternatively, u and v values can be remapped to make the texture appear to move
 Matrix transformations can be used for zoom, rotation, shearing, etc. (see the sketch below)
 Video textures can be used to play back a movie in a texture
 Blending between textures can allow an object to transform like a chameleon
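A sketch of remapping u and v per frame with rotation, zoom, and scroll; the animation parameters are made up for illustration:

```cpp
#include <cmath>

// Sketch: animating texture coordinates with a 2D affine transform,
// one way to make a static texture appear to scroll, zoom, or spin.
void AnimateUv(float u, float v, float timeSeconds,
               float& outU, float& outV)
{
    float angle   = 0.2f * timeSeconds;                  // slow rotation
    float scale   = 1.0f + 0.1f * std::sin(timeSeconds); // gentle zoom
    float scrollU = 0.05f * timeSeconds;                 // constant scroll

    // Rotate about the texture center, then scale, then translate.
    float cu = u - 0.5f, cv = v - 0.5f;
    float ru = cu * std::cos(angle) - cv * std::sin(angle);
    float rv = cu * std::sin(angle) + cv * std::cos(angle);
    outU = ru * scale + 0.5f + scrollU;
    outV = rv * scale + 0.5f;
}
```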

18
 The lighting we have discussed is based on material properties
 ▪ Diffuse color
 ▪ Specular color
 ▪ Smoothness coefficient m
 A texture can be used to modify these values on a per-pixel basis
 A normal image texture can be considered a diffuse color map
 One that affects specular colors is a specular color map (usually grayscale)
 One that affects m is a gloss map

19
 Alpha values allow for interesting effects
 Decaling is when you apply a texture that is mostly transparent to a (usually already textured) surface
 Cutouts can be used to give the impression of a much more complex underlying polygon
 1-bit alpha doesn't require sorting (see the sketch below)
 Cutouts are not always convincing from every angle
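A sketch of the 1-bit alpha test; the 0.5 threshold is the usual convention:

```cpp
// Sketch: a 1-bit alpha cutout test. Because each texel is either
// fully opaque or fully transparent, no back-to-front sorting is
// needed; the fragment is simply kept or discarded.
bool PassesAlphaTest(float alpha)
{
    return alpha >= 0.5f;   // discard the fragment when this is false
}
```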


22
 Bump mapping refers to a wide range of techniques designed to increase small-scale detail
 Most bump mapping is implemented per-pixel in the pixel shader
 The 3D effect of bump mapping is greater than that of textures alone, but less than that of full geometry

23
 Macro-geometry is made up of vertices and triangles
 ▪ Limbs and head of a body
 Micro-geometry consists of characteristics shaded in the pixel shader, often with texture maps
 ▪ Smoothness (specular color and m parameter) based on microscopic smoothness of a material
 Meso-geometry is the stuff in between that is too complex for macro-geometry but large enough to change over several pixels
 ▪ Wrinkles
 ▪ Folds
 ▪ Seams
 Bump mapping techniques are primarily concerned with mesoscale effects

24
 James Blinn proposed the offset vector bump map, or offset map
 It stores b_u and b_v values at each texel, giving the amount that the normal should be changed at that point
 Another method is a heightfield, a grayscale image that gives the varying heights of a surface
 Normal changes can be computed from the heightfield (see the sketch below)
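A sketch of deriving a normal from a heightfield with central differences; heightScale is an assumed tuning knob:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sketch: turning a grayscale heightfield into per-texel normals.
// The slopes in u and v (the b_u, b_v idea) come from central
// differences; the normal then tilts against the slope.
void HeightfieldToNormal(const std::vector<float>& height, int w, int h,
                         int x, int y, float heightScale,
                         float& nx, float& ny, float& nz)
{
    auto at = [&](int i, int j) {
        // Clamp to the edges so border texels stay well-defined.
        i = std::max(0, std::min(w - 1, i));
        j = std::max(0, std::min(h - 1, j));
        return height[j * w + i];
    };
    float du = (at(x + 1, y) - at(x - 1, y)) * 0.5f * heightScale;
    float dv = (at(x, y + 1) - at(x, y - 1)) * 0.5f * heightScale;

    // Normalize (-du, -dv, 1) to get the unit normal.
    float len = std::sqrt(du * du + dv * dv + 1.0f);
    nx = -du / len; ny = -dv / len; nz = 1.0f / len;
}
```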

25
 The results are the same, but these kinds of deformations are usually stored in normal maps
 Normal maps give the full 3-component normal change
 Normal maps can be in world space (uncommon)
 ▪ Only usable if the object never moves
 Or object space
 ▪ Requires the object only to undergo rigid body transforms
 Or tangent space
 ▪ Relative to the surface; can assume positive z
 Lighting and the surface have to be in the same space to do shading (see the sketch below)
 Filtering normal maps is tricky
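A sketch of moving a world-space vector into tangent space with the TBN basis (assumed orthonormal here), so lighting matches a tangent-space normal map:

```cpp
// Sketch: transforming a vector (e.g., the light direction) into
// tangent space. T, B, N are the surface tangent, bitangent, and
// normal; with an orthonormal basis, the transform is three dot
// products (the rows of the TBN matrix).
struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 ToTangentSpace(Vec3 worldVec, Vec3 T, Vec3 B, Vec3 N)
{
    return { Dot(worldVec, T), Dot(worldVec, B), Dot(worldVec, N) };
}
```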

26
 Bump mapping doesn't change what can be seen, just the normal
 High enough bumps should block each other
 Parallax mapping approximates the part of the image you should see by moving from the height back to the view vector and taking the value at that point
 The final point used is p_adj = p + (h · v_xy) / v_z, where h is the height at p and v is the view vector in tangent space (sketched below)
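A sketch of that offset in C++; h is the sampled height and (vx, vy, vz) is the tangent-space view vector:

```cpp
// Sketch: basic parallax offsetting of texture coordinates.
// Dividing by vz implements p_adj = p + (h * v_xy) / v_z.
void ParallaxOffset(float h, float vx, float vy, float vz,
                    float& u, float& v)
{
    u += h * vx / vz;
    v += h * vy / vz;
}
```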

27
 At shallow viewing angles, the previous approximation can look bad
 A small change results in a big texture change
 To improve the situation, the offset is limited (by not scaling by the z component)
 It flattens the bumpiness at shallow angles, but it doesn't look crazy
 New equation: p'_adj = p + h · v_xy

28
 The weakness of parallax mapping is that it can't tell where the view ray first intersects the heightfield
 Instead, samples are made along the view vector into the heightfield (see the sketch below)
 Three different research groups proposed the idea at the same time, all with slightly different techniques for doing the sampling
 There is still active research here
 Polygon boundaries are still flat in most models
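A sketch of the linear search, in the style of steep parallax mapping. The conventions are assumptions: sampleDepth returns depth in [0, 1] with 0 at the surface top, the view vector points from the shaded point toward the eye with vz > 0, and heightScale is an illustrative knob:

```cpp
#include <functional>

// Sketch: stepping along the view ray to find where it first enters
// the heightfield, the idea behind the sampling-based parallax
// methods on this slide. A deliberately simple linear search.
void SteepParallax(const std::function<float(float, float)>& sampleDepth,
                   float u, float v,
                   float vx, float vy, float vz,   // tangent-space view vector
                   float heightScale, int steps,
                   float& outU, float& outV)
{
    // Per-step uv shift and depth advance along the ray.
    float du = (vx / vz) * heightScale / steps;
    float dv = (vy / vz) * heightScale / steps;
    float layerDepth = 1.0f / steps;

    outU = u; outV = v;
    float rayDepth = 0.0f;
    float surfaceDepth = sampleDepth(outU, outV);
    // Advance until the ray dips below the surface (or bottoms out).
    while (rayDepth < surfaceDepth && rayDepth < 1.0f) {
        outU -= du; outV -= dv;     // move opposite the eye direction
        rayDepth += layerDepth;
        surfaceDepth = sampleDepth(outU, outV);
    }
}
```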

29
 Yet another possibility is to change vertex positions based on texture values
 This is called displacement mapping
 With the geometry shader, new vertices can be created on the fly
 Occlusion, self-shadowing, and realistic outlines are possible and fast
 Unfortunately, collision detection becomes more difficult


31
 Radiometry
 Photometry
 Colorimetry
 BRDFs

32
 Start reading Chapter 7
 Finish Project 2
 ▪ Due on Friday

