Presentation on theme: "Week 7 - Monday.  What did we talk about last time?  Specular shading  Aliasing and antialiasing." — Presentation transcript:

1 Week 7 - Monday

2  What did we talk about last time?  Specular shading  Aliasing and antialiasing

3

4

5

6

7  Partially transparent objects significantly increase the difficulty of rendering a scene  We will talk about really difficult effects like frosted glass or light bending later  Just rendering transparent objects at all is a huge pain because the Z-buffer doesn't work anymore: it only keeps the nearest surface, and correct blending needs the surfaces behind a transparent one in the right order  Workarounds:  Screen door transparency  Sorting  Depth peeling

8  We render an object with a checkerboard pattern of holes in it, leaving whatever is beneath the object showing through  Problems:  It really only works for 50% transparent objects  Only one overlapping transparent object really works  But it is simple and inexpensive
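
A minimal Python sketch of the idea, with made-up function names and a plain list-of-lists standing in for the framebuffer: each fragment is kept or dropped according to a fixed checkerboard, so half the pixels show the surface and half show whatever is behind it.

    def screen_door_keep(x, y):
        """Return True if the fragment at pixel (x, y) should be drawn."""
        # Fixed checkerboard: draw on half the pixels, punch holes in the rest.
        return (x + y) % 2 == 0

    def composite_screen_door(surface_color, background):
        """Draw a 50%-transparent surface over a background (a 2D list of colors)."""
        result = [row[:] for row in background]
        for y, row in enumerate(result):
            for x in range(len(row)):
                if screen_door_keep(x, y):
                    row[x] = surface_color  # hole pixels keep the background color
        return result

    # 2x2 example: the surface lands on two pixels, the background shows through the rest.
    print(composite_screen_door("S", [["B", "B"], ["B", "B"]]))  # [['S', 'B'], ['B', 'S']]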

9  Most transparency methods use the over operator, which combines two colors using the alpha of the one you're putting on top  c_o = α_s c_s + (1 - α_s) c_d  c_s is the new (source) color  c_d is the old (destination) color  c_o is the resulting (over) color  α_s is the opacity (alpha) of the object
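
As a quick illustration of the formula (plain Python tuples, no particular graphics API assumed):

    def over(source, destination, alpha_s):
        """Blend source over destination: c_o = alpha_s * c_s + (1 - alpha_s) * c_d."""
        return tuple(alpha_s * c_s + (1.0 - alpha_s) * c_d
                     for c_s, c_d in zip(source, destination))

    # A 60%-opaque red surface drawn over a blue background.
    print(over((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.6))  # (0.6, 0.0, 0.4)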

10  The over operator is order dependent  To render correctly we can do the following:  Render all the opaque objects  Sort the transparent objects by the distance of their centroids from the viewer  Render the transparent objects in back-to-front order  To make sure that you don't draw on top of an opaque object, test against the Z-buffer but don't update it
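
A sketch of that ordering in Python; the dictionary-based scene and the assumption that each object exposes a centroid are placeholders for whatever your engine actually uses:

    import math

    def render_order(objects, eye):
        """Opaque objects first, then transparent objects back to front by centroid distance."""
        opaque = [o for o in objects if o["opaque"]]
        transparent = [o for o in objects if not o["opaque"]]
        transparent.sort(key=lambda o: math.dist(o["centroid"], eye), reverse=True)
        return opaque + transparent

    scene = [
        {"name": "wall",  "opaque": True,  "centroid": (0, 0, 10)},
        {"name": "glass", "opaque": False, "centroid": (0, 0, 5)},
        {"name": "smoke", "opaque": False, "centroid": (0, 0, 8)},
    ]
    # Draw the transparent ones with the Z-test enabled but Z-writes disabled,
    # so they never overwrite depth values laid down by opaque geometry.
    print([o["name"] for o in render_order(scene, (0, 0, 0))])  # ['wall', 'smoke', 'glass']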

11  It is not always possible to sort polygons  They can interpenetrate  Hacks:  At the very least, use a Z-buffer test but not replacement  Turning off culling can help  Or render transparent polygons twice: once for each face

12  It is possible to use two depth buffers to render transparency correctly  First render all the opaque objects, updating the first depth buffer  Make the second depth buffer maximally close  On the second (and future) rendering passes, render those fragments that are closer than the z values in the first depth buffer but farther than the value in the second depth buffer  Update the second depth buffer  Repeat the process until no pixels are updated
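
Here is a simplified, single-pixel Python sketch of the peeling loop; the real technique runs per fragment on the GPU with two depth buffers, so the structure below is only meant to show the logic:

    def peel_layers(transparent_depths, opaque_depth):
        """Return this pixel's transparent fragment depths one layer per pass, nearest first."""
        layers = []
        peeled = float("-inf")      # plays the role of the second depth buffer
        while True:
            # Fragments closer than the opaque surface (first depth buffer) but
            # farther than anything peeled in earlier passes (second depth buffer).
            candidates = [d for d in transparent_depths if peeled < d < opaque_depth]
            if not candidates:      # no pixel updated: stop
                break
            nearest = min(candidates)
            layers.append(nearest)
            peeled = nearest        # update the second depth buffer
        return layers

    # Three transparent fragments in front of an opaque wall at depth 10; the
    # fragment at depth 12 is behind the wall and never appears.
    print(peel_layers([4.0, 2.0, 7.0, 12.0], 10.0))  # [2.0, 4.0, 7.0]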

13 [Depth peeling results: 1 layer, 2 layers, 3 layers, 4 layers]

14  Alpha values can be used for antialiasing, by lowering the opacity of edges that partially cover pixels  Additive blending is an alternative to the over operator  c_o = α_s c_s + c_d  This is only useful for effects like glows, where the new color never makes the original darker  Unlike transparency, it can be applied in any order
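
A small sketch contrasting this with the over operator (the clamp assumes colors stored in [0, 1]):

    def additive_blend(source, destination, alpha_s):
        """c_o = alpha_s * c_s + c_d, clamped to 1.0: never darker than the destination."""
        return tuple(min(1.0, alpha_s * c_s + c_d)
                     for c_s, c_d in zip(source, destination))

    # Because it is just addition, two glow sprites give the same color in either order.
    glow_a, glow_b, background = (1.0, 0.5, 0.0), (0.0, 0.25, 0.75), (0.125, 0.125, 0.125)
    print(additive_blend(glow_b, additive_blend(glow_a, background, 0.5), 0.5))
    print(additive_blend(glow_a, additive_blend(glow_b, background, 0.5), 0.5))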

15

16  I don't want to go deeply into gamma  The trouble is that real light has a wide range of color values that we need to store in some limited range (such as 0 – 255)  Then, we have to display these values, moving back from the limited range to the "real world" range

17  Physical computations should be performed in the linear (real) space  To convert that linear space into the nonlinear frame buffer space, we raise values to a power, typically 0.45 (about 1/2.2) for PCs and 0.55 (about 1/1.8) for Macs  Each component of the physical color (0.3, 0.5, 0.6) is raised to the 0.45 power, giving (0.582, 0.732, 0.794), then scaled to the 0–255 range, giving (148, 187, 203)
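
A tiny Python version of that encoding step, reproducing the slide's numbers (the function name is mine):

    def linear_to_framebuffer(color, encoding_exponent=0.45):
        """Raise each linear component to the encoding exponent, then scale to 0-255."""
        return tuple(round((c ** encoding_exponent) * 255) for c in color)

    print(linear_to_framebuffer((0.3, 0.5, 0.6)))  # (148, 187, 203)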

18  Usually, gamma correction is taken care of for you  If you are writing something where you need to do computations in the "real life" color space (such as a raytracer), you may have to worry about it  Calculations in the wrong space can have visually unrealistic effects

19

20  We've got polygons, but they are all one color  At most, we could have different colors at each vertex  We want to "paint" a picture on the polygon  Because the surface is supposed to be colorful  To appear as if there is greater complexity than there is (a texture of bricks rather than a complex geometry of bricks)  To apply other effects to the surface such as changes in material or normal

21  We never get tired of pipelines  Go from object space to parameter space  Go from parameter space to texture space  Get the texture value  Transform the texture value  [Pipeline diagram: Object space → projector function → Parameter space → corresponder function → Texture space → obtain value → Texture value → value transform function → Transformed value]
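
A toy end-to-end version of the pipeline in Python; each function stands in for one stage, and the planar projector and pass-through value transform are placeholder choices, not what any particular engine does:

    def projector(object_space_point):
        """Object space -> parameter space: a trivial planar projection onto x and y."""
        x, y, _z = object_space_point
        return (x, y)                                   # (u, v)

    def corresponder(uv, width, height):
        """Parameter space -> texture space: map [0,1] onto integer texel coordinates."""
        u, v = uv
        return (int(u * (width - 1)), int(v * (height - 1)))

    def obtain_value(texture, texel):
        """Fetch the texel's value (here an RGB triple)."""
        tx, ty = texel
        return texture[ty][tx]

    def value_transform(value):
        """Transform the fetched value; here it is passed through unchanged."""
        return value

    def sample(texture, object_space_point):
        uv = projector(object_space_point)
        texel = corresponder(uv, len(texture[0]), len(texture))
        return value_transform(obtain_value(texture, texel))

    # A 2x2 texture: sampling a corner of the surface returns the matching texel.
    texture = [[(1, 0, 0), (0, 1, 0)],
               [(0, 0, 1), (1, 1, 0)]]
    print(sample(texture, (1.0, 0.0, 0.5)))  # (0, 1, 0)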

22  The projector function goes from the model space (a 3D location on a surface) to a 2D (u,v) coordinate on a texture  Usually, this is based on a map from the model to the texture, made by an artist  Tools exist to help artists "unwrap" the model  Different kinds of mapping make this easier  In other scenarios, a mapping could be determined at run time
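
As one concrete (and purely illustrative) example of a projector that can be computed at run time, a spherical projection maps the direction from the model's center to longitude/latitude (u, v):

    import math

    def spherical_projector(point, center=(0.0, 0.0, 0.0)):
        """Map a 3D surface point to (u, v) in [0,1] via the direction from the center."""
        x, y, z = (p - c for p, c in zip(point, center))
        r = math.sqrt(x * x + y * y + z * z)
        u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)    # longitude
        v = 0.5 - math.asin(y / r) / math.pi            # latitude
        return (u, v)

    print(spherical_projector((1.0, 1.0, 0.0)))  # roughly (0.5, 0.25)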

23  From (u,v) coordinates we have to find a corresponding texture pixel (or texel)  Often this just maps directly from (u,v) ∈ [0,1] to a pixel in the full (width, height) range  But matrix transformations can be applied  Also, values outside of [0,1] can be given, with different choices of interpretation
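
The usual interpretations of out-of-range values look something like this (the names repeat/mirror/clamp follow common usage, not any single API):

    def apply_wrap(u, mode):
        """Interpret a u (or v) value that falls outside [0, 1]."""
        if mode == "repeat":
            return u % 1.0                        # tile the texture endlessly
        if mode == "mirror":
            t = u % 2.0
            return t if t <= 1.0 else 2.0 - t     # reflect every other repeat
        if mode == "clamp":
            return min(1.0, max(0.0, u))          # stretch the border texel
        raise ValueError("unknown wrap mode: " + mode)

    for mode in ("repeat", "mirror", "clamp"):
        print(mode, apply_wrap(1.25, mode))       # 0.25, 0.75, 1.0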

24  Usually the texture value is just an RGB triple (or an RGBα value)  But it could be procedurally generated  It could be bump map data or other surface data  It might need some transformation after retrieval
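
For instance, a procedurally generated value can be computed directly from (u, v) instead of being read from an image; the checkerboard below is just an illustration:

    def procedural_checker(u, v, scale=8):
        """Compute an RGB texture value from (u, v) rather than looking one up."""
        if (int(u * scale) + int(v * scale)) % 2 == 0:
            return (0.9, 0.9, 0.9)   # light square
        return (0.1, 0.1, 0.1)       # dark square

    print(procedural_checker(0.10, 0.10), procedural_checker(0.20, 0.10))  # light, then dark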

25

26

27  Image texturing techniques  Procedural texturing

28  Keep working on Project 2  Keep reading Chapter 6  Mipmapping  Anisotropic filtering

