
1 School of Computer Science University of Seoul

2 Graphics pipeline Algorithms for the tasks in the pipeline

3 1. Basic Implementation Strategies 2. Four Major Tasks 3. Clipping 4. Line-Segment Clipping 5. Polygon Clipping 6. Clipping of Other Primitives 7. Clipping in Three Dimensions 8. Rasterization 9. Bresenham’s Algorithm 10. Polygon Rasterization 11. Hidden-Surface Removal 12. Antialiasing 13. Display Considerations

5 Input  Geometric objects  Attributes: color, material, normal, etc.  Lights  Camera specifications  Etc. Output  Arrays of colored pixels in the framebuffer

6 Tasks by graphics system  Transformations  Clipping  Shading  Hidden-surface removal  Rasterization

7 for(each_object) render(object); Pipeline renderer Same operation on every primitive (independently, in arbitrary order)  SIMD (Single Instruction, Multiple Data) Cannot handle global calculations (exception: hidden-surface removal)
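A minimal object-order sketch of this loop (the `Triangle` type and the `render_object` stub are illustrative, not from the slides):

```cpp
#include <cstdio>
#include <vector>

// Illustrative primitive: a triangle with three 2D vertices.
struct Vertex   { float x, y; };
struct Triangle { Vertex v[3]; };

// One pipeline pass over a single primitive -- transform, clip, rasterize,
// shade.  Here only a placeholder that reports the work being done.
void render_object(const Triangle& t) {
    std::printf("processing triangle starting at (%.1f, %.1f)\n", t.v[0].x, t.v[0].y);
}

// Object-order rendering: the same operations are applied to every primitive,
// independently and in arbitrary order, with no knowledge of the rest of the scene.
void render_scene(const std::vector<Triangle>& scene) {
    for (const Triangle& t : scene)          // "for(each_object) render(object);"
        render_object(t);
}

int main() {
    std::vector<Triangle> scene = { { { {0, 0}, {1, 0}, {0, 1} } } };
    render_scene(scene);
}
```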

8 for(each_pixel) assign_a_color(pixel); For each pixel, determine which geometric primitives can contribute to its color Coherency  incremental implementation Complex data structures Can handle global effects Example: ray tracing
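A corresponding image-order sketch; `trace_ray` is a hypothetical stand-in for the per-pixel search over the scene:

```cpp
#include <cstdio>

struct Color { float r, g, b; };

// Hypothetical per-pixel routine: searches the whole scene for the primitives
// that contribute to this pixel.  Global effects (shadows, reflections) fit
// naturally here, because the entire scene is available while shading one pixel.
Color trace_ray(int x, int y) {
    return {0.0f, 0.0f, 0.0f};                // placeholder: background color
}

int main() {
    const int width = 4, height = 4;          // tiny framebuffer for the sketch
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {     // "for(each_pixel) assign_a_color(pixel);"
            Color c = trace_ray(x, y);
            std::printf("pixel(%d,%d) = (%.1f, %.1f, %.1f)\n", x, y, c.r, c.g, c.b);
        }
}
```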

10 1. Modeling 2. Geometry processing 3. Rasterization 4. Fragment processing

11 Output: set of vertices

12 1. Model-view transformation  To camera (eye) coordinates 2. Projection transformation  To a normalized view volume  Vertices represented in clip coordinates 3. Primitive assembly 4. Clipping 5. Shading  Modified Phong model 6. Perspective division

13 A.k.a. scan conversion “How to approximate a line segment with pixels?” “Which pixels lie inside a 2D polygon?” Viewport transformation Fragments in window coordinates  Vs. screen coordinates?

14 Color assigned by linear interpolation Hidden-surface removal on a fragment-by-fragment basis Blending Antialiasing

16 Clip coordinates before perspective division; normalized device coordinates after perspective division

18 Clipper accepts, rejects (or culls), or clips primitives against the view volume Before rasterization Four cases in 2D Two algorithms  Cohen-Sutherland clipping  Liang-Barsky clipping

19 Intersection calculation replaced by a membership test  Intersection calculation: FP mul & div  Membership test: FP sub & bit operations Intersection calculation performed only when needed The whole 2D space is decomposed into 9 regions  Membership test against each plane  outcodes computed  “0000” for the inside of the clipping region

20 Four cases associated with the outcodes of the endpoints  o1 == 0 && o2 == 0: both endpoints inside (AB)  o1 != 0, o2 == 0 (or vice versa): one endpoint inside, the other outside  intersection calculation required (CD)  o1 & o2 != 0: both endpoints outside the same edge  the segment can be discarded (EF)  o1 != 0, o2 != 0, o1 & o2 == 0: endpoints outside different edges  more computation required (GH & IJ)
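A sketch of the outcode computation and the trivial accept/reject decision, assuming an axis-aligned clipping window with illustrative bounds XMIN..YMAX:

```cpp
#include <cstdint>
#include <cstdio>

// Clipping window with illustrative bounds.
const float XMIN = 0, XMAX = 1, YMIN = 0, YMAX = 1;

// One bit per window edge; outcode 0000 means "inside".
const std::uint8_t LEFT = 1, RIGHT = 2, BOTTOM = 4, TOP = 8;

// Membership test: only comparisons and bit operations, no multiply or divide.
std::uint8_t outcode(float x, float y) {
    std::uint8_t code = 0;
    if (x < XMIN) code |= LEFT;   else if (x > XMAX) code |= RIGHT;
    if (y < YMIN) code |= BOTTOM; else if (y > YMAX) code |= TOP;
    return code;
}

int main() {
    std::uint8_t o1 = outcode(-0.5f, 0.5f);            // endpoint left of the window
    std::uint8_t o2 = outcode( 0.5f, 0.5f);            // endpoint inside
    if (o1 == 0 && o2 == 0)      std::puts("accept");  // both inside
    else if ((o1 & o2) != 0)     std::puts("reject");  // both outside the same edge
    else                         std::puts("clip");    // intersection calculation needed
}
```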

21 Works best when many line segments can be trivially discarded Can be extended to three dimensions Must be applied recursively when a segment can be neither trivially accepted nor rejected

22 Parametric form of the line segment Four parameter values computed, associated with the intersections with the four planes Example  α1: bottom, α2: left, α3: top, α4: right  (a) 0 < α1 < α2 < α3 < α4 < 1  (b) 0 < α1 < α3 < α2 < α4 < 1

23 Intersection calculation (against the top plane as an example) A simpler form is used for the clipping decision FP division only when actually required Multiple shortening of the segment not required Extends to three dimensions (cf. the 3D clipping slides below)
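A compact Liang-Barsky sketch for the parametric segment p(a) = (1-a)·p1 + a·p2, using the same illustrative window bounds as above; the division inside the loop marks the spot where a tuned implementation would defer the floating-point divide until it is actually needed:

```cpp
#include <algorithm>
#include <cstdio>

// Clipping window with illustrative bounds.
const float XMIN = 0, XMAX = 1, YMIN = 0, YMAX = 1;

// Liang-Barsky: clip the segment p(a) = (1-a)*(x1,y1) + a*(x2,y2), a in [0,1].
// Returns false if the segment lies entirely outside the window;
// otherwise a1/a2 bracket the visible part.
bool liang_barsky(float x1, float y1, float x2, float y2, float& a1, float& a2) {
    float dx = x2 - x1, dy = y2 - y1;
    // One (p, q) pair per edge: left, right, bottom, top.
    float p[4] = { -dx, dx, -dy, dy };
    float q[4] = { x1 - XMIN, XMAX - x1, y1 - YMIN, YMAX - y1 };
    a1 = 0.0f; a2 = 1.0f;
    for (int k = 0; k < 4; ++k) {
        if (p[k] == 0.0f) {                 // segment parallel to this edge
            if (q[k] < 0.0f) return false;  //   and entirely outside of it
        } else {
            float a = q[k] / p[k];          // a real implementation can defer this divide
                                            // by cross-multiplying (minding the sign of p[k])
            if (p[k] < 0.0f) a1 = std::max(a1, a);   // entering intersection
            else             a2 = std::min(a2, a);   // leaving intersection
        }
    }
    return a1 <= a2;                        // otherwise the segment misses the window
}

int main() {
    float a1, a2;
    if (liang_barsky(-0.5f, 0.5f, 1.5f, 0.5f, a1, a2))
        std::printf("visible for a in [%.2f, %.2f]\n", a1, a2);  // prints [0.25, 0.75]
}
```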

24 Line clipping (Wikipedia)

26 Where polygon clipping is needed: non-rectangular windows, shadow generation, hidden-surface removal, antialiasing

27 Clipping a concave polygon is complex (it may produce several pieces) Clipping a convex polygon is easy  a single clipped polygon A concave polygon can be tessellated into convex polygons

28 Sutherland-Hodgman Any line-segment clipper can be applied  treated as a black box A convex polygon (including a rectangle) is the intersection of half-spaces Intersection test against each plane

29 Pipelined
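One Sutherland-Hodgman stage, sketched for a single half-plane (x >= XMIN); a full clipper chains four such stages, one per window edge, which is the pipelined structure just mentioned. The names and bounds are illustrative:

```cpp
#include <cstdio>
#include <vector>

struct Pt { float x, y; };

// Test against one clip boundary, here the half-plane x >= XMIN.  A full
// clipper runs the same stage for left, right, bottom and top in sequence,
// feeding the output of one stage into the next.
const float XMIN = 0.0f;
bool inside(const Pt& p) { return p.x >= XMIN; }
Pt intersect(const Pt& a, const Pt& b) {
    float t = (XMIN - a.x) / (b.x - a.x);            // where the boundary is crossed
    return { XMIN, a.y + t * (b.y - a.y) };
}

// One Sutherland-Hodgman stage: walk the polygon edges and emit vertices
// according to which endpoints lie inside the half-plane.
std::vector<Pt> clip_one_edge(const std::vector<Pt>& poly) {
    std::vector<Pt> out;
    for (std::size_t i = 0; i < poly.size(); ++i) {
        const Pt& cur  = poly[i];
        const Pt& prev = poly[(i + poly.size() - 1) % poly.size()];
        if (inside(cur)) {
            if (!inside(prev)) out.push_back(intersect(prev, cur)); // entering
            out.push_back(cur);
        } else if (inside(prev)) {
            out.push_back(intersect(prev, cur));                    // leaving
        }
    }
    return out;
}

int main() {
    std::vector<Pt> tri = { {-0.5f, 0.0f}, {1.0f, 0.0f}, {1.0f, 1.0f} };
    for (const Pt& p : clip_one_edge(tri)) std::printf("(%.2f, %.2f)\n", p.x, p.y);
}
```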

31 Early clipping can improve performance  bounding boxes & volumes  AABB (Axis-Aligned Bounding Box)  Bounding sphere  OBB (Oriented Bounding Box)  DOP (Discrete Oriented Polytope)  Convex hull  …and many more

32 (image courtesy of http://www.ray-tracing.ru)

33 Curves and surfaces approximated with line segments (or triangles/quads) “Convex hull property” used for parametric curves & surfaces Text  Text as bit patterns  clipping in the framebuffer  Text as geometric objects  polygon clipping  OpenGL allows both Scissoring: clipping in the framebuffer

35 Clipping against 3D view volume Extension of Cohen-Sutherland

36 Extension of Liang-Barsky Intersection calculation is simple due to normalization Additional clipping planes with arbitrary orientations supported

38 Pixels are square-shaped with integer coordinates In OpenGL the pixel center is located halfway between integers

39 Rasterization of a line segment Works directly only for small slopes One FP addition for each pixel
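A sketch of this incremental approach for slopes 0 <= m <= 1; the `write_pixel` output is a placeholder for the framebuffer write:

```cpp
#include <cmath>
#include <cstdio>

// Simple incremental rasterization for slopes 0 <= m <= 1:
// step one pixel in x and add the slope to y -- one FP addition per pixel.
// For steeper lines the roles of x and y would be swapped.
void draw_line_small_slope(int x1, int y1, int x2, int y2) {
    float m = float(y2 - y1) / float(x2 - x1);   // assumes x2 > x1 and 0 <= m <= 1
    float y = float(y1);
    for (int x = x1; x <= x2; ++x) {
        std::printf("write_pixel(%d, %d)\n", x, int(std::lround(y)));
        y += m;                                  // the single FP addition
    }
}

int main() { draw_line_small_slope(0, 0, 5, 2); }
```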

41 No FP calculation! Standard algorithm For integer endpoints (x1,y1)-(x2,y2)

42 How it works:  With slope 0<=m<=1  Assume we just colored the pixel (i+1/2,j+1/2)  We need to color either (i+3/2,j+1/2) or (i+3/2,j+3/2) depending on d=a-b  (x2-x1)(a-b) is integer  simpler calculation  d can be computed incrementally (next page)

43  d can be computed incrementally  If a_k > b_k (left)  a_{k+1}+m=a_k  a_{k+1}=a_k-m  b_k = b_{k+1}-m  b_{k+1}=b_k+m  If a_k<b_k (right)  1+a_k=a_{k+1}+m  a_{k+1}=a_k-(m-1)  1-b_k=m-b_{k+1}  b_{k+1}=b_k+(m-1)
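An integer-only sketch in the spirit of the derivation above, for integer endpoints and 0 <= m <= 1; the decision variable d stands in for (x2-x1)(a-b) and is updated with integer additions only:

```cpp
#include <cstdio>

// Integer-only Bresenham for integer endpoints and slope 0 <= m <= 1.
// The sign of d decides whether to stay on the same row or step up,
// and d is maintained incrementally without any FP arithmetic.
void bresenham(int x1, int y1, int x2, int y2) {
    int dx = x2 - x1, dy = y2 - y1;      // assumes x2 > x1 and 0 <= dy <= dx
    int d  = 2 * dy - dx;                // initial decision value
    int y  = y1;
    for (int x = x1; x <= x2; ++x) {
        std::printf("write_pixel(%d, %d)\n", x, y);
        if (d > 0) { y += 1; d -= 2 * dx; }   // upper candidate pixel chosen
        d += 2 * dy;                          // advance one pixel to the right
    }
}

int main() { bresenham(0, 0, 5, 2); }
```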

45 Inside-outside testing  Crossing (or odd-even) test  Winding test – how is the winding number computed?
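A sketch of the crossing (odd-even) test; a winding-number test would instead sum signed crossings, as noted in the comments:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

struct Pt { float x, y; };

// Crossing (odd-even) test: shoot a ray from the query point in +x and
// count how many polygon edges it crosses; an odd count means "inside".
// A winding test would sum signed crossings (+1 upward, -1 downward)
// and report "inside" when the total is nonzero.
bool inside_odd_even(const Pt& q, const std::vector<Pt>& poly) {
    bool inside = false;
    for (std::size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
        const Pt& a = poly[i];
        const Pt& b = poly[j];
        bool crosses = (a.y > q.y) != (b.y > q.y);           // edge spans the ray's y
        if (crosses) {
            float x_at_y = a.x + (q.y - a.y) * (b.x - a.x) / (b.y - a.y);
            if (q.x < x_at_y) inside = !inside;              // crossing lies to the right
        }
    }
    return inside;
}

int main() {
    std::vector<Pt> square = { {0,0}, {1,0}, {1,1}, {0,1} };
    std::printf("%s\n", inside_odd_even({0.5f, 0.5f}, square) ? "inside" : "outside");
}
```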

46 Tessellation supported by GLU functions  Triangles generated from the given contour  Different tessellations depending on the winding number (gluTessProperty)

47 “How to fill the interior of a polygon?” Three algorithms  Flood fill – starts from a “seed point”  Scanline fill  Odd-even fill Handling singularities (e.g., a scanline passing through a vertex): 1. Handle them separately 2. Perturb the vertex position 3. Use different coordinate values for pixels and vertices
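A minimal flood-fill sketch (stack-based, 4-connected, on a small illustrative grid); scanline and odd-even fills would instead walk scanlines and fill runs between edge crossings:

```cpp
#include <array>
#include <cstdio>
#include <stack>
#include <utility>

// Stack-based 4-connected flood fill: starting from a seed pixel, repaint
// every reachable pixel that still has the old color.
const int W = 8, H = 8;
using Grid = std::array<std::array<int, W>, H>;

void flood_fill(Grid& g, int sx, int sy, int old_color, int new_color) {
    if (old_color == new_color || g[sy][sx] != old_color) return;
    std::stack<std::pair<int, int>> todo;
    todo.push({sx, sy});
    while (!todo.empty()) {
        auto [x, y] = todo.top(); todo.pop();
        if (x < 0 || x >= W || y < 0 || y >= H || g[y][x] != old_color) continue;
        g[y][x] = new_color;                          // color this pixel of the region
        todo.push({x + 1, y}); todo.push({x - 1, y}); // 4-connected neighbours
        todo.push({x, y + 1}); todo.push({x, y - 1});
    }
}

int main() {
    Grid g{};                       // all pixels 0 (the "interior" color)
    flood_fill(g, 3, 3, 0, 1);      // seed at (3,3)
    std::printf("%d\n", g[0][0]);   // the whole connected region is now 1
}
```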

49 Object-space approach  For each object, determine & render the visible parts  Pairwise comparison  O(k^2) Image-space approach  For each pixel, determine the closest polygon  O(k)

50 “Spans” processed independently for lighting and depth calculations Overhead to generate spans  y-x algorithm

51 A.k.a. culling Fast calculation in the normalized view volume OpenGL back-face culling  glCullFace(face) & glEnable(GL_CULL_FACE)  Culling by signed area in window coordinates (WHY?)
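A sketch of the signed-area test behind this: after projection to window coordinates, a polygon whose front face is wound counter-clockwise appears clockwise (negative area) when seen from behind, so the sign alone decides the cull. The vertex data here is illustrative:

```cpp
#include <cstdio>

struct Pt { float x, y; };

// Signed area of a polygon in window coordinates (shoelace formula).
// With a counter-clockwise front-face convention, a positive area means
// front-facing and a negative area means back-facing (cullable).
float signed_area(const Pt* v, int n) {
    float a = 0.0f;
    for (int i = 0; i < n; ++i) {
        const Pt& p = v[i];
        const Pt& q = v[(i + 1) % n];
        a += p.x * q.y - q.x * p.y;
    }
    return 0.5f * a;
}

int main() {
    Pt ccw[3] = { {0,0}, {1,0}, {0,1} };   // counter-clockwise: area +0.5 -> keep
    Pt cw[3]  = { {0,0}, {0,1}, {1,0} };   // clockwise: area -0.5 -> cull
    std::printf("%.2f %.2f\n", signed_area(ccw, 3), signed_area(cw, 3));
}
```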

52 Most widely used (including OpenGL) Works in image space Depth information for each fragment  stored in the “depth buffer” (or “z-buffer”) Inaccurate depth for perspective after normalization, but ordering preserved Depth can be computed incrementally

53 Scan conversion with the z-buffer: three tasks performed simultaneously  Orthographic projection  Hidden-surface removal  Shading
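A minimal z-buffer sketch of the per-fragment test; the buffer sizes and fragment values are illustrative:

```cpp
#include <array>
#include <limits>

// Per-fragment depth test against a z-buffer: a fragment is written only if
// it is closer than what the buffer already holds for that pixel.  Inside a
// polygon's scan conversion, z itself can be updated incrementally along a
// span rather than recomputed per pixel.
const int W = 640, H = 480;
std::array<float, W * H>    depth_buffer;   // one depth per pixel
std::array<unsigned, W * H> color_buffer;   // one color per pixel

void clear_buffers() {
    depth_buffer.fill(std::numeric_limits<float>::infinity());
    color_buffer.fill(0);
}

void write_fragment(int x, int y, float z, unsigned color) {
    int idx = y * W + x;
    if (z < depth_buffer[idx]) {   // closer than everything seen so far
        depth_buffer[idx] = z;
        color_buffer[idx] = color;
    }
}

int main() {
    clear_buffers();
    write_fragment(10, 10, 0.50f, 0xff0000);   // red fragment is kept
    write_fragment(10, 10, 0.75f, 0x00ff00);   // farther green fragment is rejected
}
```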

56 Shade each pixel according to the percentage of the pixel covered by the ideal line  antialiasing by area averaging Polygons sharing a pixel  can be handled using the accumulation buffer

57 Time-domain (temporal) aliasing  Small moving objects can be missed  Use more than one ray per pixel

59 The range of colors (gamut) that different displays can produce differs How software-defined colors are mapped to the values of the display primaries differs The mapping between brightness values defined by the program and what is displayed is nonlinear

60 Basic assumption: the three color values we determine for each pixel correspond to the tristimulus values  RGB system Problem with the RGB system  The range of displayable colors (color gamut) differs for each medium (film, CRT, etc.)  device-independent graphics  Color-conversion matrix  Supported by OpenGL

61 Problems with the color-conversion approach  Color gamuts of different systems may not be the same  Conversion between RGB and CMYK is hard  Distance between colors in the color cube is not a measure of how far apart the colors are perceptually

62 Fraction of each color in the three primaries: t_i = T_i / (T1 + T2 + T3), so t1 + t2 + t3 = 1  the last value is implicit  can be plotted in 2D T1 + T2 + T3 is the intensity

63 Hue-Saturation-Lightness  Hue: color vector direction  Saturation: how far the given color is from the diagonal  Lightness: how far the given color is from the origin RGB color in polar coordinates

64 The human visual system perceives intensity in a logarithmic manner For uniformly spaced perceived brightness, the intensities need to be assigned exponentially
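One common way to realize this, sketched with made-up values for I0, Imax, and the number of levels n:

```cpp
#include <cmath>
#include <cstdio>

// Exponentially spaced intensities: with n levels between I0 and Imax,
// choosing I_k = r^k * I0 with r = (Imax / I0)^(1/(n-1)) makes successive
// steps equal in perceived (logarithmic) brightness.  The values of I0,
// Imax and n here are illustrative only.
int main() {
    const double I0 = 0.02, Imax = 1.0;
    const int n = 8;
    const double r = std::pow(Imax / I0, 1.0 / (n - 1));
    for (int k = 0; k < n; ++k)
        std::printf("level %d: %.4f\n", k, I0 * std::pow(r, k));
}
```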

65 Trade-off of spatial resolution against grayscale (or color) precision Dithering may introduce Moiré patterns

