The Graphics Rendering Pipeline

1 The Graphics Rendering Pipeline
CSC4820/6820 Computer Graphics Algorithms, Ying Zhu, Georgia State University

2 What is the graphics rendering pipeline?
The process of generating a 2D image from a virtual camera (eye), 3D objects, light sources, textures, etc. The 2D image is one frame of animation. Equivalently, it is the process of converting coordinates from each object's local 3D coordinate system to the final 2D window coordinate system. Both OpenGL and Direct3D implement part of the rendering pipeline.

3 Graphics Rendering Pipeline Stages
The rendering pipeline is normally divided into three conceptual stages: Application, Geometry, and Rasterizer. The output of one stage is the input of the next stage, hence the name "pipeline." Each stage also contains sub-stages.

4 OpenGL pipeline (overview)

5 OpenGL pipeline (overview)
[Diagram: the Application, Geometry, and Rasterizer stages of the OpenGL pipeline]

6 OpenGL pipeline (detailed view)

7 Why use pipeline architecture?
It is more efficient; think of an automobile assembly line. It exploits data parallelism (data parallelism vs. instruction parallelism). The pipeline architecture is the fundamental architecture for both 3D graphics software and hardware. Think of the GPU as an image assembly line.

8 Application Stage Overview
3D objects are created using modeling tools such as 3D Studio Max, Maya, etc. The objects are positioned in the 3D world and organized in a scene graph. They can be moved by transformations, which are performed on vertices using different types of transformation matrices. 3D transformations are specified in this stage but carried out in the Geometry stage.
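As a hedged illustration (not from the slides) of how a transformation matrix acts on a vertex: a 4x4 matrix is multiplied with the vertex written in homogeneous coordinates (x, y, z, 1). The helper names below are made up for the sketch.

    # Sketch: applying a 4x4 transformation matrix to a homogeneous vertex.
    # Helper names (mat_vec, translate) are illustrative, not from the slides.

    def mat_vec(m, v):
        """Multiply a 4x4 row-major matrix by a 4-component vertex."""
        return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

    def translate(tx, ty, tz):
        """Build a 4x4 translation matrix."""
        return [[1, 0, 0, tx],
                [0, 1, 0, ty],
                [0, 0, 1, tz],
                [0, 0, 0, 1]]

    vertex = [1.0, 2.0, 3.0, 1.0]                   # (x, y, z, 1)
    print(mat_vec(translate(5, 0, 0), vertex))      # -> [6.0, 2.0, 3.0, 1.0]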

9 Application Stage Overview
Other tasks in the application stage include handling user input, level-of-detail selection, and occlusion culling. The result is the set of 3D geometry objects fed to the next stage: points, lines, polygons, etc. These objects are eventually broken down into vertices, each represented by (x, y, z) coordinates.

10 Geometry Stage Overview
Responsible for the majority of per-polygon and per-vertex operations. Contains several sub-stages: Model Transformation, View Transformation (often combined with the model transformation), Lighting, Projection, Clipping, and Screen Mapping.

11 Coordinate Systems
Multiple Cartesian coordinate systems are used at different stages of the pipeline. 3D model coordinates (model space): each model is in its own coordinate system, with the origin at some point on the model. 3D world coordinates (world space): a unified world coordinate system with only one origin. 3D eye coordinates (view space): the camera (eye) is the origin, looking straight down the Z-axis. 2D screen coordinates (screen space): 3D coordinates are converted to 2D screen coordinates for display.

12 3D Graphics Pipeline Data Flow
Objects in the 3D scene are sequentially transformed through different coordinate systems as they proceed through the 3D pipeline.
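A minimal sketch of that flow, assuming the usual model/view/projection matrix conventions; the matrix names M, V, P are placeholders for the sketch, not taken from the slides. Each vertex is carried model space to world space to view space to clip space by successive matrix multiplications.

    # Sketch of the per-vertex transform chain; M, V, P stand for the model, view,
    # and projection matrices (identity placeholders here).

    def mat_vec(m, v):
        return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

    identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
    M = V = P = identity                       # real pipelines build these per frame

    v_model = [1.0, 1.0, 1.0, 1.0]             # model (object) space
    v_world = mat_vec(M, v_model)              # world space
    v_view  = mat_vec(V, v_world)              # view (eye) space
    v_clip  = mat_vec(P, v_view)               # clip space, before the divide by w
    print(v_clip)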

13 Model Transform
The model transform moves the vertices and normals of 3D objects from model space to world space. After the transform, all objects exist in the same coordinate system. The camera (eye) has a location and a direction in the world coordinate system.
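A hedged sketch of what the model transform can look like in code: a model matrix composed from scale, rotation, and translation, applied to a model-space vertex. The helper names are illustrative, not from the slides.

    import math

    # Sketch: model matrix = translate * rotate * scale, applied right-to-left.

    def mat_mul(a, b):
        return [[sum(a[r][k] * b[k][c] for k in range(4)) for c in range(4)]
                for r in range(4)]

    def mat_vec(m, v):
        return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

    def translate(tx, ty, tz):
        return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

    def rotate_y(a):
        c, s = math.cos(a), math.sin(a)
        return [[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]]

    def scale(s):
        return [[s, 0, 0, 0], [0, s, 0, 0], [0, 0, s, 0], [0, 0, 0, 1]]

    model = mat_mul(translate(10, 0, 0), mat_mul(rotate_y(math.pi / 2), scale(2)))
    v_world = mat_vec(model, [1.0, 0.0, 0.0, 1.0])   # the vertex, now in world space
    print(v_world)
    # Normals are transformed with the inverse transpose of the upper-left 3x3
    # block, so that non-uniform scaling does not skew them.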

14 View Transform
The camera and all the 3D objects are transformed by the view transform. The purpose is to place the camera at the origin, looking down the Z-axis. It is often combined with the model transform into a single model-view transform.
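A minimal look-at sketch, assuming the OpenGL-style convention where the camera ends up at the origin looking down the negative Z-axis; the function name and parameters are illustrative.

    import math

    # Sketch of a look-at view matrix (gluLookAt-style convention).

    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def look_at(eye, target, up):
        f = normalize([t - e for t, e in zip(target, eye)])   # forward
        s = normalize(cross(f, up))                           # right
        u = cross(s, f)                                       # corrected up
        # Rotate the world into camera axes, then move the eye to the origin.
        return [[ s[0],  s[1],  s[2], -dot(s, eye)],
                [ u[0],  u[1],  u[2], -dot(u, eye)],
                [-f[0], -f[1], -f[2],  dot(f, eye)],
                [    0,     0,     0,            1]]

    view = look_at(eye=[0.0, 0.0, 5.0], target=[0.0, 0.0, 0.0], up=[0.0, 1.0, 0.0])
    print(view)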

15 Projection
The projection transforms the view volume into a unit cube with its extreme points at (-1, -1, -1) and (1, 1, 1). The purpose is to simplify clipping. There are two projection methods: orthographic projection, used when realism is not a concern, and perspective projection, used when realism counts.
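A sketch of the two projection matrices under OpenGL-style conventions; the parameter names follow glOrtho and gluPerspective, which is an assumption, not something stated in the slides. After the divide by w, points inside the view volume land in the cube with corners (-1, -1, -1) and (1, 1, 1), which is what makes clipping simple.

    import math

    # Orthographic projection of the box [l, r] x [b, t] x [n, f] onto [-1, 1]^3.
    def orthographic(l, r, b, t, n, f):
        return [[2 / (r - l), 0, 0, -(r + l) / (r - l)],
                [0, 2 / (t - b), 0, -(t + b) / (t - b)],
                [0, 0, -2 / (f - n), -(f + n) / (f - n)],
                [0, 0, 0, 1]]

    # Perspective projection (gluPerspective-style); the divide by w happens later.
    def perspective(fovy, aspect, n, f):
        g = 1.0 / math.tan(fovy / 2.0)        # cotangent of half the vertical FOV
        return [[g / aspect, 0, 0, 0],
                [0, g, 0, 0],
                [0, 0, (f + n) / (n - f), 2 * f * n / (n - f)],
                [0, 0, -1, 0]]

    print(perspective(math.radians(60), 16 / 9, 0.1, 100.0))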

16 Lighting
A lighting equation is used to calculate a color at each vertex of a 3D object that is affected by lighting. These lighting equations often have little to do with how light behaves in the real world. Lighting can be evaluated per vertex or per pixel.
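The slides leave the equation unspecified; a common textbook choice is Lambertian diffuse plus Blinn-Phong specular, sketched below. Function and parameter names are made up for the example.

    import math

    # Per-vertex lighting sketch: ambient + Lambertian diffuse + Blinn-Phong specular.

    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def shade_vertex(normal, to_light, to_eye, diffuse, light,
                     specular=(1.0, 1.0, 1.0), shininess=32.0, ambient=0.1):
        n, l, v = normalize(normal), normalize(to_light), normalize(to_eye)
        h = normalize([a + b for a, b in zip(l, v)])               # half vector
        nl = max(dot(n, l), 0.0)                                   # diffuse term
        nh = max(dot(n, h), 0.0) ** shininess if nl > 0 else 0.0   # specular term
        return [min(1.0, ambient * d + nl * d * li + nh * s * li)
                for d, li, s in zip(diffuse, light, specular)]

    print(shade_vertex(normal=[0, 0, 1], to_light=[0, 1, 1], to_eye=[0, 0, 1],
                       diffuse=[1.0, 0.2, 0.2], light=[1.0, 1.0, 1.0]))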

17 Shading
Each 3D object surface is divided into many triangles. The colors computed at the vertices of a triangle are interpolated over the triangle. Three shading methods: flat shading (operates per triangle), Gouraud shading (operates per vertex), and Phong shading (operates per pixel). Shading can also be done with programmable shaders.
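A sketch of the Gouraud-style interpolation step: colors computed at the three vertices are blended across the triangle with barycentric weights (b0 + b1 + b2 = 1). Phong shading interpolates the normal the same way and evaluates the lighting per pixel instead.

    # Gouraud-style color interpolation with barycentric weights.
    def interpolate(c0, c1, c2, b0, b1, b2):
        return [b0 * x + b1 * y + b2 * z for x, y, z in zip(c0, c1, c2)]

    red, green, blue = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]
    # A pixel at the triangle's centroid gets an equal blend of the vertex colors.
    print(interpolate(red, green, blue, 1/3, 1/3, 1/3))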

18 Clipping
Only the primitives wholly or partially inside the view volume need to be passed on to the rasterizer stage. Primitives (lines or polygons) that are partially inside the view volume require clipping.
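A hedged sketch of the classification: in clip space (before the divide by w) a vertex is inside the view volume when -w <= x, y, z <= w. A real clipper is more careful (it only rejects a primitive outright when all vertices are outside the same plane); this simplified version just shows the accept / reject / clip decision.

    # Simplified clip-space classification of a primitive.
    def vertex_inside(v):
        x, y, z, w = v
        return -w <= x <= w and -w <= y <= w and -w <= z <= w

    def classify(vertices):
        inside = [vertex_inside(v) for v in vertices]
        if all(inside):
            return "accept"    # wholly inside: pass through unchanged
        if not any(inside):
            return "reject"    # likely outside (a real clipper checks per plane)
        return "clip"          # partially inside: clip against the volume's planes

    triangle = [(0, 0, 0.5, 1), (2, 0, 0.5, 1), (0, 2, 0.5, 1)]
    print(classify(triangle))  # -> "clip"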

19 Screen Mapping
Primitives that survive clipping are passed on to the screen mapping stage. The X and Y coordinates of each primitive are transformed to screen coordinates. The Z coordinate is not affected and is kept for the depth buffer check.
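A minimal sketch of the screen-mapping (viewport) transform, assuming normalized device coordinates in [-1, 1]: x and y are scaled into window pixels and z is passed through for the later depth test. The function and parameter names are illustrative.

    # Viewport transform from normalized device coordinates to window coordinates.
    def to_screen(ndc, width, height):
        x, y, z = ndc
        sx = (x + 1.0) * 0.5 * width
        sy = (y + 1.0) * 0.5 * height   # flip y here if the window origin is top-left
        return (sx, sy, z)

    print(to_screen((0.0, 0.0, 0.25), width=800, height=600))  # -> (400.0, 300.0, 0.25)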

20 Rasterizer Stage Overview
In this stage, all primitives are converted into pixels in the window. The goal of this stage is to assign correct colors to the pixels. It contains some or all of the following sub-stages: texture mapping, fog, translucency test, depth buffering, and anti-aliasing.

21 Rasterization

22 Texturing
Texture mapping means "attaching" an image onto a 3D object. Textures are important for bringing realism to a 3D scene: they provide surface detail and add a sense of depth. Advanced texture mapping techniques include multi-texturing, bump mapping, environment mapping, and pixel shaders.
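A sketch of the core lookup in texture mapping: a (u, v) coordinate in [0, 1]^2 picks a texel out of the image. Nearest-neighbor sampling is shown for simplicity; real hardware also filters (bilinear, mipmaps) and wraps or clamps the coordinates.

    # Nearest-neighbor texture lookup from (u, v) in [0, 1]^2.
    def sample_nearest(texture, u, v):
        height, width = len(texture), len(texture[0])
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        return texture[y][x]

    # A 4x4 black-and-white checkerboard as the texture image.
    checker = [[(255, 255, 255) if (x + y) % 2 == 0 else (0, 0, 0) for x in range(4)]
               for y in range(4)]
    print(sample_nearest(checker, 0.1, 0.1))   # -> (255, 255, 255)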

23 Fog
An optional stage, but it helps set the mood. It also gives a scene an additional sense of depth and added realism; for example, it allows distant objects to gracefully "fade away" rather than abruptly pop out of the scene.

24 Translucency Tests
Also called the alpha test. Used to create effects such as glass, water, see-through views in CAD modeling, and translucent or transparent materials.

25 Depth Buffering
Also called the Z-buffer algorithm. Used for visibility checks: when multiple primitives cover a certain pixel, the color of the pixel is the color of the primitive that is closest to the camera. The only exception is when primitives are transparent. This algorithm allows primitives to be rendered in any order.
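A minimal sketch of the Z-buffer algorithm described above: every pixel stores the depth of the nearest fragment drawn so far, and a new fragment is kept only if it is closer. Buffer sizes and colors are arbitrary.

    # Z-buffer sketch: a per-pixel depth test decides which fragment stays visible.
    WIDTH, HEIGHT = 4, 3
    FAR = float("inf")

    depth_buffer = [[FAR] * WIDTH for _ in range(HEIGHT)]
    color_buffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]

    def write_fragment(x, y, z, color):
        """Keep the fragment only if it is nearer than the stored depth."""
        if z < depth_buffer[y][x]:
            depth_buffer[y][x] = z
            color_buffer[y][x] = color

    # Fragments may arrive in any order; the nearer one (smaller z) ends up visible.
    write_fragment(1, 1, 0.8, (255, 0, 0))   # far red fragment
    write_fragment(1, 1, 0.3, (0, 0, 255))   # nearer blue fragment replaces it
    print(color_buffer[1][1])                # -> (0, 0, 255)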

26 Anti-aliasing
The aliasing effect is the jagged or stair-stepped look of diagonal or curved lines in computer graphics. Anti-aliasing algorithms sample (examine and evaluate) the colors and shades of pixels adjoining curved or diagonal lines to present smoother-looking lines.
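A sketch of one simple anti-aliasing strategy, supersampling: shade several sample points inside each pixel and average them, so pixels that straddle an edge get an intermediate color. The toy shading function and the sample count are arbitrary choices for the example.

    # Supersampling sketch: average several shading samples per pixel.
    def supersample(shade, px, py, samples_per_axis=4):
        n = samples_per_axis
        total = [0.0, 0.0, 0.0]
        for i in range(n):
            for j in range(n):
                color = shade(px + (i + 0.5) / n, py + (j + 0.5) / n)
                total = [t + c for t, c in zip(total, color)]
        return [t / (n * n) for t in total]

    # Toy "scene": white to the left of x = 2.5, black to the right.
    edge = lambda x, y: (1.0, 1.0, 1.0) if x < 2.5 else (0.0, 0.0, 0.0)
    print(supersample(edge, 2, 0))   # a pixel straddling the edge comes out gray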

27 Display
Finally, the final 2D image is generated, rendered to the frame buffer, and displayed on the screen. A double-buffering technique may be used to reduce flicker: the front buffer is displayed while the back buffer is used for rendering the next frame. It is a long trip down the graphics rendering pipeline; if the application runs at 60 frames per second, this whole process repeats roughly every 17 ms.

28 Summary
The 3D graphics pipeline is the underlying tool for graphics rendering. It has three major stages: Application, Geometry, and Rasterizer.

