Computer Graphics Module Review CO2409 Computer Graphics

Lecture Contents
1. 2D Graphics & Geometry
2. 3D Geometry & Maths
3. Rendering Pipeline – Key Concepts
4. Programmable Pipeline / Shaders
5. Depth / Stencil Buffers & Shadows
6. Animation

Pixels & Colour
A computer display is made of a grid of small rectangular areas called pixels
Pixel colour is usually specified as red, green and blue components:
–Integers (0-255) or floats (0.0 to 1.0)
The RGB ‘colour space’ is a cube
Another colour space is HLS:
–Hue = colour from the spectrum, Lightness = brightness of the colour, Saturation = intensity of the colour
–Can be pictured as a double cone
–More intuitive for artists
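As a small illustration of the two representations, a minimal C++ sketch of converting between them (the helper names are just for this example):

```cpp
#include <cstdint>

struct ColourF { float r, g, b; };  // components in the 0.0 to 1.0 range

// Map integer components (0-255) to float components (0.0 to 1.0)
ColourF toFloatColour(uint8_t r, uint8_t g, uint8_t b)
{
    return { r / 255.0f, g / 255.0f, b / 255.0f };
}

// Map a float component back to 0-255, rounding to the nearest integer
uint8_t toByteComponent(float c)
{
    return static_cast<uint8_t>(c * 255.0f + 0.5f);
}
```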

Bitmaps / Sprites / Alpha Channels
A bitmap is a rectangle of pixels stored off-screen for use in an image
A sprite describes a particular use of bitmaps:
–When used as distinct elements in a larger scene
As well as RGB colours, we can store per-pixel values specifying transparency:
–Alpha data, or the alpha channel, making RGBA
Can use alpha to blend pixels onto the viewport:
FinalColour = Alpha * SourceColour + (1 - Alpha) * ViewportColour
Alpha can also be used for alpha testing:
–Used for cutout sprites

Further Blending
Other ways of blending pixels onto the viewport:
The multiplicative blending equation is:
FinalColour = SourceColour * ViewportColour
–A darkening effect, suitable for representing glass, shadows, smoke, etc.
The additive blending equation is:
FinalColour = SourceColour + ViewportColour
–A lightening effect, mainly used for the representation of lights
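The three blend equations above are configured in the graphics API rather than written by hand, but as a rough sketch they amount to this per colour channel (all values in the 0.0 to 1.0 range):

```cpp
// Standard alpha blending: mix source and viewport colour by alpha
float alphaBlend(float source, float viewport, float alpha)
{
    return alpha * source + (1.0f - alpha) * viewport;
}

// Multiplicative blending: always darkens (glass, shadows, smoke)
float multiplicativeBlend(float source, float viewport)
{
    return source * viewport;
}

// Additive blending: always lightens (light effects), clamped to the valid range
float additiveBlend(float source, float viewport)
{
    float result = source + viewport;
    return result > 1.0f ? 1.0f : result;
}
```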

Basic Geometry Definitions
In both 2D and 3D we have used these definitions:
–A vertex: a single point defined by coordinates on the axes of a coordinate system, e.g. A(10, 20)
–An edge: a straight line joining two vertices, e.g. AB
–A vector: a movement within a coordinate system, e.g. AB (from A to B) or V(5, 10)
A normalised (unit) vector is any vector whose length is 1
–A polygon: a single closed loop of edges, e.g. ABC (the edge from C back to A is implied)
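For example, a minimal sketch of measuring and normalising a 2D vector:

```cpp
#include <cmath>

struct Vector2 { float x, y; };

// Length (magnitude) of a vector, from Pythagoras
float length(const Vector2& v)
{
    return std::sqrt(v.x * v.x + v.y * v.y);
}

// Scale a vector to length 1 (normalise it); assumes v is not zero-length
Vector2 normalise(const Vector2& v)
{
    float len = length(v);
    return { v.x / len, v.y / len };
}
```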

Coordinate Systems
A coordinate system is a particular set of choices for:
–The location of the origin
–The orientation and scale of the axes
We later called this a space:
–E.g. model space, world space
A vertex will have different coordinates in two different coordinate systems
The viewport is a particular coordinate system that corresponds to the visible display area

Rendering
Rendering is converting geometry into pixels
In general, rendering is a two-stage process:
–Convert the geometry into 2D viewport space (geometry transformation / vertex shaders)
–Set the colour of the pixels corresponding to this converted geometry (rasterising / pixel shaders)
We looked briefly at 2D rendering:
–Render 2D lines by stepping through the pixels (see the sketch below)
–Render polygons with multiple lines
–Render circles with an equation + many lines
–The detail of filling polygon pixels is beyond the scope of the module
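As an illustration of "stepping through the pixels", a minimal DDA-style line sketch (the module may have covered a different algorithm, e.g. Bresenham's; setPixel just stands in for writing to the viewport):

```cpp
#include <cmath>
#include <cstdio>

// Stand-in for writing a pixel to the viewport
void setPixel(int x, int y) { std::printf("pixel (%d, %d)\n", x, y); }

// Step along the line in small increments so no pixel is skipped
void drawLine(float x0, float y0, float x1, float y1)
{
    float dx = x1 - x0, dy = y1 - y0;
    float longest = std::fabs(dx) > std::fabs(dy) ? std::fabs(dx) : std::fabs(dy);
    int steps = static_cast<int>(std::ceil(longest));
    if (steps == 0) { setPixel(static_cast<int>(x0), static_cast<int>(y0)); return; }

    float xStep = dx / steps, yStep = dy / steps;  // at most 1 pixel per step
    float x = x0, y = y0;
    for (int i = 0; i <= steps; ++i)
    {
        setPixel(static_cast<int>(x + 0.5f), static_cast<int>(y + 0.5f));
        x += xStep;
        y += yStep;
    }
}
```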

Maths/C++ for Graphics Apps
Be aware of numeric limitations in C++, e.g.:
–int limits can be exceeded
–float / double have limited precision
–Repeated calculations with float can build up errors
–Other languages have similar limitations
C++ automatically converts between numeric types, issuing warnings when it does:
–Don’t ignore them; the conversion may not be what is required
Several maths functions are used for graphics:
–Max, min, remainders, modulus / absolute value, powers, cos, sin…
–Know the library functions used
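Two concrete examples of float's limited precision, both safe to try:

```cpp
#include <cstdio>

int main()
{
    // 0.1 cannot be stored exactly in a float
    float tenth = 0.1f;
    std::printf("%.9f\n", tenth);        // prints 0.100000001, not 0.1

    // 2^24 is the last point where float can store every whole number
    float big = 16777216.0f;             // 2^24
    std::printf("%.1f\n", big + 1.0f);   // prints 16777216.0 - the +1 is lost
    return 0;
}
```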

3D Geometry - Meshes / Normals
A mesh is a set of polygons making up a 3D object
A mesh is usually defined together with a set of face and/or vertex normals
A face normal is a normalised vector at right angles to a polygon:
–Used to specify the plane of the polygon
–But not especially common
A vertex normal can be the average of the face normals of all the polygons containing that vertex:
–Or a vertex can have multiple normals for sharp edges
–Used frequently, most notably for lighting
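A sketch of calculating a face normal with the cross product; averaging these across the polygons sharing a vertex (then re-normalising) gives a vertex normal:

```cpp
#include <cmath>

struct Vector3 { float x, y, z; };

Vector3 subtract(const Vector3& a, const Vector3& b)
{
    return { a.x - b.x, a.y - b.y, a.z - b.z };
}

// Cross product: a vector at right angles to both inputs
Vector3 cross(const Vector3& a, const Vector3& b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

Vector3 normalise(const Vector3& v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Face normal of triangle (v0, v1, v2), assuming counter-clockwise winding
Vector3 faceNormal(const Vector3& v0, const Vector3& v1, const Vector3& v2)
{
    return normalise(cross(subtract(v1, v0), subtract(v2, v0)));
}
```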

Matrices
A matrix (plural: matrices) is a rectangular table of numbers with special rules of arithmetic
A coordinate system matrix is used to represent a model’s position/orientation
Transformation matrices are used to convert between spaces, or to move/orient models:
–E.g. the world matrix converts from model space to world space
–Basic transforms: translation, rotation, scaling
Of central importance to 3D graphics:
–There will always be exam questions on matrices
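A sketch of the central matrix operation, transforming a point by a 4x4 matrix (row-vector convention, as in DirectX, with the position in the bottom row):

```cpp
struct Vector3 { float x, y, z; };

// 4x4 matrix, row-major: rows 0-2 are the transformed axes, row 3 the position
struct Matrix4x4 { float m[4][4]; };

// Transform a point (treated as having w = 1) by a matrix - the operation
// behind model->world, world->camera and similar space conversions
Vector3 transformPoint(const Matrix4x4& mat, const Vector3& p)
{
    return {
        p.x * mat.m[0][0] + p.y * mat.m[1][0] + p.z * mat.m[2][0] + mat.m[3][0],
        p.x * mat.m[0][1] + p.y * mat.m[1][1] + p.z * mat.m[2][1] + mat.m[3][1],
        p.x * mat.m[0][2] + p.y * mat.m[1][2] + p.z * mat.m[2][2] + mat.m[3][2],
    };
}
```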

DirectX / Rendering Pipeline
Graphics APIs perform a ‘pipeline’ of operations:
–The module used the DirectX 10 pipeline
Input is 3D model geometry data:
–Geometry stored as lists of vertex data
–Customised for different techniques
Output is viewport pixels
Pipeline process:
–Convert to world then viewport space, applying lighting
–Scan through the resultant 2D polygons, one pixel at a time
–Render pixels using light colours, textures and blending

World Matrix
Mesh vertices are stored in model space:
–The local space for the mesh, with a convenient origin and orientation
For each model we store a world matrix that transforms the model geometry into world space
This defines the position and orientation of the model
It has a special form containing the local axes of the model
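A sketch of that special form, reusing the Vector3 / Matrix4x4 types from the matrix sketch above: the model's local axes (expressed in world space) fill the first three rows, and the position fills the last:

```cpp
// Build a world matrix from the model's local axes and position
// (Vector3 / Matrix4x4 as defined in the earlier matrix sketch)
Matrix4x4 makeWorldMatrix(const Vector3& xAxis, const Vector3& yAxis,
                          const Vector3& zAxis, const Vector3& position)
{
    return {{
        { xAxis.x,    xAxis.y,    xAxis.z,    0.0f },  // local X axis
        { yAxis.x,    yAxis.y,    yAxis.z,    0.0f },  // local Y axis
        { zAxis.x,    zAxis.y,    zAxis.z,    0.0f },  // local Z axis
        { position.x, position.y, position.z, 1.0f },  // origin of model space
    }};
}
```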

View Matrix
For each camera we store a view matrix that transforms world space geometry into camera space
Camera space defines the world as seen by the camera:
–X right, Y up and Z in the direction it is facing
For technical reasons this matrix is actually the inverse of a normal world matrix:
–But it has a similar form and can be used in a similar way
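A sketch of why this inverse is cheap to compute: for a world matrix with no scaling (rotation + position only), inverting amounts to transposing the rotation part and re-deriving the position row. Types are from the matrix sketch above.

```cpp
// Invert a rotation+translation world matrix to get a view matrix
// (assumes no scaling; Matrix4x4 as in the earlier matrix sketch)
Matrix4x4 inverseRigid(const Matrix4x4& w)
{
    Matrix4x4 v = {};
    // The inverse of a pure rotation is its transpose
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            v.m[i][j] = w.m[j][i];
    // New position row: minus the old position, transformed by the transposed rotation
    for (int j = 0; j < 3; ++j)
        v.m[3][j] = -(w.m[3][0] * v.m[0][j] +
                      w.m[3][1] * v.m[1][j] +
                      w.m[3][2] * v.m[2][j]);
    v.m[3][3] = 1.0f;
    return v;
}
```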

Projection Matrix
Cameras also have a second matrix, the projection matrix, defining how camera space geometry is projected into 2D:
–It includes the field of view and viewport distance of the camera
–Viewport distance is also called the near clip distance
This matrix flattens camera space geometry into 2D viewport space:
–The projected 2D geometry is then scaled/offset into pixel coordinates
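A sketch of what the projection stage amounts to for one camera-space point (in practice this is a matrix multiply followed by a divide by w; fovY is the vertical field of view in radians):

```cpp
#include <cmath>

// Project a camera-space point (x, y, z) into 2D projection space (-1 to 1).
// Assumes z > 0, i.e. the point is in front of the camera.
void project(float x, float y, float z, float fovY, float aspect,
             float& outX, float& outY)
{
    float scaleY = 1.0f / std::tan(fovY * 0.5f);  // wider FOV -> smaller scale
    float scaleX = scaleY / aspect;
    outX = (x * scaleX) / z;  // the divide by depth makes distant objects smaller
    outY = (y * scaleY) / z;
    // A final scale/offset then maps the -1 to 1 range into pixel coordinates
}
```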

Lighting / Shading
Geometry colour can come from:
–Vertex or face colours and/or dynamic lighting
Two shading modes were used:
–Hard or smooth edges
Three basic light types:
–Directional, point and spot
Lighting is calculated by combining:
–Ambient light: global background illumination
–Diffuse light: direct lighting
–Specular light: reflection of the light source (highlights)
–The equations are often examined
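A sketch of the diffuse and specular terms for one light, here in the halfway-vector (Blinn-Phong) form; the module's exact equations may differ slightly. All direction vectors are assumed to be unit length.

```cpp
#include <algorithm>
#include <cmath>

struct Vector3 { float x, y, z; };
float dot(Vector3 a, Vector3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vector3 add(Vector3 a, Vector3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vector3 normalise(Vector3 v)
{
    float k = 1.0f / std::sqrt(dot(v, v));
    return { v.x * k, v.y * k, v.z * k };
}

// n = surface normal, l = direction to light: facing the light = full diffuse
float diffuseTerm(Vector3 n, Vector3 l)
{
    return std::max(0.0f, dot(n, l));
}

// v = direction to viewer; a higher power gives a tighter, shinier highlight
float specularTerm(Vector3 n, Vector3 l, Vector3 v, float power)
{
    Vector3 h = normalise(add(l, v));  // halfway vector between light and viewer
    return std::pow(std::max(0.0f, dot(n, h)), power);
}

// Per channel: Final = Material * (Ambient + LightColour * diffuseTerm)
//                    + LightColour * specularTerm
```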

Textures
A texture is a bitmap wrapped around a model
The wrapping is specified by assigning a texture coordinate (UV) to each vertex in the geometry:
–This is texture mapping
The UVs for the texture range from 0-1:
–UVs outside this range will be wrapped, mirrored, etc., depending on the texture addressing mode
Each pixel in the bitmap appears as a square on the geometry called a texel
Textures and texels can be smoothed using texture filtering and mip-mapping
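As a sketch of two of the addressing modes, here is how an out-of-range U (or V) coordinate could be brought back into the 0-1 range:

```cpp
#include <cmath>

// Wrap mode: the texture tiles, so 1.3 -> 0.3 and -0.2 -> 0.8
float wrapUV(float u)
{
    return u - std::floor(u);
}

// Mirror mode: alternate copies are reflected, so 1.3 -> 0.7 and -0.2 -> 0.2
float mirrorUV(float u)
{
    float t = u - std::floor(u / 2.0f) * 2.0f;  // bring into the 0-2 range
    return t <= 1.0f ? t : 2.0f - t;            // reflect the second half
}
```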

Vertex / Index Data
Customised vertex data is stored in vertex buffers:
–Coordinate (always), normal, UVs, colour, etc.
Can use vertex data alone to store geometry:
–Each triplet of vertices is a triangle (triangle list)
More efficient to use an additional index buffer:
–Store the set of unique vertices only
–Define the triangles using triplets of indices
Can also use triangle strips:
–The first triplet defines the first triangle
–Each further vertex/index is used with the previous two to form a further triangle
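For example, a square as an indexed triangle list needs only 4 unique vertices and 6 indices, where a plain triangle list would duplicate two vertices (the Vertex layout here is just illustrative):

```cpp
struct Vertex { float x, y, z; float u, v; };  // position + UV

// The four unique corners of a unit square
Vertex vertices[4] = {
    { 0, 0, 0,  0, 0 },  // 0: bottom-left
    { 1, 0, 0,  1, 0 },  // 1: bottom-right
    { 1, 1, 0,  1, 1 },  // 2: top-right
    { 0, 1, 0,  0, 1 },  // 3: top-left
};

// Each triplet of indices defines one triangle; vertices 0 and 2 are reused
unsigned short indices[6] = {
    0, 1, 2,   // first triangle
    0, 2, 3,   // second triangle
};
```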

Programmable Pipeline / Shaders
Three pipeline stages can be programmed directly:
–The vertex, geometry and pixel processing stages
–We did not look at programmable geometry processing
Programs are usually written in a high-level language (HLSL):
–Called shaders: the vertex shader and the pixel shader
We write a shader for every rendering technique needed:
–Shaders can be compiled at runtime and are loaded as needed

Vertex Shaders
We define shaders in terms of:
–Inputs - from previous stages
–Outputs - to later stages
Shaders have a “typical” usage, but are actually very flexible
Vertex shaders operate on each vertex in the original 3D geometry. Their typical usage is to:
–Transform and project the vertex into viewport space
–Perhaps apply animation or lighting to the vertex
At a minimum, a vertex shader expects vertex coordinates as input, but may have other input too:
–Normals, UVs, vertex colours, etc.
A vertex shader must at the very least output a viewport position for the current vertex, although it will often output much more
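Real vertex shaders are written in HLSL, but their minimal job can be sketched in C++ using the Vector3 / Matrix4x4 / transformPoint pieces from the matrix sketch above (a full implementation works with 4-component positions and leaves the divide by w to the hardware):

```cpp
struct VertexIn  { Vector3 position; /* often also normal, UVs, colour... */ };
struct VertexOut { Vector3 viewportPosition; /* plus whatever later stages need */ };

VertexOut vertexShaderSketch(const VertexIn& in, const Matrix4x4& world,
                             const Matrix4x4& view, const Matrix4x4& projection)
{
    VertexOut out;
    Vector3 p = transformPoint(world, in.position);       // model -> world space
    p = transformPoint(view, p);                          // world -> camera space
    out.viewportPosition = transformPoint(projection, p); // camera -> viewport
    return out;
}
```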

Pixel Shaders
Pixel shaders operate on each pixel in the final 2D polygons. Their typical usage is to:
–Sample (and filter) any textures applied to the polygon
–Combine the texture colour with the existing polygon colour (from lighting and/or geometry colours)
Input for a pixel shader is usually the output from the vertex shader, after rasterisation
A pixel shader must at least output a final pixel colour to be drawn/blended with the viewport
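And the pixel shader's typical job, again sketched in C++ (sampleTexture is a stand-in for the API's texture sampling):

```cpp
struct Colour { float r, g, b, a; };

// Stand-in for sampling (and filtering) a texture at UV coordinates
Colour sampleTexture(float u, float v);

// Combine the sampled texture colour with the interpolated lighting colour
Colour pixelShaderSketch(const Colour& lightingColour, float u, float v)
{
    Colour tex = sampleTexture(u, v);
    return { tex.r * lightingColour.r,   // modulate texture by lighting
             tex.g * lightingColour.g,
             tex.b * lightingColour.b,
             tex.a };
}
```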

Advanced Shaders
Advanced shaders can be used to implement high-quality lighting and rendering techniques
A key technique is per-pixel lighting:
–Vertex lighting exhibits problems on large polygons
–Instead, have the vertex shader pass the vertex position and normal on to the pixel shader
–These are interpolated for the pixel shader, which then uses the normal lighting equations on them
We have covered several other shader techniques:
–Specular mapping, normal mapping, parallax mapping, cel shading

Graphics Architecture
The basic graphics architecture of all modern PCs and game consoles is similar
Comprised of a main system and a graphics unit:
–With one processor each (CPU & GPU)
–Fast local RAM for each processor
The interface between these two systems is often slow
The GPU is a dedicated graphics microprocessor:
–Much faster than a CPU for graphics-related algorithms
The GPU runs at the same time as the CPU:
–These are concurrent systems

Depth Buffers – Z-Buffers
A depth buffer is a rectangular array of depth values that matches the back-buffer
Used to sort rendered primitives in depth order
A z-buffer stores the viewport space Z coordinate of each pixel in the back-buffer:
–Calculated per vertex (range 0.0 to 1.0)
–Interpolated per pixel
Pixels are only rendered if their z value is less than the existing value in the z-buffer
Z values are not distributed evenly:
–Pixels at different depths can get the same z value
–This causes visual artefacts (z-fighting)
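A sketch of the per-pixel test (buffer size and layout are illustrative):

```cpp
const int WIDTH = 1280, HEIGHT = 720;
float zBuffer[WIDTH * HEIGHT];  // one depth per back-buffer pixel,
                                // cleared to 1.0 (the far plane) each frame

// Returns true if the pixel should be rendered, updating the stored depth
bool depthTest(int x, int y, float pixelZ)  // pixelZ in 0.0 (near) to 1.0 (far)
{
    float& stored = zBuffer[y * WIDTH + x];
    if (pixelZ < stored)      // nearer than anything drawn here so far?
    {
        stored = pixelZ;      // record the new nearest depth
        return true;          // render this pixel
    }
    return false;             // hidden behind existing geometry
}
```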

Stencil Buffers
The stencil buffer is a buffer of additional per-pixel values associated with the depth buffer:
–Usually 1 or 8 bits embedded in the depth values
–A mask controlling drawing to the back-buffer
Stencil values are tested before writing each pixel:
–The result determines whether to write to the back-buffer and/or stencil buffer
Customisable tests make this a very flexible system:
–We used it for a mirror

Rendering to Textures
Some special effects can be performed by rendering the scene into a texture:
–Rather than into the back-buffer/viewport
The process needs two (or more) rendering passes:
–Set up a special render target texture and render the scene onto it
–Then render the scene again normally (to the back-buffer), but with some polygons using the render texture
Quality is limited by the texture size

Shadow Techniques
Basic shadows (e.g. blob shadows) are easy and useful
Advanced techniques may be static or dynamic
Static shadow maps are pre-calculated darkening textures applied over the main model textures:
–Use high-quality (slow) techniques offline to generate these textures
–Called baking the shadows
Can also sample static lighting at points where dynamic models will be:
–This can help light them at run-time

Dynamic Shadow Mapping
Dynamic shadow mapping is an extension of render-to-texture techniques, used for shadows
The scene is rendered into a texture (a shadow map), but from the point of view of the light:
–Treat the light like a camera
Then the scene is rendered normally, but each pixel is first tested against the shadow map:
–The pixel is not lit if it is in shadow from the light
Spotlights are straightforward; point / directional lights are more complex
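The per-pixel test can be sketched as a single comparison (the small bias is a common fix for self-shadowing artefacts, sometimes called shadow acne):

```cpp
// depthFromLight: this pixel's distance from the light
// shadowMapDepth: the nearest depth the light "saw" in that direction
bool inShadow(float depthFromLight, float shadowMapDepth)
{
    const float bias = 0.001f;  // tolerance against precision errors
    // Something nearer to the light was rendered into the shadow map,
    // so this pixel is behind it and receives no light
    return depthFromLight - bias > shadowMapDepth;
}
```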

Rigid Body Animation
Rigid body animation concerns models made of several distinct parts:
–We assume that the parts form a hierarchy
Each part in the hierarchy has a world matrix:
–Defining its position and orientation - just like a model
Each part’s matrix is stored relative to its parent:
–And defines the joint with the parent
This is called a matrix or transform hierarchy
Can be rendered using a recursive process (see the sketch below)
Or the parts can be stored in a depth-first list and rendered using an iterative process and a stack
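A sketch of the recursive version, reusing the Matrix4x4 type from the earlier matrix sketch (multiply combines two transforms and renderPart issues the draw; both are stand-ins):

```cpp
#include <vector>

struct Part;
Matrix4x4 multiply(const Matrix4x4& a, const Matrix4x4& b);       // stand-in
void renderPart(const Part& part, const Matrix4x4& worldMatrix);  // stand-in

struct Part
{
    Matrix4x4 relativeMatrix;      // transform relative to the parent (the joint)
    std::vector<Part*> children;
};

void renderHierarchy(const Part& part, const Matrix4x4& parentWorld)
{
    // Absolute matrix = this part's relative matrix combined with the parent's
    Matrix4x4 world = multiply(part.relativeMatrix, parentWorld);
    renderPart(part, world);

    for (const Part* child : part.children)
        renderHierarchy(*child, world);   // children build on this part's matrix
}
```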

Soft Body Animation
Soft body animation, or skinning, concerns models that stretch and flex as they animate
We define an independent hierarchy of bones assumed to underlie the geometry – the skeleton
Again, each bone has a parent-relative world matrix
Each vertex can be influenced by more than one bone
Each influence carries a weight (0-1):
–The sum of the weights for each vertex is 1
Linearly blend the vertex world position from each bone influence using the weights:
–A weighted average of the bone influences
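A sketch of skinning one vertex, reusing Vector3 / Matrix4x4 / transformPoint from the earlier sketches (each bone matrix here is assumed to be the bone's pre-combined skinning matrix):

```cpp
// One bone's influence on a vertex: the bone's matrix and a weight
struct Influence { const Matrix4x4* boneMatrix; float weight; };

// Weighted average of the positions the vertex would have under each bone
Vector3 skinVertex(const Vector3& modelPosition,
                   const Influence* influences, int numInfluences)
{
    Vector3 result = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < numInfluences; ++i)
    {
        Vector3 p = transformPoint(*influences[i].boneMatrix, modelPosition);
        float w = influences[i].weight;  // weights sum to 1 across the influences
        result.x += p.x * w;
        result.y += p.y * w;
        result.z += p.z * w;
    }
    return result;
}
```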